LIDAR may be best known right now for helping power autonomous cars (and infuriating Elon Musk), but the same technology could improve how we interact with smart speakers, a team of Intel-backed researchers suggests. Their SurfaceSight project explores how IoT devices could become more useful when they understand what’s around them, including recognizing nearby objects and hands.
The goal was to give existing smart speakers and the applications they run some situational awareness. By stacking an Amazon Echo or Google Home Mini on top of a compact LIDAR sensor, researchers Gierad Laput and Chris Harrison of Carnegie Mellon University demonstrated how the devices could make inferences based on shape and movement about what was nearby. They’ll present their findings at ACM CHI 2019 today.
LIDAR uses lasers for range-finding, effectively bouncing non-visible light off objects and then building up a point cloud map based on the time it takes for that light to be reflected back. While it’s most commonly associated with autonomous car projects, where being able to create a real-time map of the surrounding area is useful for avoiding traffic or pedestrians, it’s also widely used in robotics, with UAVs, and in other applications.
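The ranging principle described above is simple enough to sketch in a few lines. This is a hypothetical illustration, not SurfaceSight's code: a pulse's round-trip time is converted to distance, and a sweep of (angle, round-trip time) readings becomes a 2D point cloud.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_to_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface: light covers the gap twice."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def sweep_to_points(readings):
    """Convert (angle_radians, round_trip_seconds) pairs to (x, y) points."""
    return [
        (tof_to_distance(t) * math.cos(a), tof_to_distance(t) * math.sin(a))
        for a, t in readings
    ]

# A pulse returning after ~6.67 nanoseconds was reflected roughly 1 metre away.
print(round(tof_to_distance(6.67e-9), 3))
```

Those nanosecond-scale round trips are why LIDAR units need very fast timing electronics: at one metre, the entire round trip takes under 7 ns.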
For SurfaceSight, the applications are varied. One possibility is using fingers and hands to do gesture input; alternatively, a smart speaker could track when a smartphone is placed down on the table nearby, and then automatically recognize that as the user intending to stream music.
The plane of recognition needn’t be horizontal, either. In another demo, SurfaceSight could track movement against a wall, with a LIDAR integrated into a smart thermostat. That could recognize taps, swipes, and circular motions against the wall, effectively turning the surface into an extended control pad. Think along the lines of Google’s Project Soli, but on a larger scale.
Where SurfaceSight really gets interesting is in how it uses LIDAR to recognize objects. The team trained the sensor on different kitchen objects, like scales and measuring cups, as well as workshop items such as tools. A multi-step recipe could use the LIDAR to track which part is being completed, advancing automatically. Alternatively, motion could be linked with spoken requests to lend further context, like shaking a measuring cup while simultaneously asking “how many ounces in this?”
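SurfaceSight's actual recognition pipeline isn't reproduced here, but the idea of matching objects by their 2D silhouette can be illustrated with a deliberately crude sketch: reduce each point cloud to bounding-box features and match an unknown reading to the nearest stored example. All labels and values below are hypothetical.

```python
import math

def features(points):
    """Bounding-box width and depth of a 2D point cloud (in metres)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs), max(ys) - min(ys))

def classify(points, examples):
    """Return the label of the training example nearest in feature space."""
    w, d = features(points)
    return min(
        examples,
        key=lambda ex: math.hypot(ex[1][0] - w, ex[1][1] - d),
    )[0]

# Toy training set: (label, (width_m, depth_m)) pairs.
training = [("measuring cup", (0.08, 0.08)), ("kitchen scale", (0.20, 0.15))]

# An unknown ~7 cm-wide silhouette lands closest to the measuring cup.
reading = [(0.0, 0.0), (0.07, 0.01), (0.03, 0.075)]
print(classify(reading, training))
```

A real system would also exploit motion over time (the shaking gesture in the example above), which a single static silhouette can't capture.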
It’s fair to say that smart speakers are at the commodity level right now, with Amazon and Google racing each other down to the most affordable price. However, while both companies have bet on voice as the preferred primary method of interaction, they do so at the expense of other modalities. Baking in LIDAR might not be the only way to solve that, but there’s no denying that a home hub-style device could be a lot more useful if it knew what you were doing, not just what you were telling it.