Elliptic Labs sees a future where phones become hands-free
Elliptic Labs showed off ultrasound technology on mobile devices last year, letting users navigate with simple gestures. The company now boasts multi-layer gesture support, so you can dig deeper into your data according to your hand's proximity to the screen.
Elliptic Labs was last seen a year ago showing off a way to use gestures with mobile devices. The company has since improved the product and made it work with multiple gesture layers, which means you can do different things with a smartphone based on the proximity of your hand to the device.
Laila Danielson, CEO of Elliptic Labs, described it this way: “Without touching the device, as your hand moves towards your phone for example, the screen lights up and information is displayed. As you continue to move your hand closer, different information is revealed. It’s all about improving the user experience and by presenting easier ways to interact with mobile devices.”
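The layered behavior Danielson describes boils down to mapping the hand's distance to a UI state. Here is a minimal sketch of that idea; the thresholds, layer names, and function are invented for illustration and are not part of Elliptic Labs' actual SDK.

```python
# Hypothetical sketch of multi-layer proximity gestures. The real
# Elliptic Labs SDK is not documented in this article, so the
# thresholds and layer names below are assumptions.

def gesture_layer(distance_cm: float) -> str:
    """Map the hand's distance from the screen to a UI layer."""
    if distance_cm > 30:
        return "idle"      # hand out of range: nothing happens
    if distance_cm > 15:
        return "glance"    # screen lights up, basic info appears
    if distance_cm > 5:
        return "detail"    # closer: different information is revealed
    return "interact"      # close enough to select or control the device
```

Each layer would then drive what the screen shows, so the interface changes continuously as the hand approaches.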
Several handsets have been launched that utilize infrared sensors or, as is the case with Elliptic Labs, ultrasound waves. This can be helpful for viewing data without having to touch the screen, for browsing through web pages and photos, and for many other uses. The best function, however, was the video playback controls appearing at the bottom of the screen as a hand moved closer to the display during a movie. It is a very intuitive approach, and it felt as if the phone anticipated when the user wanted to hit pause or turn the volume up.
Elliptic Labs has provided an SDK, meaning that device makers and app developers can integrate their software with the ultrasound sensors. It is now up to the company to convince equipment manufacturers to include this awesome new technology in phones, tablets and other mobile devices, and we're sure that this will be easy, given that this is a truly innovative use for hand gestures and one we haven't seen since Kinect implemented them in awesome games.