The idea of interacting with our phones through in-air gestures is nothing new; companies like Samsung have been pulling off this trick for a while now. So far, though, those systems have relied on optical sensors, which raises the question: is there a better way? Elliptic Labs thinks so, and has just revealed its own ultrasound-based system for interacting with your phone at a distance.
Elliptic’s Multi Layer Interaction system measures both hand position and distance, letting users control phone software either by how close their hand gets to the screen, or with side-to-side swipe gestures. It can detect your hand from up to half a meter away, and because it can keep working even while the main SoC is asleep, it could be used to wake your phone simply by moving your hand toward it.
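To picture how distance readings might drive that kind of behavior, here is a minimal sketch in Python. Everything in it is an assumption for illustration, not Elliptic's actual Multi Layer Interaction API: the event names, the wake logic, and the stream-of-distances input are all hypothetical; only the half-meter detection range comes from the article.

```python
# Hypothetical sketch of proximity-driven events from ultrasonic distance
# readings. Names and logic are illustrative assumptions, not Elliptic's API.

DETECTION_RANGE_M = 0.5  # Elliptic's stated maximum hand-detection distance

def proximity_events(distances, asleep=True):
    """Turn a stream of hand-distance readings (in meters) into UI events.

    Emits "wake" the first time a hand enters range while the phone
    sleeps, then "near"/"far" as the hand moves relative to the screen.
    """
    events = []
    prev = None
    for d in distances:
        if d > DETECTION_RANGE_M:   # no hand within detection range
            prev = None
            continue
        if asleep:                  # hand entered range: wake the phone
            events.append("wake")
            asleep = False
        elif prev is not None:      # hand moving closer or farther
            events.append("near" if d < prev else "far")
        prev = d
    return events

# A hand approaches from out of range, hovers, then pulls away:
print(proximity_events([0.9, 0.4, 0.3, 0.35, 0.8]))
# prints ['wake', 'near', 'far']
```

The wake-on-approach behavior falls out naturally here: the first in-range reading fires before any relative-motion comparison is possible.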
So when will we see a phone using this kind of ultrasonic hand-tracking? That’s a good question, and as of now we haven’t heard of any OEMs committing to licensing this tech for their handsets. Elliptic will be demoing it at CEATEC in Japan this week, with an eye toward capturing some manufacturer interest, so with a little luck we may well see ultrasonic tracking come to a smartphone sometime next year.
Source: Elliptic Labs