It’s practically impossible to keep count of Google’s innumerable groundbreaking, sometimes weird, sometimes downright nutty “projects”, especially as they often evolve and unite towards a common goal.
Of course, you also have the occasional nonstarter, but in recent years, we’ve seen Aura, Ara, Tango, and Loon, to name only a few, capture the imagination of the tech world and press on to the verge of mass production.
Well, you can now add Project Soli to that list, although the wide commercial release of gadgets endowed with a new sensing technology that uses miniature radar to detect touchless gesture interactions is still no doubt many years away.
But the ATAP (Advanced Technology and Projects) division has nicely refined its futuristic concept since Soli’s first public demonstration a year ago, showing off something much cooler, more polished, more functional, and more fluid near the end of the 2016 Google I/O conference.
The magic begins around the 29-minute mark of the YouTube clip embedded above (though there are a lot of similarly neat prototypes presented over the course of the one-hour keynote), and sees one of the project’s lead engineers control a smartwatch through thin air.
No touches involved, just simple gestures aimed at simulating natural hand movement. When (if?) Project Soli bears fruit, you’ll apparently be able to gesture your way through wrist notifications, scroll through menus, pull up detailed information, and much more.
But the technology won’t be confined to wearable devices: smarter-than-smart speakers are already in testing, and a wide range of products could no doubt fit tiny radars for at-a-distance recognition of hand gestures. Yes, ladies and gentlemen, a future where “your hands are the only interface you’ll need” is almost here.