By Stephen Schenck | February 20, 2014 2:41 PM
A GPS receiver lets your smartphone know where it is. Accelerometers can detect motion. And for that brief period when smartphones were toying with autostereoscopic displays, they even had cameras capable of taking 3D shots. But despite all these technologies, phones have lacked a comprehensive understanding of their surroundings: what does the room they're in look like? Where are they located within the larger building? Google has been hard at work enabling phones to better map the world around them, and today it revealed Project Tango, an effort to do just that.
Project Tango sits at the intersection of multiple technologies, combining cameras, depth sensors, inertial measurements, and some advanced processing to capture a constantly changing stream of data about a phone's surroundings and translate it into a real-time picture of the world outside. Applications could range from augmented reality gaming to architecture and interior decorating services, and even to helping guide the visually impaired.
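To get a feel for the kind of processing involved, here's a minimal sketch of one building block: back-projecting a single depth-camera reading into a 3D point, then transforming it into world coordinates using the device's estimated pose. Everything here is an illustrative assumption — the function names, the sample intrinsics, and the simple pinhole camera model are ours, not Tango's actual (and as-yet-unpublished) API.

```python
# Hypothetical sketch of depth-to-world-point math, NOT Tango's real API.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Turn a depth-image pixel (u, v) with range `depth` into a 3D point
    in the camera's own coordinate frame, via the pinhole camera model.
    fx/fy are focal lengths in pixels; cx/cy is the principal point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def camera_to_world(point, rotation, translation):
    """Apply the device pose (3x3 rotation matrix + translation vector,
    as estimated by the inertial/visual tracking) to move a camera-frame
    point into a shared world frame, so points from successive frames
    can accumulate into one map."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Example: a pixel at the image center, 2 meters away, with the device
# sitting at world position (1, 0, 0) and no rotation (identity matrix).
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
p_cam = backproject(320, 240, 2.0, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
p_world = camera_to_world(p_cam, identity, (1.0, 0.0, 0.0))
print(p_world)  # → (1.0, 0.0, 2.0)
```

Repeat that for every pixel of every depth frame, fuse the results as the pose updates, and you start to see why Tango pairs its sensors with dedicated processing hardware.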
Google is inviting developers to work with Project Tango development hardware, accepting sign-ups as of today and ultimately giving 200 devs the chance to experiment with these devices.
Where this might go from here, we can’t yet say, but if Tango proves to be successful, it may well make its way to commercial hardware. Who wants a Nexus phone with built-in 3D scanning?