What To Expect From Google’s Project Glass
Following rumors that it was working on some sort of Android-powered glasses, early last month Google gave us our first look at its Project Glass, an augmented reality system built into a head-mounted eyepiece. While it seemed at first that we were talking about design prototypes, subsequent encounters with Google employees sporting Project Glass out in the real world revealed that this was functional hardware, in one form or another. What the story has been lacking so far is much commentary on just what the user interaction with these glasses is really like. Now the picture is starting to come together, but it might not be quite what we were hoping for.
When Google first publicized the Project, it showcased a video examining some of the ways we might take advantage of such a system. We saw pop-up notifications, along with high-resolution maps, superimposed over the user’s field of vision. New commentary from a Google spokesperson paints the picture of a Project Glass that, instead, lives on the periphery of a user’s vision.
While the UI is reportedly unfinished, the prototype models we’ve been seeing so far apparently only display information at the upper edge of a wearer’s field of view. Google also may be dialing back expectations of what we might hope to do with such a gadget, suggesting it may focus on things like taking and sharing photos rather than slightly more advanced tasks like accessing maps and getting navigation help. Safety may also be a factor: Google presumably doesn’t want to obscure too much of a user’s vision.
This all sounds pretty reasonable for the company’s first stab at a gadget like this; we may have to wait a few more years for a fully immersive AR experience, but we’ll cope. After all, there’s no sense in getting overly ambitious with the initial foray into something brand new, and we can wait while Google builds up its experience with wearable systems.