By Stephen Schenck | April 28, 2012 11:00 PM
One goal that smartphone hardware and software designers seem to keep reaching for is the ability for our phones to quickly and easily read data that’s been embedded in the world around us. That could take the form of a QR code, a passive NFC tag, or maybe some more general image recognition like Google Goggles uses to identify landmarks. Some MIT grad students have come up with yet another system with its own advantages and limitations.
Called NewsFlash, the system uses your smartphone’s camera to pick up on a signal generated by video displays. The information is modulated into a band at the top of the screen, where rapidly shifting colors broadcast it to any NewsFlash-aware phone within view. These color shifts happen so quickly that, although your phone’s camera will catch them, the human eye isn’t supposed to notice a thing.
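The article doesn't spell out the modulation scheme NewsFlash actually uses, but the general idea — encoding bits as tiny per-frame color shifts around a nominal level, then recovering them from camera frames — can be sketched as follows. The constants, function names, and one-channel frame model here are all illustrative assumptions, not the MIT team's implementation.

```python
# Illustrative sketch of color-shift modulation, NOT the actual NewsFlash
# scheme: each bit nudges the banner band's intensity slightly up or down,
# and a camera-side decoder thresholds each frame against the nominal level.

BASE = 128   # hypothetical nominal channel intensity of the banner band
DELTA = 2    # shift small enough to be invisible, large enough for a camera

def encode(bits):
    """Map each bit to a per-frame intensity: +DELTA for 1, -DELTA for 0."""
    return [BASE + (DELTA if b else -DELTA) for b in bits]

def decode(frames):
    """Recover bits by comparing each captured frame to the nominal level."""
    return [1 if f > BASE else 0 for f in frames]

message = [1, 0, 1, 1, 0, 0, 1]
frames = encode(message)
assert decode(frames) == message
```

A real system would also need frame synchronization and error correction, since a phone camera's frame rate won't line up neatly with the display's refresh rate.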
The big appeal of the tech behind NewsFlash – that it allows for a QR-code-like interaction with a display screen without needing to tie up screen real estate with a potentially jarring, blocky QR code – also seems like it could be the project’s biggest limitation. We’re not going to walk around with our phones’ cameras on all the time, so there needs to be some way to alert people that a screen contains hidden NewsFlash data. If that means putting up some easily recognizable NewsFlash logo, aren’t we defeating the purpose of that whole transparent implementation?
While we have some questions about how NewsFlash might be used, there’s definitely some potential here, and for all we know we might see a similar system polished up and ready for commercial use in the coming years. So far, the team at MIT has been focusing on this tech’s use for displaying news articles, but given its flexibility, that sounds like it could be just the tip of the iceberg.