By Chuong Nguyen | April 9, 2010 12:16 AM
When Apple introduced the iPhone several years ago, it reshaped consumers' expectations of what a smartphone operating system could be. The phone revolution is happening again, and this time it comes from a company that does not make phones. With the Fuse concept phone, still a prototype, Synaptics and its partners are demonstrating the future of handheld computing, showing the industry what is possible with touchscreen phones, from one-handed usage to smart UIs that immerse the user. The experience is more than the sum of the phone's parts, which consist mainly of a collection of sensors coupled with a 3D user interface. Rather, Synaptics has created a model of a powerful smartphone with no comparable device on the market today, and we hope the company will convince makers like Samsung, Apple, LG, Motorola, and others to adopt, customize, and deploy the Fuse.
The Fuse is bursting with sensors; it seems the device showcases every touch sensor Synaptics has been working on in its labs, sensing on its screen, back, and sides. Touching one of the two side sensors can initiate scrolling, while squeezing both side sensors lets the OS interpret the gesture as a grip and trigger an entirely different command. The back is also a touch sensor, much like the company's touchpad technology found on many modern laptops and netbooks. This way, users can hold the phone and use an index finger to scroll through lists and menus, perhaps squeezing the phone to select a menu item, navigating with just one hand. Furthermore, the sensors let you navigate without touching the touchscreen itself (another Synaptics technology), so you don't build up fingerprints or obscure your view. These sensors, combined with an accelerometer and finely tuned haptic feedback from Immersion Technology, provide a rich user experience with potential waiting to be exploited by developers, much like how Nintendo created a different gaming paradigm with the motion-sensing Wii controllers.
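To make the scroll-versus-squeeze distinction concrete, here is a minimal sketch of how a UI layer might classify the two side sensors' input. Everything here is invented for illustration: the function names, the normalized pressure readings, and the thresholds are assumptions, not Synaptics' actual API.

```python
# Hypothetical sketch: distinguishing a one-sided scroll from a two-sided
# "squeeze" grip using left/right side sensor pressures. All names and
# thresholds are illustrative assumptions, not the Fuse's real interface.

SQUEEZE_THRESHOLD = 0.6   # normalized pressure at which a touch counts as a grip
TOUCH_THRESHOLD = 0.1     # minimum pressure to register a touch at all

def classify_side_gesture(left_pressure, right_pressure):
    """Map raw side-sensor pressures (0.0-1.0) to a UI gesture."""
    left_touched = left_pressure >= TOUCH_THRESHOLD
    right_touched = right_pressure >= TOUCH_THRESHOLD
    if (left_pressure >= SQUEEZE_THRESHOLD and
            right_pressure >= SQUEEZE_THRESHOLD):
        return "squeeze"            # both sides pressed firmly: grip/select
    if left_touched != right_touched:
        return "scroll"             # exactly one side touched: scroll the list
    return "idle"

print(classify_side_gesture(0.7, 0.8))  # squeeze
print(classify_side_gesture(0.3, 0.0))  # scroll
```

The point of the sketch is simply that the same two sensors yield two very different commands depending on whether one side or both sides are engaged.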
The video above shows the Fuse in action, and we thank the folks at Synaptics for inviting us into their labs to see the technologies firsthand. The gestures and sensors are driven by a powerful TI mobile processor.
Basic applications of the Fuse concept have already been featured, or rumored, on upcoming devices. The Android-based Motorola Backflip has a keyboard that faces outward from the back when the clamshell is closed, letting users control the phone much like the trackpad sensor on the back of the Fuse. Similarly, Apple's next-generation iPhone is rumored to support backside gestures, albeit via the camera.
The UI was developed by The Astonishing Tribe, or TAT for short. You may know TAT for its 3D Home UI, and the company was instrumental in creating the look and feel of the Android OS UI. The amazing thing is that even though the Fuse is a rough concept right now, the hardware and software really do blend together to create a very nice experience. The haptic feedback isn't just a random vibration of the screen; you can feel that Immersion, Synaptics, and TAT have thoughtfully created something that resembles real buttons. This is most apparent in the phone application, where users can dial from the back touchpad sensor. By running a finger across the back, users can move across the dialpad, and as the finger moves from one number to the next, an acute vibration simulates crossing the edge of a number key, as if you were running your finger over the keypad of a touchtone phone. The 3D UI, the widgets, and the way information is intelligently displayed in the small space all add to the user experience. We can only begin to imagine where developers can take this platform, from rich, interactive games that use these sensors to powerful programs that take mobile computing to the next level.
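The back-touchpad dialing behavior described above can be sketched in a few lines: map the finger's position on the rear sensor onto a 3x4 dialpad grid, and fire a short haptic pulse each time the finger crosses from one key into the next. The coordinate convention and the `trigger_haptic_pulse()` stand-in are assumptions for illustration, not Immersion's or Synaptics' actual interfaces.

```python
# Hypothetical sketch of dialing from a rear touch sensor: the finger's
# normalized position is mapped to a dialpad key, and a haptic "edge" pulse
# fires whenever the finger enters a new key. All APIs here are invented.

ROWS, COLS = 4, 3
KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]

def key_at(x, y):
    """Map a normalized touch position (0.0-1.0 each axis) to a dialpad key."""
    col = min(int(x * COLS), COLS - 1)
    row = min(int(y * ROWS), ROWS - 1)
    return KEYS[row * COLS + col]

def trigger_haptic_pulse():
    print("buzz")  # stand-in for a real haptics call

def track_finger(positions):
    """Fire one haptic pulse each time the finger slides into a new key."""
    current = None
    for x, y in positions:
        key = key_at(x, y)
        if key != current:
            trigger_haptic_pulse()   # simulates running over a key's edge
            current = key
    return current

# Sliding horizontally across the top row crosses keys 1 -> 2 -> 3,
# producing one pulse per key boundary, ending on "3".
print(track_finger([(0.1, 0.1), (0.2, 0.1), (0.5, 0.1), (0.9, 0.1)]))
```

The design choice worth noting is that the pulse fires on the *transition* between keys rather than continuously, which is what makes the flat sensor feel like a physical keypad with raised edges.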
While Apple and capacitive touchscreens have redefined mobile computing with gesture support, primarily pinch-to-zoom and two-finger inputs, Synaptics is providing a glimpse of the future with the Fuse, offering a view of how we can control our phones with our fingers off the screen. When you think about it, the screen represents only about 40% of a phone's surface area; Synaptics, with its capacitive sensors, is extending touch to the remaining 60%, creating a handset whose entire surface is coated in touch sensors. While the demo is remarkable in itself, the possibilities for software applications and games are endless, and we'd love to see developers run with the idea and create some amazing new ways to be immersed in the mobile computing experience.
When I first posted the video of the Fuse on my YouTube channel (video courtesy of Synaptics, shown below), a few people didn't believe this was possible. Synaptics and its partners proved them wrong, and thankfully for the Nokias and Samsungs of the world, Synaptics isn't in the phone business, because this would be one heck of a phone. Though Synaptics and its partners are not commercializing the Fuse concept, the concept phone, in its entirety or in pieces, can be licensed as a reference design for manufacturers to build, customize, and adopt. Perhaps we will soon see an Android phone or an iPhone bursting with sensors, ready to be held, gripped, and touched.
You can learn more about the Synaptics Fuse from the company's press release. The phone was created as a concept in collaboration between Synaptics, TheAlloy, Immersion Technology, Texas Instruments, and The Astonishing Tribe.