All it takes is one killer app to move a new technology from the cutting edge into the mainstream of public consciousness.
A large part of the iPhone’s success came from the elegant, intuitive new way it let users interact with their devices: the multi-touch screen. It felt almost revolutionary at the time, the ability to launch an app with a finger, or to navigate a map by pinching and dragging. But touch interfaces had been around for a long time. ATMs and kiosks introduced touch interaction to a mainstream audience years before the iPhone launched, yet they were always seen as novelties, or worse, as shoddy and frustrating.
Apple’s innovation, beyond a deep understanding of user expectations (everything from how fast a list should scroll based on how quickly a user flicked a finger, to how quickly an app needed to launch after it was tapped), was letting users act with more than one finger at once: the multi-touch screen. Sensing how many fingers were touching the screen, and changing the type of action performed based on that information, opened up whole new ways to navigate and interact. Users could scale, rotate, and move photos rather than just opening them; tap and swipe their way through maps and lists; and interact directly with content such as music or movies simply by touching it.
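The core idea, deciding what a gesture means based on how many fingers are down and how they move, can be sketched in a few lines. This is a toy classifier with made-up thresholds and a simplified touch representation, not Apple’s actual gesture-recognition API:

```python
def classify_gesture(touches):
    """Classify a gesture from a list of touches.

    Each touch is a pair of (x, y) points: where the finger went down
    and where it lifted. Thresholds here are illustrative, not real.
    """
    def distance(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    if len(touches) == 1:
        # One finger: a short movement reads as a tap, a long one as a swipe.
        start, end = touches[0]
        return "tap" if distance(start, end) < 10 else "swipe"

    if len(touches) == 2:
        # Two fingers: compare finger spacing at start vs. end.
        # Spreading the fingers zooms in; bringing them together zooms out.
        (a_start, a_end), (b_start, b_end) = touches
        if distance(a_end, b_end) > distance(a_start, b_start):
            return "pinch-open"
        return "pinch-close"

    return "unknown"
```

The same one-finger drag becomes a pan, a swipe, or half of a pinch depending on context, which is why counting contacts is the pivotal sensing step.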
It’s amazing to me how quickly these new ways of interacting have become old hat. That’s partly because Apple designed the interface so well, but mostly, I think, because multi-touch is inherently intuitive, like finger painting: it removes the artificial barrier, the button or control, that stands between users and the content they interact with. Who needs a button when you can just touch something? The speed with which babies and toddlers learn to use an iPad is testament to this.
So, what’s next for touch interfaces? What other real-world behaviors can we interface designers leverage to make our interfaces disappear further, and let users continue to finger paint their way through the applications we design?
These new technologies hint at what may be next:
Developed by Disney Research in collaboration with Carnegie Mellon University, Touché is an innovative touch-recognition system that can sense not only whether a user is touching an object, but also in what way and with which body parts, using only a single wire and sensor.
Senseg turns touch screens into Feel Screens: with Senseg, touch screens come alive with textures, contours, and edges that users can feel. Using this technology, makers of tablets, smartphones, and any touch-interface device can deliver user experiences with high-fidelity tactile sensations.
What new kinds of interactions can we design when we have access to a user’s body and movements? How does an interface change when it has texture, or can touch you back?