When the Samsung Galaxy S4 debuted, there was a lot of hoopla over its Air Gesture and Eye Scroll features. With Air Gesture, users can wave a hand over the phone to answer a call or skip to the next track. Eye Scroll uses eye movement to scroll up or down a page.
Many users ultimately found the features underwhelming, but it doesn’t really matter. These new features — still in their infancy — mark another milestone for mobile.
The problem with “smart” devices
When we took computers and crammed them into tiny phones, we brought with us the way we had always done things. The problem — as any smartphone user knows — is that it’s hard to control an application’s interface when it’s centered on tiny buttons. There’s no tactile feedback, and visual feedback is often insufficient. Victims of autocorrect know what I’m talking about.
If users get frustrated with these devices, they might give them up altogether. It’s imperative that user interface design become the core focus.
The future in motion
Some user interface interactions are slowly being taken over by voice. But with 85 percent of iOS 7 users saying they haven’t used Siri, it’s clear that “silent interaction” is still an important factor in how people use smart devices.
Some companies are already implementing gestures in their interface:
- Dolphin browser: Dolphin is a Web browser that allows you to map gestures to specific websites. For instance, you can set it to go to Google when you draw a “G” or to your wife’s blog when you draw a heart. It’s much more efficient than typing full URLs.
- Gesture keyboards: Some new keyboards replace buttons, like the space bar and backspace, with gestures. Want to delete that last word? Swipe once to the left. Need to type a name? A swipe up will temporarily disable autocorrect.
- Aviate launcher: Aviate is a home screen that organizes my apps based on where I am and what time of day it is. When I wake up in the morning, one flick brings up my morning routine apps. Considering that U.S. smartphone users average 41 apps per phone, this makes navigating those apps extremely simple.
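At their core, features like Dolphin’s custom gestures come down to a lookup from a recognized gesture to an action. Here is a minimal sketch of that idea; the function and the mapped URLs are hypothetical, not Dolphin’s actual implementation:

```python
# Hypothetical sketch of a gesture-to-action table, in the spirit of
# Dolphin's gesture-to-URL mapping. The gesture names and URLs are
# invented for illustration.

GESTURE_ACTIONS = {
    "G": "https://www.google.com",        # drawing a "G" opens Google
    "heart": "https://example.com/blog",  # drawing a heart opens a bookmarked blog
}

def handle_gesture(gesture: str) -> str:
    """Return the URL mapped to a recognized gesture, or a safe fallback."""
    return GESTURE_ACTIONS.get(gesture, "about:blank")

print(handle_gesture("G"))       # https://www.google.com
print(handle_gesture("circle"))  # about:blank (unmapped gesture)
```

The hard part in a real product is the recognizer that turns raw touch strokes into a gesture label; once you have that label, dispatching it is a simple table lookup like this.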
Motion is everywhere
It’s not just our smartphones that will soon be controlled by motion. Three years ago, Microsoft launched Kinect, a motion-sensing add-on for the Xbox 360 that uses an RGB camera, depth sensor, and microphone to let players control a video game without a controller. The Leap Motion controller pushes accuracy further: it plugs into a computer and lets users control the system with air gestures.
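To make “controlling the system with air gestures” concrete, consider the simplest possible case: deciding whether a stream of hand positions constitutes a swipe. This is a hypothetical sketch, not the Kinect or Leap Motion SDK; the threshold and the normalized-coordinate assumption are invented for illustration:

```python
# Hypothetical sketch: classifying an air swipe from a stream of hand
# x-coordinates, the kind of data a depth sensor might report.
# Assumes positions are normalized to 0..1; the threshold is arbitrary.

def detect_swipe(xs, min_travel=0.3):
    """Return 'right', 'left', or None given a sequence of hand x-positions."""
    if len(xs) < 2:
        return None
    travel = xs[-1] - xs[0]  # net horizontal movement over the sample window
    if travel >= min_travel:
        return "right"
    if travel <= -min_travel:
        return "left"
    return None

print(detect_swipe([0.2, 0.4, 0.7]))  # right
print(detect_swipe([0.8, 0.5, 0.3]))  # left
print(detect_swipe([0.5, 0.52]))      # None (too small to count)
```

Real gesture pipelines add smoothing, velocity checks, and per-user calibration on top of a core classification step like this one.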
With more device manufacturers building gesture controls into their operating systems, users are becoming accustomed to using them daily. Startups that want to build the best apps will have to include motion controls in their interfaces to compete.