Your first experience with haptics was probably your phone vibrating in your pocket. Or maybe it was the rumble pack on your N64 controller. Whatever the case, you probably didn't think of it as a haptic interface.
Haptics is to touch what optics is to sight. It's a user interface that circumvents the cluttered inputs of sight and sound, and it's appearing in an increasing number of objects we interact with daily. Vibration is just the beginning.
Any sort of information received through touch is haptic; braille could be considered haptic communication. But as it appears in technology, it's generally either tactile (expressing texture) or kinesthetic (expressing force or position). Haptics is used to improve robotic control, to increase realism in gaming, and even to help people sit up straighter.
The roots of haptic technology are mechanical, says Will Provancher, a professor of mechanical engineering at the University of Utah and co-chair of the World Haptics Technical Committee.
"Right around the time of WWII, people were trying to handle radioactive materials, and if you have direct contact with these materials, you will eventually die," he says. "So to be able to handle these materials safely, people started making kinematic linkages."
That is, scientists and engineers used a mechanical apparatus to manipulate the samples: pull, and it pulls; turn, and it turns. More recently, computers have become an interface between the controller (master) and the controlled (slave). Motor control is much finer, but that's not always enough.
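The computer-mediated setup described above can be sketched in a few lines. This is a minimal illustration, not any real device's API: the function names, the scale factor, and the feedback gain are all assumptions chosen to show the idea of mapping an operator's motion to a finer slave motion and reflecting contact forces back to the hand.

```python
# Hypothetical sketch of computer-mediated master/slave control.
# Names and parameters are illustrative, not a real robotics API.

def slave_command(master_position, scale=0.1):
    """Map the operator's (master) motion to a finer slave motion.

    A scale below 1.0 gives the finer motor control a computer
    affords: a 10 cm hand movement becomes a 1 cm slave movement.
    """
    return master_position * scale

def feedback_force(slave_force, gain=1.0):
    """Reflect forces sensed at the slave back to the operator.

    This is the haptic channel: without it the operator can see
    the slave move but cannot feel contact, which is why fine
    motor control alone is not always enough.
    """
    return slave_force * gain
```

With a mechanical linkage, both mappings were fixed by the hardware; routing them through software is what lets the scale and feedback gain be tuned independently.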