With its Kinect for Windows program, Microsoft wants to make it common to wave your arms at or speak to a computer. “We’re trying to encourage software developers to create a whole new class of app controlled by gesture and voice,” says Peter Zatloukal, head of engineering for the Kinect for Windows program.
Zatloukal says the result will be on a par with other big shifts in how we control computers. “We initially used keyboards; then the mouse and GUIs were a big innovation; now touch is a big part of people’s lives,” he says. “The progression will now be to voice and gesture.”
Health care, manufacturing, and education are all areas where Zatloukal expects to see Kinect for Windows succeed. Kinect for Windows equipment went on sale in February for $249 and is now available in 32 countries.
Jentronix is using it to help people with physical rehabilitation after a stroke, while Freak’n Genius offers gesture-based animation software.
Mark Bolas, an associate professor and director of the Mixed Reality Lab at the University of Southern California, and his group are experimenting with using Kinect to track very subtle behaviors — monitoring the rise and fall of a person’s chest to measure breathing rate, for example. Displaying an indication of someone’s breathing rate during a video call allows others to understand a person better, he says, and can signal when to start talking without interrupting.
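The idea behind the breathing-rate experiment can be sketched in a few lines: average the depth values over a chest region in each frame, then count how often that signal rises through its mean. This is a minimal illustration with synthetic data, not code from the USC lab or the Kinect SDK; the function name, the 30 fps frame rate, and the millimeter depth values are assumptions for the sketch.

```python
import math

FPS = 30  # assumed Kinect-style depth stream rate, frames per second

def breathing_rate_bpm(depth_mm, fps=FPS):
    """Estimate breaths per minute from a chest-depth time series by
    counting rising crossings of the mean-centered signal (each full
    breath cycle crosses upward once). Hypothetical helper, not an SDK call."""
    n = len(depth_mm)
    mean = sum(depth_mm) / n
    centered = [d - mean for d in depth_mm]
    # Count transitions from below the mean to at-or-above the mean.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    duration_min = n / fps / 60.0
    return crossings / duration_min

# Synthetic 60-second signal: chest ~800 mm away, moving ~5 mm
# at 15 breaths per minute (0.25 Hz), with a small phase offset.
signal = [
    800 + 5 * math.sin(2 * math.pi * 0.25 * (i / FPS) + 0.3)
    for i in range(60 * FPS)
]
print(breathing_rate_bpm(signal))  # ≈ 15 breaths per minute
```

A real pipeline would first segment the chest region from the depth image and smooth the per-frame averages before counting cycles; the crossing count here stands in for that peak-detection step.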