Horizon Report – gesture-based computing
[This is the last in our series on the annual Horizon Report, which identifies technologies likely to impact teaching, learning, or creative inquiry.]
Gesture-based computing refers to controlling devices through natural human gestures. The report expects this technology to make its mark on teaching and learning four to five years from now.
Most of you have either played with or heard about the Wii game system. It runs the usual video games, but the Wii remote contains a motion sensor that feeds three-dimensional movement data into the console, allowing the system to simulate the experience of bowling and other activities. Similar devices from Microsoft and Sony are expected this year.
Devices like Apple’s iPhone and iPod Touch let users tap a multitouch screen to indicate choices or enter data, or swipe a finger across the screen to activate certain functions. While the iPhone and Wii are well-known gesture-based computing devices, lesser known is Microsoft’s “Surface,” a table whose entire top is a multitouch screen. The OIT Academic Technologies group at Notre Dame has been exploring applications of Microsoft Surface.
The video above shows “SixthSense,” a wearable gestural interface from the MIT Media Lab.
This technology clearly changes the physical act of controlling a device, but it could also affect the psychology of interacting with a computer: feelings of control of and connection to the device seem to be enhanced.
The Horizon Report lists a variety of applications for gesture-based computing in areas such as care for the elderly, medical education, services for the hearing impaired, and kinesiology. To learn more, see the full document or try these readings: