If you have ever seen movies like “The Matrix” and “Iron Man,” then you must have wondered what it would be like to control the TV, computer and other devices in your home with just a wave of your hand. These sci-fi dreams are quickly becoming a reality as gesture recognition technology matures.
What is gesture recognition?
Gesture recognition is a type of perceptual computing user interface that allows a computer to capture and interpret human gestures as commands — that is, to understand physical movements and execute commands based on them. Most consumers are familiar with the concept through Wii, Xbox and PlayStation games such as “Just Dance” and “Kinect Sports.”
How is a “gesture” defined?
In order to understand how gesture recognition works, it is important to understand how the word “gesture” is defined. In its most general sense, the word gesture can refer to any non-verbal communication that is intended to convey a specific message. In the world of gesture recognition, a gesture is defined as any physical movement, large or small, that can be interpreted by a motion sensor. It may include anything from the pointing of a finger to a roundhouse kick, or a nod of the head to a pinch or wave of the hand. Gestures can be broad and sweeping or small and contained. In some cases, the definition of “gesture” may also include voice or verbal commands.
How gesture recognition works
Gesture recognition is an alternative user interface for providing real-time data to a computer. Instead of typing on a keyboard or tapping a touch screen, a motion sensor perceives and interprets movements as the primary source of data input. Here is what happens between the time a gesture is made and the moment the computer reacts:
- A camera feeds image data into a sensing device that is connected to a computer. The sensing device typically uses an infrared sensor or projector to calculate depth.
- Specially designed software identifies meaningful gestures using a predetermined gesture library in which each gesture is matched to a computer command.
- The software then compares each registered real-time gesture against the library and interprets it as the closest matching gesture.
- Once the gesture has been interpreted, the computer executes the command correlated to that specific gesture.
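The matching step above can be sketched in a few lines of code. This is a minimal illustration, not any vendor’s actual implementation: it assumes a gesture arrives as a normalized sequence of (x, y) points from the sensor, and the library names, templates and commands are all hypothetical.

```python
import math

# Hypothetical gesture library: each named gesture is a template,
# a normalized sequence of (x, y) points traced by, e.g., a hand.
GESTURE_LIBRARY = {
    "swipe_right": [(0.0, 0.5), (0.25, 0.5), (0.5, 0.5), (0.75, 0.5), (1.0, 0.5)],
    "swipe_up":    [(0.5, 1.0), (0.5, 0.75), (0.5, 0.5), (0.5, 0.25), (0.5, 0.0)],
}

# Each library gesture is matched to a computer command.
COMMANDS = {"swipe_right": "next_page", "swipe_up": "scroll_up"}

def distance(a, b):
    """Mean Euclidean distance between two equal-length point sequences."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(points, threshold=0.2):
    """Compare a captured point sequence against the library and return
    the command for the closest template, or None if nothing is close."""
    best_name, best_dist = None, float("inf")
    for name, template in GESTURE_LIBRARY.items():
        d = distance(points, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return COMMANDS[best_name] if best_dist <= threshold else None

# A noisy real-time capture of a left-to-right hand movement:
captured = [(0.02, 0.51), (0.26, 0.48), (0.49, 0.52), (0.77, 0.50), (0.98, 0.49)]
print(recognize(captured))  # → next_page
```

Real systems replace the simple distance check with far more robust matching (skeletal models, machine learning classifiers), but the pipeline — capture, compare against a library, execute the matched command — is the same.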
For instance, Kinect looks at a range of human characteristics to provide the best command recognition based on natural human inputs. It provides both skeletal and facial tracking in addition to gesture recognition, voice recognition and in some cases the depth and color of the background scene. Kinect reconstructs all of this data into printable three-dimensional (3D) models. The latest Kinect developments include an adaptive user interface that can detect a user’s height.
Who makes gesture recognition software?
Microsoft is leading the charge with Kinect, a gesture recognition platform that allows humans to communicate with computers entirely through speaking and gesturing. Kinect gives computers “eyes, ears, and a brain.” There are a few other players in the space, such as SoftKinetic, GestureTek, PointGrab, eyeSight and PrimeSense, an Israeli company recently acquired by Apple. Emerging technologies from companies such as eyeSight go far beyond gaming to allow for a new level of small motor precision and depth perception.
Gesture recognition examples beyond gaming
Gesture recognition has huge potential in creating interactive, engaging live experiences. Here are six gesture recognition examples that illustrate its potential to educate, simplify user experiences and delight consumers.
1. In-store retail engagement
Gesture recognition has the power to deliver an exciting, seamless in-store experience. This example uses Kinect to create an engaging retail experience by immersing the shopper in relevant content, helping her to try on products and offering a game that allows the shopper to earn a discount incentive.
2. Changing how we interact with traditional computers
A company named Leap Motion last year introduced the Leap Motion Controller, a gesture-based computer interaction system for PC and Mac. A USB device roughly the size of a Swiss army knife, the controller allows users to interact with traditional computers through gesture control. It’s easy to see the live experience applications of this technology.
3. The operating room
Companies such as Microsoft and Siemens are working together to redefine the way everyone from motorists to surgeons accomplishes highly sensitive tasks. These companies have focused on refining gesture recognition for fine motor manipulation of images, enabling a surgeon to virtually grasp and move an object on a monitor.
4. Windshield wipers
Google and Ford are also reportedly working on a system that allows drivers to control features such as air conditioning, windows and windshield wipers with gesture controls. The Cadillac CUE system recognizes some gestures such as tap, flick, swipe and spread to scroll lists and zoom in on maps.
5. Mobile payments
Seeper, a London-based startup, has created a technology called Seemove that goes beyond image and gesture recognition to object recognition. Ultimately, Seeper believes its system could allow people to manage personal media, such as photos or files, and even initiate online payments using gestures.
6. Sign language interpreter
There are several examples of using gesture recognition to bridge the gap between the deaf and those who do not know sign language. This example from Dani Martinez Capilla, showing how Kinect can understand and translate sign language, explores the notion of breaking down communication barriers with gesture recognition.
More on Augmented Reality advertising trends
- Augmented Reality marketing news update – Spring 2014
- Top 5 Augmented Reality trends for 2014
- VIDEO: AR interactive displays: SanDisk amazes with AR + Kinect
Talk to the AR and gesture recognition experience experts
To learn more about AR interactive displays with gesture recognition for live events and experiences, contact us at any time. Email Beck Besecker or call 727-851-9522.