Augmented Reality Company uses AR + Kinect for Interactive Display

SanDisk strives to demonstrate its culture of innovation with high-tech, show-stopping interactive displays and live event experiences. For Mobile World Congress 2014, the company wanted to create an experience that expressed these values while educating attendees about its latest products.

Their vision: Unveil the game-changing 128GB SanDisk Ultra® microSDXC™ UHS-I memory card, the world’s largest capacity microSD card, with an engaging interactive demonstration inspired by “Iron Man,” “Minority Report” and “The Matrix.”

The ultimate interactive display: Augmented Reality + Kinect

The SanDisk experience goes beyond futuristic, cool tech. It taps into a user’s desire to control and manipulate what they see. The monitor in front of the user shows the device in their hand ‘explode’ in 3D space with a swipe of the hand; another swipe removes parts, and a final swipe reveals the SanDisk component and its key features.
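The swipe-to-advance flow described above amounts to a simple sequence of view stages. A minimal sketch in Python, purely illustrative: the stage names and the `ExplodedView` class are assumptions, not Marxent's actual implementation.

```python
# Sketch of the swipe-driven exploded-view progression described above.
# Stage names and the ExplodedView class are illustrative assumptions,
# not Marxent's production code.

STAGES = ["assembled", "exploded", "parts_removed", "component_detail"]

class ExplodedView:
    def __init__(self):
        self.stage = 0  # start with the fully assembled device

    def swipe(self):
        """Advance one stage per detected swipe, holding at the final view."""
        if self.stage < len(STAGES) - 1:
            self.stage += 1
        return STAGES[self.stage]

view = ExplodedView()
print(view.swipe())  # exploded
print(view.swipe())  # parts_removed
print(view.swipe())  # component_detail
print(view.swipe())  # component_detail (further swipes hold the last view)
```

Each recognized swipe simply moves the 3D scene one stage forward, which matches the explode / remove parts / view component sequence the article describes.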

To showcase the new memory cards, Marxent used Augmented Reality and Kinect to create an “Iron Man”-style experience. Using gesture recognition, participants could launch exploded views of popular high-tech products that use SanDisk cards.

“This unique experience allowed attendees to virtually disassemble some of the most popular smartphones on the market to learn about the various SanDisk products that power them. We were very excited about introducing this experience in our booth at MWC 2014 this year,” said SanDisk’s Anurag Malhotra.

Here’s how it worked:

  1. One at a time, attendees stood in front of a giant LCD monitor. A Kinect sensor was mounted below the LCD monitor next to an iPod Touch 5.
  2. The participant selected from a medley of mobile devices powered by SanDisk memory cards. The devices, such as a Kindle Fire and a Moto X, featured custom AR trackables on their screens. Users also had the option of using printed markers, including a giant print marker that was used to see an exploded view of the Ardusat MK1 satellite.
  3. To initiate the experience, the Kinect sensor performed skeletal tracking of the participant while the iPod Touch 5 recognized the unique mobile device or cardboard marker in their hand.
  4. A game server passed the recognized gesture data from the Kinect to the iPod Touch. The iPod translated the Kinect gesture input and projected the results onto the LCD monitor in front of the participant.
  5. The participant watched the monitor as the device in their hand ‘exploded’ in 3D space. They used the monitor as a reference as they manipulated and explored the device with gestures picked up and translated by Kinect.
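Step 4 above is essentially a relay: the game server forwards Kinect gesture events to the iOS client, which updates the AR scene. A minimal sketch of what the message encoding and handling could look like, assuming a newline-delimited JSON wire format (the production protocol was not published, and the gesture and field names here are invented for illustration):

```python
# Sketch of the gesture relay in step 4: the game server packs each Kinect
# gesture event as a JSON message, and the iOS-side client maps it to a
# viewer action. Message fields and gesture names are assumptions; the
# actual wire format used at MWC 2014 was not published.
import json

def encode_gesture(gesture, hand_x, hand_y):
    """Server side: pack one Kinect gesture event as newline-delimited JSON."""
    msg = {"gesture": gesture, "hand": {"x": hand_x, "y": hand_y}}
    return (json.dumps(msg) + "\n").encode("utf-8")

def decode_gesture(raw):
    """Client side: parse a message and map it to an exploded-view action."""
    msg = json.loads(raw)
    actions = {"swipe_left": "next_stage", "swipe_right": "prev_stage"}
    return actions.get(msg["gesture"], "ignore")

# Round trip: a left swipe picked up by the Kinect advances the 3D view.
raw = encode_gesture("swipe_left", 0.42, 0.77)
print(decode_gesture(raw))  # next_stage
```

Newline-delimited JSON over a local socket is one plausible choice for this kind of bridge because both the Kinect PC side and the iOS side can produce and consume it without shared libraries.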

iOS and Kinect do not innately interact with each other, so a custom integration was designed to make the experience possible. We took two new, immersive technologies, implemented a technical workaround and merged them into an exciting interactive experience.