Augmented Reality Racing

A group of students at Technische Universität München, under the direction of Gudrun Klinker, created this augmented reality racing game. The setup is similar to the AR Drone game in that it uses a real vehicle; however, using the illusion of AR, they turned their LEGO Mindstorms robots into cars.

The students had two (analog) radio cameras and mounted one on each LEGO Mindstorms robot. With this setup, the player was presented with a view of the world from a robot's perspective. The students also placed markers on top of the robots, which were used to calculate the pose of each camera in relation to a common coordinate system (using a single ceiling-mounted camera). With this pose, the view of the virtual camera was matched to the real camera's view. The students augmented the real camera images with virtual graphics such as the race course and some special items (speed-ups, rockets, etc.). The goal was to drive four laps as fast as possible, or to hunt down the other player before he could finish the race.
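The project used Ubitrack's marker tracker (mentioned in the next paragraph), but the core step, recovering a marker's 6-DOF pose from its detected corners in a single camera image, can be sketched with OpenCV's solvePnP. This is a minimal illustration rather than the students' actual code; the marker size, variable names, and calibration inputs are all assumptions:

```python
import numpy as np
import cv2

# Assumed marker edge length in meters; the real markers may differ.
MARKER_SIZE = 0.08

# 3D corners of the marker in its own coordinate system (z = 0 plane).
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

def marker_pose(corners_px, camera_matrix, dist_coeffs):
    """Pose of one robot's marker relative to the ceiling camera.

    corners_px: 4x2 float32 array of detected corner pixels, in the
    same order as OBJECT_POINTS. Returns a 4x4 homogeneous transform,
    or None if the solver fails.
    """
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, corners_px,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 matrix
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = tvec.ravel()
    return pose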
Both the calibration of the tracking system and the tracking itself were done using Ubitrack and its MarkerTracker component, developed at our chair.

We think this game is special for several reasons:
– The game is a two-player game, so the two players share the same virtual (and real) world.
– There are no markers in the camera view of the player, since the camera showing the graphics and the camera doing the tracking are not the same (see the sketch after this list).
– The (real) robots are overlaid with car models in the camera images, so you do not get the impression of playing against LEGO robots. (OK, sometimes you can see a robot due to tracking lag, or because it is not completely overlaid.)
– And best of all: it was playable, not just a concept :-)
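A note on the markerless player view mentioned in the second bullet: because the ceiling camera does the tracking, the pose of each robot's onboard camera is obtained by chaining rigid transforms, namely the ceiling camera's pose in the world, the marker's tracked pose, and a fixed marker-to-onboard-camera offset measured once during calibration. A minimal sketch with 4x4 homogeneous matrices (the names are my own, not Ubitrack's):

```python
import numpy as np

def view_camera_in_world(ceiling_cam_in_world,   # calibrated once
                         marker_in_ceiling_cam,  # tracked every frame
                         view_cam_in_marker):    # fixed offset, measured once
    """Chain rigid transforms: world <- ceiling camera <- marker <- view camera.

    Each argument is a 4x4 homogeneous matrix. The result is the pose
    used for the virtual camera, which is why the player's own camera
    never needs to see a marker.
    """
    return ceiling_cam_in_world @ marker_in_ceiling_cam @ view_cam_in_marker
```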

Good job, Christian Waechter, Eva Artinger, and Markus Duschl. I think you'll have a career in augmented reality.

The Multi-Sensor Problem

Sensor systems like cameras, markers, RFID, and QR codes are usually used as single, isolated methods to align our augments. One challenge for a ubiquitous computing environment will be meshing the various available sensors together so that computers have a seamless understanding of the world.

This video from Technische Universität München shows us how a multi-sensor system works. It appears to be from 2008 (or at least the linked paper is). Here's the description of the project:

TUM-FAR 2008: Dynamic fusion of several sensors (gyro, UWB Ubisense, flat marker, ART). A user walks down a hallway and enters a room, seeing different augmentations as he walks on: a sign at the door and a sheep on a table.

In the process, he is tracked by different devices, some planted in the environment (UWB, ART, paper marker) and some carried along with a mobile camera pack (gyro, UWB marker, ART marker). Our Ubitrack system automatically switches between different fusion modes depending on which sensors are currently delivering valid data. As a consequence, the stability of the augmentations varies a lot: when high-precision ART-based optical tracking is lost (outside the ART tracking range, or the ART marker covered by a bag), the sheep moves off the table. As soon as ART is back, the sheep is back in its original place on the table.

Note that the user does not have to reconfigure the fusion setup at any point in time. An independent Ubitrack client continuously watches the current position of the user and associates it with the known ranges of the individual trackers, reconfiguring the fusion arrangements on the fly while the user moves about.
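Ubitrack's actual data-flow reconfiguration is considerably more sophisticated, but the fallback behavior described above, always trusting the most precise tracker that currently delivers valid data, can be sketched in a few lines. All names and the precision ranking here are illustrative assumptions, not Ubitrack's API:

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Pose = Tuple[float, float, float]  # toy stand-in for a full 6-DOF pose

@dataclass
class Tracker:
    name: str
    precision_rank: int          # lower = more precise (ART beats UWB, etc.)
    pose: Optional[Pose] = None  # None when the tracker has no valid data

def fused_pose(trackers: Sequence[Tracker]) -> Optional[Tuple[Pose, str]]:
    """Return the estimate from the most precise tracker with valid data."""
    for tracker in sorted(trackers, key=lambda t: t.precision_rank):
        if tracker.pose is not None:
            return tracker.pose, tracker.name
    return None

# Example frame: the ART target is covered, so coarse UWB takes over.
# As soon as ART delivers data again, it wins and the sheep snaps back.
trackers = [
    Tracker("ART", 0),                        # occluded this frame
    Tracker("paper marker", 1),               # out of the camera's view
    Tracker("UWB", 2, pose=(4.2, 1.1, 1.6)),  # coarse but available
]
print(fused_pose(trackers))  # ((4.2, 1.1, 1.6), 'UWB')
```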

The project brings up an interesting question: is anyone working with multi-sensor systems? We know we'll need a mix of GPS, local image recognition, and markers to achieve our goals, but is anyone working on this complex problem for a real product? We've seen good image recognition with Google Goggles or SREngine, and GPS/accelerometer-based AR is popular, but I'd like to see an app use both to achieve its aims.

If you're working on a multi-sensor project, we'd love to hear about it at Games Alfresco.