Weekly Augmented Reality Linkfest

Wow, what a busy week. I’ve listed below only a few of this week’s AR-related news stories, just to protect you from information overload. I hope to blog about the other stories in the coming days.

This week’s video is magnificent in its simplicity. Nothing more than a demo of Qualcomm’s AR platform featuring virtual domino bricks, it made me wonder what would happen if they scaled this game up. Anyone in the world could place bricks, and anyone could push a brick and start a worldwide chain reaction, but of the playful kind. A simple game that would cross borders and cultures, or maybe I’m a walking cliché?

Have a beautiful week!

Three Augmented Reality Demos at SIGGRAPH 2010

Earlier this week I reported on an interesting paper accepted to this year’s SIGGRAPH conference, studying how augmented reality makes cookies taste better. However, it’s far from being AR’s only appearance at SIGGRAPH. Here are three more AR demos that took the stage during the conference:

Camera-less Smart Laser Projector
Using a laser beam both for scanning and for projecting images, researchers from the University of Tokyo (where the cookie paper also comes from) gained several advantages over the usual projector-plus-camera setup. Mainly, they eliminated the need to calibrate the camera against the projector, and gained the ability to scan 3D objects without stereoscopic cameras.

Camera-less Smart Laser Projector. Cassinelli, Zerroug, Ishikawa, Angesleva. SIGGRAPH 2010.

QR-Code Calibration for Mobile Augmented Reality
This work by researchers from Kyushu University uses QR codes to fix the user’s position in the world. It also illustrates the problem with academic conferences: by the time your paper gets published, some company may have already deployed similar technology in the real world, as Metaio did with Junaio.

QR-code calibration for mobile augmented reality applications: linking a unique physical location to the digital world. Nikolaos, Kiyoshi. SIGGRAPH 2010.
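To make the idea more concrete, here is a minimal sketch (not the paper’s code) of how a QR code can both identify a physical location and anchor the camera’s pose, using OpenCV’s QR detector and solvePnP. The camera intrinsics and the printed code’s physical size below are assumed values, not taken from the paper.

# Minimal sketch: locate a camera relative to a printed QR code with OpenCV.
# The intrinsics and the QR size are made-up placeholders.
import cv2
import numpy as np

QR_SIZE_M = 0.10  # assumed side length of the printed QR code, in meters

# Assumed pinhole intrinsics for a 640x480 camera (replace with calibrated values)
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# 3D coordinates of the QR corners in the code's own frame (z = 0 plane),
# ordered top-left, top-right, bottom-right, bottom-left
object_points = np.array([
    [0.0,       0.0,       0.0],
    [QR_SIZE_M, 0.0,       0.0],
    [QR_SIZE_M, QR_SIZE_M, 0.0],
    [0.0,       QR_SIZE_M, 0.0],
], dtype=np.float32)

frame = cv2.imread("frame.jpg")  # one video frame from the phone camera
detector = cv2.QRCodeDetector()
payload, corners, _ = detector.detectAndDecode(frame)

if payload:
    # `payload` is the string encoded in the QR code -- e.g. an ID the app
    # maps to a known physical location and its digital content.
    image_points = corners.reshape(-1, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    if ok:
        print("location id:", payload)
        print("camera position relative to the code (m):", tvec.ravel())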

Canon Brings Dinosaurs to Life
It’s not only researchers who come to SIGGRAPH, but companies as well. Canon demoed a head-mounted mixed reality system at its booth, featuring a cute velociraptor-like dino (lacking feathers; they should get some new biology books). Jump to the 25th second to see it in action:

Nothing Like the Smell of a Freshly Augmented Cookie

SIGGRAPH, the world’s most important conference on computer graphics and a pixel-fetishist’s wonderland, is being held this week in Los Angeles, featuring several interesting papers on augmented reality. One of them explores the augmentation of cookies.

Yes, we have all seen cookies marked with augmented reality markers; Games Alfresco mentioned such cookies back in 2008. So how come such a paper got accepted into SIGGRAPH? Well, the simple answer is that you haven’t seen (or rather tasted) such cookies before.

Created by researchers at the University of Tokyo, Meta Cookie combines a head-up visual display with a head-up olfactory display to trick your senses. I’m not quite sure what the goal of that research was, but the researchers were successful in changing how the cookie tasted. Probably some smart marketer will find a way to sell it as a weight-loss device. You just need a way to print a marker on broccoli to make it taste (and look) like ice cream (with a crunchy texture).

Meta Cookie. Narumi, Kajinami, Tanikawa, Hirose. SIGGRAPH 2010.
image via BusinessWire

ARScope: Augmented Reality Through the Crystal Ball

ARScope, originally presented at SIGGRAPH 2008 by the University of Tokyo (yes, the guys behind ARForce), is obviously not a new concept. However, as far as I can see, it got little coverage at the time, and it certainly deserves our attention.

ARScope is an interesting combination of old-world metaphors, such as the magnifying glass and the crystal ball, with a head-mounted display and projected AR. The user holds a handheld device covered with reflective material, onto which an image is projected from a micro-projector worn on the user’s head. Two cameras, one on the handheld device and one on the headset, together with a sophisticated algorithm, are used to calculate the user’s point of view relative to the handheld device, so ARScope can project a suitable image. In the crystal-ball case, it even lets two users see two different perspectives of the augmented world.
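For the curious, the relative-pose part boils down to composing rigid transforms: if each camera’s pose is known in some shared frame, the head’s pose in the handheld device’s frame follows by composition. The snippet below is only a sketch of that geometry with made-up example poses, not the authors’ actual algorithm.

# Sketch of the relative-pose geometry only (not ARScope's real pipeline):
# compose 4x4 rigid transforms to express the head's pose in the handheld frame.
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_world_from_head, T_world_from_hand):
    """Pose of the head expressed in the handheld device's frame."""
    return np.linalg.inv(T_world_from_hand) @ T_world_from_head

# Made-up example poses: head 30 cm along the world z-axis, handheld 20 cm to the side
T_world_from_head = pose_to_matrix(np.eye(3), [0.0, 0.0, 0.3])
T_world_from_hand = pose_to_matrix(np.eye(3), [0.2, 0.0, 0.0])

T_hand_from_head = relative_pose(T_world_from_head, T_world_from_hand)
viewpoint = T_hand_from_head[:3, 3]  # head position in the handheld's frame
print("render the projected image for a viewer at:", viewpoint)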

I can’t understand why no one has commercialized this idea yet. It seems far more natural than an HMD that blocks your peripheral vision.

Another video can be found here, and an interview with one of ARScope’s creators is here. There are more details on the project’s website.

New Video for ARForce

Back when my blog had just a few visitors (only nine months ago), I wrote about ARForce. In a nutshell, ARForce is a concept input device that harnesses infrared to create something akin to a 3D multitouch marker.

Luckily, the University of Tokyo, which is behind ARForce, has just uploaded a new video demoing the use of ARForce in several scenarios, so this cool concept will gain wider exposure:

You can find more details and another video of this device in my original post, or go directly to the project’s homepage.