Weekly Linkfest

I hope you are ready; here's another linkfest:

  • Tish Shute talks with Brady Forrest of Where 2.0 about commercial AR models, whether this will be the year of augmented reality, and the breakthroughs of 2009 (and, of course, ARWave)
  • RealVision.ae calls for using augmented reality to teach history. That's a winning proposal, though you probably won't be able to visit the actual locations to "AR-experience" most history lessons (unless your school is super rich).
  • The guys responsible for Yelp's Monocle give a lecture at Stanford's iPhone application development class.
  • SparkView is a mobile AR browser for tracking people.
  • First game based on the Layar platform is demoed at the Mobile World Congress.
  • GlyphPlayer is a one stop shop for all your marker based AR needs (free for non-commercial use)
  • And, for the Chinese New Year, you can now download an iPhone application showing where you can drink Tiger beer in London's Chinatown. Apparently it's the year of the tiger, but I don't know if that's a coincidence or not.

There are a lot more links piling up, but unfortunately I'm quite busy until the end of this month.
This week's video is a demo of a game called Sky Siege (link to App Store) that will set you back $3. I think it's the best-looking pseudo-AR game we've seen yet; too bad its website looks like it was built for GeoCities.

Happy year of the tiger and valentine to all our readers!

Augmented Maps with Photosynth

I don't have anything intelligent to say about it, but I was compelled to post it. I'm a big Photosynth fanboy, and have always believed it has great potential for augmented reality. In this talk from last Thursday at TED, we see that Microsoft has similar ideas in mind. I would expect more surprises coming from Redmond in 2010.

You can read more details about it over at MIT Technology Review or on Bing’s blog.

Fancouver at the Winter Olympics

Yahoo! and augmented reality leader Total Immersion have come up with some nifty ways to bring consumers into the action at the world’s largest winter sporting event.  Yahoo!’s “Fancouver” exhibit enables passers-by to insert themselves into the festivities in a host of guises.   Kicking off yesterday, Feb. 12, Fancouver features an entertaining and versatile digital out-of-home display, with dual windows that use augmented reality (AR) face tracking and tracking to a brochure, respectively, to give fans a distinctly different view of the proceedings.

Today Stop-Motion Animation, Tomorrow Augmented Reality

Check out what two guys can do with stop-motion and 222 T-shirts.

Now imagine what you could do with augmented reality and just two T-shirts.

Metaio at Mobile World Congress 2010

In addition to the other AR happenings at Mobile World Congress 2010, Metaio will be featured at “Creation Day” with Sony Ericsson.  They’ll be showing off their latest feature tracking technology on a new Sony Ericsson device.

Peter Meier, the CTO of Metaio, will also be giving a speech on Wednesday at 4pm within the session "Mobile Innovation — A Vision of 2020." This session will "take a visionary look into the services and applications that mobile communication will provide in 10 years' time and the impact they will have on the way we live and communicate in 2020. The latter half of this session will look at Augmented Reality."

Thanks Jan from Augmented Blog for the update.  He promises some exciting releases and a movie after MWC2010 has concluded.

And once again, we won’t be able to attend – so if you’re there – keep us updated about your experience.


Augmented Reality at the Mobile World Congress

Next week, February 15-18th, will be the Mobile World Congress in Barcelona.  There will be a variety of AR related events during the MWC.

AR Showcase

Christine Perey has organized an AR Showcase on Wednesday, February 17th from 5:00-7:00, so AR companies can demonstrate their services and products to customers. Designers will also have a chance to compare and contrast their products with the competition. A number of companies have already confirmed their attendance.

You can find the Showcase in the northeast corner of the courtyard.  Announcements for the AR showcase can be tweeted to #arshow (changed for length.)

The Mobile AR Summit is an invitation-only event. If you're interested in joining, please contact Christine Perey at cperey@perey.com. More information can be found here.

Other Related Events

Navteq Challenge
Sunday, 14.2.2010
Poble Espanyol, City Hall
Wikitude Drive is an augmented reality navigation system, and one of the 10 finalists in this year's Navteq Challenge.
http://www.nn4d.com/site/global/market/lbs_challenge/about/emea/home/p_home.jsp

Mobile Premier Award in Innovation
Monday, 15.2.2010, 15:00 to 20:00
Petit Palau of Palau de la Musica
Mobilizy, together with Wikitude, is one of the 20 finalists for the "Mobile Premier Award in Innovation". Martin Lechner, CTO of Mobilizy, will present.
http://www.mobilepremierawards.com/

AR Summit

Wednesday, 17.2. 13:00 to 19:00
Location: to be announced.
Mobilizy CTO Martin Lechner presents a position paper, "ARML, an Augmented Reality Standard". ARML is currently being reviewed by the W3C (World Wide Web Consortium). At 17:00 there will be a Wikitude Showcase presentation.

We won’t be able to attend – so if you’re there – keep us updated about your experience.


Augmented Makeup Gets a Little Bit Better

A couple of months ago, I wrote about a Korean magic mirror that lets you try on makeup. I thought the results were less than satisfactory:

Well, the Japanese have tamed the virtual makeup down. Created for cosmetic giant Shiseido by Fujitsu, the following application can be found in kiosks across Tokyo:

More details on Engadget

And, if you are not lucky enough to live in East Asia, Walmart and the British drugstore chain Boots are piloting the following kiosk by EZFace. Unfortunately, it works on a static image and doesn't augment a live video feed, so it's not really AR.

The Multi-Sensor Problem

Sensor systems like cameras, markers, RFID, and QR codes (among others) are usually used individually to align our augments. One challenge for a ubiquitous computing environment will be meshing the various available sensors together so computers have a seamless understanding of the world.

This video from the University of Munich shows us how a multi-sensor system works. It appears to be from 2008 (or at least the linked paper is). Here's the description of the project:

TUM-FAR 2008: Dynamic fusion of several sensors (Gyro, UWB Ubisense, flat marker, ART). A user walks down a hallway and enters a room, seeing different augmentations as he walks on: a sign at the door and a sheep on a table.

In the process, he is tracked by different devices, some planted in the environment (UWB, ART, paper marker), and some carried along with a mobile camera pack (gyro, UWB marker, ART marker). Our Ubitrack system automatically switches between different fusion modes depending on which sensors are currently delivering valid data. In consequence, the stability of the augmentations varies a lot: when high-precision ART-based optical tracking is lost (outside ART tracking range, or the ART marker covered by a bag), the sheep moves off the table. As soon as ART is back, the sheep is back in its original place on the table.

Note that the user does not have to reconfigure the fusion setup at any point in time. An independent Ubitrack client continuously watches the current position of the user and associates it with the known range of individual trackers, reconfiguring the fusion arrangements on the fly while the user moves about.
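The switching behavior the quote describes (prefer the most precise tracker that is currently delivering valid data, fall back when it drops out) could be sketched roughly like this. The sensor names, priority order, and data structures here are hypothetical illustrations, not Ubitrack's actual API:

```python
# Hypothetical sketch of priority-based sensor fusion switching,
# in the spirit of the Ubitrack demo described above (not its real API).

from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Reading:
    sensor: str
    pose: Tuple[float, float, float]  # rough (x, y, z) position estimate
    valid: bool                       # is the sensor delivering usable data?

# Most precise trackers first: optical ART, then paper marker, UWB, gyro.
PRIORITY = ["ART", "marker", "UWB", "gyro"]

def fuse(readings: Dict[str, Reading]) -> Optional[Reading]:
    """Return the reading from the most precise sensor with valid data."""
    for name in PRIORITY:
        r = readings.get(name)
        if r is not None and r.valid:
            return r
    return None  # no tracker available: the augmentation drifts or disappears

# Example: ART is occluded (marker covered by a bag), so the system
# falls back to the flat paper marker instead of the noisier UWB fix.
readings = {
    "ART": Reading("ART", (0.0, 0.0, 0.0), valid=False),
    "marker": Reading("marker", (1.02, 0.98, 0.0), valid=True),
    "UWB": Reading("UWB", (1.1, 0.9, 0.1), valid=True),
}
best = fuse(readings)
print(best.sensor)  # falls back to "marker"
```

A real system like the one in the video would blend overlapping sensors and smooth the handover rather than hard-switch, but the core idea, reconfiguring on the fly based on which sensors are valid right now, is the same.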

The project brings up an interesting question. Is anyone working with multi-sensor systems? We know we'll need a mix of GPS, local image recognition, and markers to achieve our goals, but is anyone working on this complex problem for a real product? We've seen good image recognition with Google Goggles or SREngine, and GPS/accelerometer-based AR is popular, but I'd like to see an app use both to achieve its aims.

If you're working on a multi-sensor project, we'd love to hear about it at Games Alfresco.

Sportpong – It’s Fun Being a Paddle

In Switzerland you can play Pong. Yeah, I know, you've been able to play Pong for about 30 years all around the world, but you could never play it like this: outside, with your legs serving as paddles.

It's nothing new either; you've been able to rent the setup for the game for at least a couple of years. The company behind it writes:

The setting is very simple: a reflector on each foot is the only physical tool to interact with Sportpong. The interface is integrated in the field which is projected on the floor. The players control the game with their feet, nothing else. This control is intuitive, naturalistic and very direct.

I really, really can't wait to try it out. Last year I had a session of Atari Pong (my first in twenty years) and enjoyed it immensely. This looks even better. It would be great to have it at ARE or ISMAR.
More details on sportpong.ch via SwissMiss.

Weekly Linkfest

The time has come for another fun-packed linkfest.
But before we begin, a special message to those of you who like beer as much as AR, and live in the vicinity of Munich. Toby of augmented.org is inviting you to the first Munich AR regulars’ table. You can find more details here, and don’t forget to take pictures. Feel free to tell me about your own AR event/meetup!

And now, for the links:

There were a couple of other interesting things happening this week. Don't you worry, I'll have a dedicated post about them in the very near future. In the meantime, this week's video comes to us via @chrisgrayson. It's very cool, but unfortunately, I don't understand how it was done. Yes, ARToolKit was involved, but how was the robot augmented to fire rockets? What provided the distance readings? If any of you read Japanese and can shed light on those questions, please do so in the comments:

Have a great week!