AR Goes Underground

That is, to London’s subway system. Most tourist-facing augmented reality applications focus on landmarks. But there’s an even more important issue affecting a tourist’s visit to an unknown city – how to get around. The city’s residents know perfectly well where the closest subway station that serves their needs is, but a tourist must constantly consult maps and look around her.
To the rescue comes AcrossAir, a British mobile application development company. They have created a simple yet useful iPhone application that shows you where the nearest tube station is. When held horizontally, the application behaves much like a map, but when the phone is tilted upwards, bubbles signifying the stations’ presence are overlaid on the video feed. Obviously, this only works on the new iPhone 3GS, since it requires a compass reading.
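The tilt-based switch between map mode and AR mode can be done with nothing more than the accelerometer: when the phone lies flat, gravity points along the device’s z axis; when it is held upright, the tilt angle approaches 90 degrees. Here is a minimal sketch of that idea – the function name, axis conventions, and threshold are my own illustrative assumptions, not details of the AcrossAir app:

```python
import math

def display_mode(ax, ay, az, threshold_deg=45.0):
    """Pick 'map' vs 'ar' from a (hypothetical) accelerometer sample.

    ax, ay, az: gravity components in device coordinates (any unit).
    When the phone lies flat on its back, gravity is almost entirely
    along the z axis, so the tilt angle is near 0; held upright, the
    z component vanishes and the tilt approaches 90 degrees.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    return "ar" if tilt > threshold_deg else "map"
```

In practice one would smooth the accelerometer samples (e.g. a low-pass filter) and add hysteresis around the threshold, so the UI doesn’t flicker between modes when the phone is held near 45 degrees.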

Here are some obvious further directions, off the top of my head, that this kind of app could follow:

  • Add more cities around the world; New York should probably be the first.
  • Add more public transportation options, such as buses and regular overground trains.
  • Add route planning, so the application will also recommend which station best suits your needs.
  • Some real-time info would be great, like knowing when the next train is due.
  • Make it work underground (using cellular tower triangulation in lieu of a GPS read), so commuters can be advised where to get off a train, and which line they should take next.
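At the heart of all of these ideas sits a simple primitive: given the user’s coordinates, find the closest station by great-circle distance. A minimal sketch, using the haversine formula and approximate coordinates for two London stations (the station list and its coordinates here are illustrative, not the app’s actual data):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_station(user_lat, user_lon, stations):
    """stations: iterable of (name, lat, lon); returns the closest entry."""
    return min(stations, key=lambda s: haversine_km(user_lat, user_lon, s[1], s[2]))

# Example with rough coordinates:
stations = [("Oxford Circus", 51.5152, -0.1419),
            ("Covent Garden", 51.5129, -0.1243)]
```

A real implementation would of course also weigh in which lines each station serves, which is where the route-planning suggestion above comes in.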

So there’s much room to innovate, even in such a niche application.

IBM Serves an Augmented Reality Wimbledon

While SPRXMobile enables you to experience an augmented version of the Netherlands (and the whole world in the future), IBM has more modest goals in mind. Available from next Monday on the Android store, IBM’s Seer Android Beta will add some AR magic to the Wimbledon tennis tournament. Using it, visitors can find facilities on the grounds (locating the nearest restrooms), but more impressively, they can “point the phone at a tennis court, find out the court number and also who’s playing and more crucially, who’s winning”.

As with Layar, IBM Seer Android Beta lets you choose between different layers of metadata (and it seems that unlike Layar, one can choose to see more than one layer at a time). Moreover, the non-specialized name of this app (i.e. not calling it “IBM Augmented Wimbledon”) suggests that IBM may have a deeper interest in augmented reality location-based services. Time will tell if my hunch here is correct.

More details can be found here.
(via PicturePhoning.com)

Layar is Out

Surely, you have already read about it at Gizmodo, at ReadWriteWeb, or at any of the millions of other websites that reported the news today. However, if you are an avid reader of this blog, you have known about Layar, “the first augmented reality browser”, for at least two weeks (see link for further details).

So luckily, all that is left for me to do is (1) embed this new video demoing Layar:

(2) Quote from the press release:

Mobile innovation company SPRXmobile launches Layar, the world’s first mobile Augmented Reality browser, which displays real-time digital information on top of reality in the camera screen of the mobile phone. While looking through the phone’s camera lens, a user can see houses for sale, popular bars and shops, jobs, healthcare providers and ATMs.

The premier launch is for the Dutch market. … Layar will be launched per country with local content partners in order to guarantee relevant results for the end user. SPRXmobile is planning further roll-outs, together with local partners, in Germany, the UK and the United States this year. SPRXmobile will continue with regular releases of new layers after each local launch. The Layar application will be available via the Android Market. Other handsets and operating systems are in development with a prime focus on the iPhone 3G S.

(3) Tell you that it will be available tomorrow on the Android Market
(4) And once again wish good luck to Maarten, Raimo and friends from SPRXMobile.

Gamaray’s AR Explorer is Online

Since one augmented reality framework per week is not enough, here comes another one for Google’s Android. While other Android AR applications provide information about landmarks seen through your mobile’s camera, Gamaray’s AR Explorer shows virtual 3D objects not visible to the naked eye. Obviously, the technology is in its infancy, and it’s quite a bold move on Gamaray’s part to release its application at such an early stage:

Right now, Gamaray is focusing on utilizing its framework for building multiplayer games, the first one being a tank combat game. Founder Clayton Lilly admits that “For a while we thought of creating a more general purpose AR platform, but I’m concerned that Google may already be developing a first person AR viewer for KML data and 3D models”. I, for one, root for the smaller companies in this new ecosystem, so good luck guys!

(link)

Layar is Online

Layar is a new augmented reality Android framework that comes from SPRXMobile. SPRXMobile, which previously brought us the ATM finder and this excellent post about the AR hype cycle, has kicked it up a notch with a full-blown AR platform.
SPRXMobile don’t provide many details quite yet (they’re saving those for Mobile 2.0), but here’s what was made public on their site:

  • It will be available for download on the Android Market before the first of July. However, at launch, the service itself will only be available in the Netherlands (one more reason to visit!)
  • Points of interest are shown on top of the video input using graphical symbols; their interpretation (the text describing them) is shown out of band, at the bottom of the screen.
  • You will be able to choose between different content layers. Companies will be able to create their own layers (e.g. a layer whose points of interest are Starbucks).
  • “Currently several companies have already signed up for Layar and will publish their own content in their branded layer soon.”

My educated guess is that they are using the compass+GPS combo to identify points of interest. First, it works only on Android devices since “they are the only devices with a compass”. Then, “We have a little indicator showing you the accuracy of the location positioning”, which could be avoided using computer vision. If that’s the case, the main difference between them and Wikitude is having many content layers (which also justifies their “first mobile browser” slogan).
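The compass+GPS approach I’m guessing at boils down to this: compute the bearing from the user to each point of interest, compare it against the compass heading, and draw a symbol for any POI that falls inside the camera’s horizontal field of view. A minimal sketch of that logic (the function names, the 60-degree field of view, and the sample coordinates are my assumptions, not anything SPRXMobile has published):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def visible_pois(user_lat, user_lon, heading_deg, pois, fov_deg=60.0):
    """Return (name, offset) for POIs inside the camera's horizontal field of view.

    pois: iterable of (name, lat, lon).
    offset is the signed angle from the screen centre, in degrees; it maps
    almost linearly to a horizontal pixel position for drawing the symbol.
    """
    half = fov_deg / 2.0
    out = []
    for name, lat, lon in pois:
        delta = (bearing_deg(user_lat, user_lon, lat, lon)
                 - heading_deg + 180.0) % 360.0 - 180.0  # wrap to (-180, 180]
        if abs(delta) <= half:
            out.append((name, delta))
    return out

# Example: standing in Amsterdam, facing due north (heading 0).
pois = [("North POI", 52.38, 4.89), ("East POI", 52.37, 4.91)]
```

This also explains the accuracy indicator: GPS error shifts every computed bearing, so the drawn symbols can land tens of pixels away from the real object, which is exactly the kind of drift a computer-vision approach would correct.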

Whatever the case, seeing SPRXMobile’s previous projects, I’m sure this one will be a tight application with a lot of promise. Good luck guys! (Raimo, Maarten, feel free to comment.)

(link)

New SREngine Video

Sein has just posted a new video on his blog (in Japanese, though an English version is apparently in the works). I think it’s really amazing what one man can do on his own:

I’ve covered SREngine before, and so did Ori, and from video to video you can really see how this application takes shape.

Though using image recognition makes it a bit slow (for the time being) in comparison to systems based purely on GPS and compass positioning, it allows it to identify smaller things, at shorter distances and in close quarters. I really can’t wait to see it available on the appstore.

X-Ray Vision via Augmented Reality

The Wearable Computer Lab at the University of South Australia has recently uploaded three demos showing some of its researchers’ work to YouTube. Thomas covered one of those, AR Weather, but fortunately enough, he left me with the more interesting work (imho).
The next clip shows a part of Benjamin Avery’s PhD thesis, exploring the use of a head-mounted display to view the scenery behind buildings (as long as they are brick-walled buildings). If I understood correctly (and I couldn’t find the relevant paper online to check this), the overlaid image is a three-dimensional rendition of the hidden scene, reconstructed from images taken by a previously positioned camera.

The interesting thing here is that a simple visual cue, such as the edges of the occluding items, can have such a dramatic effect on the perception of the augmented scene. It makes one wonder what else can be done to improve augmented reality beyond better image recognition and brute processing power. Is it possible that intentionally degrading the augmented image (for example, making it flicker or tinting it) would make for a better user experience? After all, users are used to seeing AR in movies, where it looks considerably low-tech (think Terminator vision) compared with what we are trying to build today.

Anyway, here you can find Avery himself, presenting his work and giving some more details about it (sorry, I couldn’t embed it here, even after several attempts).