Weekly Linkfest

This is the last linkfest for this year. Though there were many more Christmas spectacles this week, I’m going to keep this linkfest holiday-spirit free (broke my nose, not feeling very festive).

  • Robert Rice on 2010, the first year in the decade of ubiquity – “The point though, is that all of these things calling themselves augmented reality now are just the start. Everyone is getting their feet wet, experimenting, exploring, and beginning to innovate. We can argue about what is or isn’t augmented reality, but it doesn’t really matter. What does matter is the continual push for advancing the technology, the industry, and getting people to start using it”.
  • On the same theme, Edo Segal writes for Techcrunch about the dawn of ambient streams – “Increasingly, we will be sensing the world with this sixth sense and that will change the way we collectively experience the world. Going back to the point made earlier, the watershed event is when we will be experiencing this “ambient sense” without being in a retrieval mode (i.e. not when we go to the computer or our mobile device)”.
  • Whisper Deck is a cool voice-operated AR interface.
  • Jack Benoff of Zugara on what to do in case you are pitched an AR campaign.
  • ReadWriteWeb on Brightkite’s new feature – AR ads.
  • Augmented Planet on Toozla, the self-proclaimed world’s first audio AR browser (I believe Gamaray had audio support as well).
  • EyePly wants to augment your sports events.
  • And Tonchidot released its Sekai Camera browser worldwide.
  • It’s a couple of weeks old, but I finally got to read it – Wired on AR accelerated by Earthmine’s 3D city maps.
  • Point your sneakers at your webcam in order to feel silly. That one should be on Mashable’s list of 10 awesome uses of AR in marketing (what, only one car campaign? Seriously, where’s that GE ad that started the fad?)
  • Denno Coil (AR fans’ number-one anime) gets a mobile AR campaign (via @thomaskcarpente).

This week’s video is of a projected AR system coming to us from the University of Magdeburg, Germany. Though we have seen quite a few systems like it over the past years (even one coming out of Microsoft), I don’t think we have seen any as slick as this one. You can read how it works (magic! infrared markers) at New Scientist (via Augmented Engineering).

Have a great week!

Where 2.0: The World is Mapped – Now Use it to Augment our Reality

O’Reilly’s Where 2.0 event is a tightly run ship: one track for all attendees, fast-paced 20-minute sessions on laser-focused topics.

Low tech location services at Where 2.0


I got my fair share (3 minutes!) to educate the audience about how AR could impact our life, as part of the Mobile Reality panel, covered by Rouli.

A show-of-hands survey confirmed that only 5% of the audience was familiar with the concept of augmented reality before the event. Not too surprising, considering the percentage among the general population is less than 1%.

What came out strongly at the event is that an unbelievable amount of data is being captured about people, places, and things around the world. This data, combined with sophisticated models (such as Sense Networks), results in super-intelligent information about the world that we still don’t really know how to use.

My point is not a shocker: all we need is to tap into this information and bring it, in context, into people’s field of view.
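The “bring it into the field of view” step is straightforward in principle: compare the bearing from the user to a point of interest against the direction the device is facing, and place an overlay accordingly. As an illustrative sketch only (the function names, the flat linear angle-to-pixel mapping, and the 60° field of view are my own simplifying assumptions, not any particular AR browser’s implementation):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(device_heading, poi_bearing, fov_deg=60, screen_w=320):
    """Horizontal pixel position of a point of interest, or None if it is off-screen."""
    # Signed angle between where the camera points and where the POI lies, in (-180, 180]
    delta = (poi_bearing - device_heading + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # outside the camera's field of view
    # Crude linear angle-to-pixel mapping (a real pinhole camera model would use tan)
    return int(screen_w / 2 + delta / (fov_deg / 2) * (screen_w / 2))

# A POI due north of a north-facing device lands at the center of a 320px screen
print(screen_x(0, bearing_deg(0, 0, 1, 0)))  # 160
```

Everything else – GPS jitter, compass noise, tilt, occlusion by buildings – is exactly where the pioneers mentioned below hit their walls.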


For some time now, researchers in the augmented reality community have attempted to leave markers behind and leap into the great world of outdoor AR (alfresco). These pioneers typically hit walls such as low GPS accuracy, the lack of 3D-modeled environments, and the usual device-specific limitations.

Where 2.0 gave stage for two new approaches to map the world that may help overcome the traditional challenges: Earthmine and Velodyne’s Lidar.

Earthmine uses its own camera-based device to index reality, at the street level, one pixel at a time. They have just announced Wild Style City – an application that allows anyone to create virtual graffiti on top of designated public spaces. However, at this point, you can only experience it on a PC!

Why not take advantage of their 3D pixel inventory of the world to make these works of graffiti art available to anyone on the street? All that’s needed is some AR magic and a powerful mobile device.

The second novel approach is Velodyne’s Lidar. Remember Radiohead’s funky laser (as opposed to video) clip?

They did it with Lidar.

Now Velodyne is embarking on a broader mission to map the outdoors. Check out this experiment.

Can AR researchers harness these new approaches to index reality?
