Posted on April 11, 2011 by rouli
If you are a user of one of the many mobile AR applications, such as Layar, Junaio, Wikitude, or Google Goggles, and would like to help the (academic) research of augmented reality, boy do Markus Salo and Thomas Olsson have an offer for you.
The two researchers from Finland ask you to think about your most satisfying and unsatisfying experiences using AR applications, and about your view on the usefulness of such applications, and then take the following survey. I must say that, imho, mobile AR has not yet created a truly satisfying moment for me, and the enjoyable moments it did create didn't last long (though some of the creative things people do with mobile AR are really mind-boggling).
Best of all, by participating in this survey you enter a raffle to win one of ten Amazon vouchers, worth 50 euros each. Considering that AR is still a niche, your chances of winning are pretty good. Even better, Salo has promised to share the results, so we can all learn from this survey.
Take the survey
Filed under: Uncategorized | Tagged: Research | Leave a comment »
Posted on October 5, 2010 by rouli
If you thought that augmented reality can only place virtual objects in real environments, think again. AR can also be used to 'delete' real objects, making them transparent.
Case in point: Francesco I. Cosco's work presented at ISMAR09 (which reminds me that ISMAR 2010 is less than two weeks away!). In this work, Cosco and other researchers from the University of Calabria, Italy, and Rey Juan Carlos University in Spain tried to add haptic interaction to an augmented environment. The problem is that haptic devices are visually unattractive and aren't really a part of the scene. The solution they came up with was quite 'magical':
More details on Cosco’s home page.
Cosco F. I., Garre C., Bruno F., Muzzupappa M., Otaduy M. A. “Augmented Touch without Visual Obtrusion”. Proceedings of the IEEE International Symposium on Mixed and Augmented Reality 2009.
Filed under: Uncategorized | Tagged: Haptic AR, ISMAR, Research | 3 Comments »
Posted on August 31, 2010 by rouli
Mobile, image-recognition-based augmented reality is very cool, as evident from Popcode's demos we posted yesterday. However, creating the model a mobile phone uses to recognize a new image still requires a desktop computer, hindering real-time creation and sharing of AR content.
Thanks to the work of researchers from the Korean Gwangju Institute of Science and Technology and the Swiss EPFL, this needn’t be the case anymore. In a paper titled “Point-and-Shoot for Ubiquitous Tagging on Mobile Phones” accepted to ISMAR 2010, they present a method to scan surfaces and create “recognition-models” by using your phone (no data is sent to a remote server).
You don't even need to take a perfect straight-on picture. As the video below shows, this means you can augment hard-to-reach surfaces. Best of all, you can share those models with your friends.
A little more detail is available on Wonwoo Lee's blog.
Filed under: Uncategorized | Tagged: EPFL, GIST Korea, Handhelds and Cellphones, ISMAR, Markerless, Research | Leave a comment »
Posted on July 30, 2010 by rouli
Earlier this week I reported on an interesting paper accepted to this year's SIGGRAPH conference, studying how augmented reality makes cookies taste better. However, it's far from the only appearance of AR at SIGGRAPH. Here are three more AR demos that took the stage during the conference:
Camera-less Smart Laser Projector
Using a laser beam both for scanning and for projecting images, researchers from the University of Tokyo (where the cookies paper also comes from) gained several advantages over the usual projector-plus-camera setup. Mainly, they eliminated the need to calibrate between the camera and the projector, and gained the ability to scan 3D objects without stereoscopic cameras.
Camera-less Smart Laser Projector. Cassinelli, Zerroug, Ishikawa, Angesleva. SIGGRAPH 2010
QR-Code Calibration for Mobile Augmented Reality
This work by researchers from Kyushu University uses QR codes to fix the user's position in the world. It also shows you the problem with academic conferences: by the time your paper gets published, some company may have already deployed similar technology in the real world, as Metaio did with Junaio.
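The basic idea is simple enough to sketch in a few lines: a QR code at a known physical spot encodes that spot's location, so decoding it gives the phone an absolute position fix without GPS drift. The snippet below is just a toy illustration of that linking step, assuming the payload is a "geo:" URI; the actual payload format and pipeline used in the paper may differ.

```python
def parse_geo_payload(payload: str) -> tuple[float, float]:
    """Parse a 'geo:lat,lon' URI (as decoded from a QR code)
    into a (latitude, longitude) pair that can anchor AR content."""
    if not payload.startswith("geo:"):
        raise ValueError(f"not a geo URI: {payload!r}")
    lat_str, lon_str = payload[len("geo:"):].split(",", 1)
    return float(lat_str), float(lon_str)

# Example: a QR code placed at the Eiffel Tower
lat, lon = parse_geo_payload("geo:48.8584,2.2945")
print(lat, lon)  # 48.8584 2.2945
```

In a real application the decoded fix would then seed the AR view's pose, replacing (or correcting) the noisy GPS estimate.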
QR-code calibration for mobile augmented reality applications: linking a unique physical location to the digital world. Nikolaos, Kiyoshi. SIGGRAPH 2010.
Canon brings dinosaurs to life
Not only researchers come to SIGGRAPH, but companies as well. Canon demoed head-mounted mixed reality at its booth, featuring a cute velociraptor-like dino (lacking feathers; they should get some new biology books). Jump to the 25th second to see it in action:
Filed under: Uncategorized | Tagged: Canon, Kyushu University, Research, SIGGRAPH, University of Tokyo | 3 Comments »
Posted on July 26, 2010 by rouli
SIGGRAPH, the world's most important conference on computer graphics and a pixel-fetishist's wonderland, is being held this week in Los Angeles, featuring several interesting papers on augmented reality. One of them explores the augmentation of cookies.
Yes, we have all seen cookies marked with augmented reality markers; Games Alfresco mentioned such cookies back in 2008. So how did such a paper get accepted into SIGGRAPH? Well, the simple answer is that you haven't seen (or rather tasted) such cookies before.
Created by researchers at the University of Tokyo, Meta Cookie combines a head-up visual display with a head-up olfactory display to trick your senses. I'm not quite sure what the goal of that research was, but the researchers were successful in changing how the cookie tasted. Probably some smart marketer will find a way to sell it as a weight-loss device. You just need a way to print a marker on broccoli to make it taste (and look) like ice cream (with a crunchy texture).
Meta Cookie. Narumi, Kajinami, Tanikawa, Hirose. SIGGRAPH 2010.
image via BusinessWire
Filed under: Uncategorized | Tagged: head up display, olfactory, Research, SIGGRAPH, University of Tokyo | Leave a comment »
Posted on June 29, 2010 by rouli
Addicted to Farmville? Have a green thumb but no garden? Envy real farmers but got allergies?
The guys from TU Munich have the perfect solution for you:
Augmented Farmville could be one heck of a layer for Layar/Junaio/Wikitude once better positioning is available. Think of the gold rush to get a virtual plot in major cities; imagine Times Square as a flower bed! Feeding real meteorological data into those virtual farms would add another interesting and educational twist. It may be the most stupid idea I have ever featured on this blog, but then again, nobody would have guessed that a farm simulator would become one of the most successful games of 2010.
Filed under: Uncategorized | Tagged: head up display, Research, TU Munich, Visible Markers | 2 Comments »
Posted on June 7, 2010 by rouli
Here's a fun video showing off the results of a paper by students from Imperial College London back in 2008.
The abstract says:
Accurate estimation and tracking of dynamic tissue deformation is important to motion compensation, intra-operative surgical guidance and navigation in minimally invasive surgery. Current approaches to tissue deformation tracking are generally based on machine vision techniques for natural scenes which are not well suited to MIS because tissue deformation cannot be easily modeled by using ad hoc representations. Such techniques do not deal well with inter-reflection changes and may be susceptible to instrument occlusion. The purpose of this paper is to present an online learning based feature tracking method suitable for in vivo applications.
In other words, they are augmenting a live beating heart, muhahaha!
Soft Tissue Tracking for Minimally Invasive Surgery: Learning Local Deformation Online. Peter Mountney and Guang-Zhong Yang.
Filed under: Uncategorized | Tagged: Imperial College, Research | Leave a comment »