Can Google’s G1 do augmented reality better than the iPhone?

The iPhone hype still rules the augmented reality device charts, but as Walter Mossberg claims in his in-depth test drive:

that will all change on Oct. 22, when T-Mobile and Google bring out the G1, the first hand-held computer that’s in the same class as Apple’s iPhone.

Google's G1

iPhone

TechCrunch has its own view on the comparison.

Here’s my quick comparison of the two devices through an augmented reality lens:

They both have similar screen quality (480×320, 65K colors), a nice touch screen, similar CPU speed, a GPU for graphics acceleration, accelerometers for sensing movement, and both sit in the under-$200 price category.

The G1’s screen is reportedly smaller than the iPhone’s (3.2” compared with 3.5”), and the device is bulkier (5.6 ounces to 4.7) and much thicker; but it has a higher camera resolution (3.1 MP compared with 2 MP), though, like the iPhone, it can’t record video.

A downside for developers is the very low memory allocated to third-party apps (128 megabytes) and the 1 GB of storage space (expandable up to 8 GB) – limits that will seriously irritate developers.

On the positive side it has 5 buttons, a real keyboard (you care?), a compass… and most importantly it’s built on Android, an open (yet unproven) operating system – which means it’s easier to adapt to the specific needs of augmented reality applications. On the flip side, some developers hate the restriction that comes with Android: you must program in Java.

Bottom line, these are worthy competitors – each with its own advantages and caveats.  The real winner will be determined, as always, based on whoever offers the best content and the best reality experiences.

Remember WIFI ARMY? It was a very promising AR game for Android (it made the #7 spot in my top 10 AR demos). But they went silent… their website is down. If you see something – say something.

The #9 disruptive technology of 2009 according to Gartner

ZDNet just published Gartner’s top 10 disruptive technologies for 2009:

  1. Multicore and hybrid systems
  2. Virtualization and fabric computing
  3. Social networking
  4. Cloud computing
  5. Web mashups
  6. User interface
  7. Ubiquitous computing
  8. Semantics
  9. Augmented reality
  10. Contextual computing

But keep in mind we are looking at a serious downturn, which means only a few of the top 10 will actually get funded in 2009. The remaining technologies on the list (including #9) are dubbed by Gartner:

“projects to ponder in the future…”

Well, after all they are talking to CIOs and IT execs who need to play it safe: avoid technical disasters, keep the business side satisfied, and stay within budget – not the ideal breeding ground for disruptors…

Still, a nice tip of the hat by Gartner to the field of augmented reality, and it did raise some eyebrows among industry pundits. Matt Asay was so dazzled that he called his article:

“Gartner’s ‘augmented reality’ on IT spending”

They say any publicity is good publicity, and this case is no exception.

On second look at the list it hits me: we will actually need every single one of these 10 technologies to design a great augmented reality experience.

Better Than Reality Project: A Look Behind the Augmented Reality Art Scene

In my pursuit of the ultimate augmented reality game, Jonas Hielscher popped up on my radar. Jonas recently completed a residency at the Better Than Reality project for artists at the V2_Lab. I couldn’t resist asking him about his experience. Here are the results of our discussion.

games alfresco: Hey Jonas, thanks for taking a break from your art and joining us for a quick conversation; how about a quick introduction?

Jonas: Hi Ori, thanks for your interest in my work!

About me: I am a German artist, working as an assistant professor at the Academy of Media Arts Cologne in the field of 3D and interaction. In my artistic work I develop experimental games and media installations that explore the boundaries between the real and the virtual world (portfolio).

This year, my specific focus lies in exploring the possibilities of augmented reality. At the beginning of this year I gave an augmented reality game workshop at Mediamatic in Amsterdam, together with Julian Oliver. There we helped the participants develop small game ideas with ARToolKit: a software library for building augmented reality applications using physical tracking markers.

Mediamatic Workshop

Back in Cologne, my students at the Art Academy were so enthusiastic about the results that they also developed a small ARToolKit project for the annual school exhibition.

Cologne Art Academy School project

And now I have just finished 6 weeks of intensive work with the augmented reality system they developed at V2_Lab, Rotterdam.

V2_Lab Residency: Last Pawn

games alfresco: Tell us a little bit about the Better Than Reality project, how did it start? What was its objective?

Jonas: Two years ago V2_lab started developing the first version of their augmented reality system in collaboration with the artist Marnix de Nijs (NL). That resulted in the first public user test of a project called Exercise in Immersion 4 during the Dutch Electronic Art Festival 2007 (DEAF07).

After this presentation they went on refining the system and invited three artists-in-residence, who were chosen to explore various subtopics: Marnix de Nijs (NL) continues his Exercise in Immersion 4 project, Boris Debackere (BE) researches spatial sound, and my focus was on the 3D and visual aspects of augmented reality. In the fall of 2008, a series of workshops will be given, open to (art) students, artists and other professionals who are interested in the artistic use of augmented reality.

V2_Lab has developed a software/hardware platform called VGE (V2_ Game Engine), based on Ogre3D with Scheme, OpenAL, Blender, ultrasound positioning and SIOS (a Sensor Input/Output System developed at V2_Lab). The user wears a head-mounted display that shows a mix of 3D visuals and real-world video taken by a head-mounted camera.

Within this framework, I developed two small experimental projects during my residency. Inspired by the novel ‘Through the looking glass’ by Lewis Carroll, I wanted to create a dreamlike mixed reality environment, where real and virtual merge in an absurd experience. An important aspect for me was the connection between the virtual and the real elements.

In the first project, Last Pawn, a real table with a chess pawn stands in the middle of the room. Around the table, three virtual windows hang in the air, and a virtual avatar stands in the middle of the room. When the participant moves the pawn on the table, the avatar walks to a position in space. If the avatar ends up in front of a window, he opens it and a new virtual space appears behind it. In this way, different spaces and atmospheres can be experienced. Additional real theater lighting is controlled by the system to support the experience.

Last Pawn 1

Last Pawn 2

Last Pawn 3

The second project, Human Sandbox, was developed as a side experiment. Here, one participant equipped with the AR system stands in the middle of the room while another participant places several physical objects on a table; these objects are tracked and manipulate the virtual objects seen by the first participant. With this setup we were able to test and experiment with different objects much more easily. An interesting result of this approach was that game logic could be developed without having to program it: you could play ‘Hide and Seek’ just by placing and moving objects on the table.

Residency Sandbox

games alfresco: what was your first encounter with augmented reality?

Jonas: During the ARS Electronica Festival 2004, I saw several augmented reality projects. One very interesting piece was Augmented Fish Reality by Ken Rinaldo. Instead of augmenting the reality of a human, Ken augments the reality of a fish. Here, five Siamese fighting fish live in separate rolling robotic fish bowls controlled by the fish themselves. The fish can steer their bowls and come closer to other fish. They can see and communicate with each other, but always have to stay in their own bowls. It is fascinating to watch and observe the fishes’ behavior. You could say they were augmenting their own reality.

games alfresco: in the future, AR is poised to have a major role in the way we interact with the world;  what does it mean for the future of art?

Jonas: In our highly technological Western world of ubiquitous computing and the so-called “Internet of Things”, where RFID and other embedded applications are part of our daily environment, we can question whether we already live in augmented realities. The augmentation of reality has become invisible through all kinds of services, such as those for mobile phones or our biometric passports.

I think one goal of artists working with these technologies should be to make things visible again, so that their effects can be reflected upon and discussed.

games alfresco: AR seems to get more and more attention in the art world; what do you think has captured the imagination of artists?

Jonas: I think one major fascination is being able to directly manipulate the experience of the real world in order to comment, reflect or just create new mixed reality experiences. There is also the fascination with the ability to reveal information about people, objects and places that would normally be concealed by their static structure.

games alfresco: More and more AR demos are surfacing on YouTube and elsewhere; what’s your favorite augmented reality demo?

Jonas: Definitely Julian Oliver’s game Levelhead. It has the beauty of a simple but ingenious idea. In the game the player holds a solid cube in front of a camera, and on-screen it appears that each face of the cube contains a little room. The objective of the game is to navigate a character from room to room by tilting the cube. It is the little magic of holding this small universe in your hands that really fascinates me.

games alfresco: what would be your dream augmented reality device (way to interact with AR)?

Jonas: During my residency I noticed that I am not a fan of head-mounted displays. What you experience is always the limits of the technology (small field of view, low frame rate, heavy equipment, etc.). With an HMD you feel somehow amputated, giving up the perfect view of your own eyes and diving completely into the augmented reality. Personally I believe more in the idea of having a window (like a small hand-held device) into the augmented world. Then you are free to choose and compare between the real and the augmented world.

games alfresco: Totally agree. As you are winding down the residency at Better Than Reality – what are you envisioning as your next project?

Jonas: At the moment I have lots of new ideas and inspirations. I have some new ideas for CollecTic, a game I developed in 2006. I really want to develop a new and different version of it for the iPhone, which I think is a really great platform for mobile mixed reality projects. The initial idea of CollecTic is that you go hunting for Wi-Fi hotspots in your neighborhood and play a kind of puzzle game. What I really like about the game is that you discover the hidden infrastructure of the wireless network coverage through playing. The game also generates auditory and visual feedback from the hotspots. So hopefully I’ll find some time to give the game another boost on the iPhone with some new gameplay ideas.

Furthermore I have some new ideas for other augmented reality projects, but they are still cooking :-)

games alfresco: Wow. Thank you, Jonas. This has been an eye-popping-mind-blowing interview. Can’t wait to see your video documentation of the Better Than Reality project.

***update***
Jonas was able to upload clips of his work: take a look

Nintendo DS Wants to Augment Your Reality

Rumors about a new device that could enter the Augmented Reality game were confirmed yesterday by Nikkei Net. Wired revealed it to the rest of us – thank you very much. The new Nintendo DS model will launch this year in Japan.

What’s all the rage?

One of the most popular mobile game devices ever, now with a camera, better wireless capabilities, and a larger display – all packaged under $200?

Sounds like a killer augmented reality device to me.

Game devices such as the DS made it to the #6 spot on my “10 Best Augmented Reality Devices” review:

“But here’s the caveat: PSP and the DS need to be complemented with accessories such as camera, as well as accelerometers, positioning and ubiquitous connectivity capabilities – to be able to play in this game.”

Well, Nintendo is racing toward the fifth position (MIDs), leaving Sony in the rear-view mirror.

Nikkei does point out that the DS camera function could be integrated with gameplay by allowing games to use photos taken with the hardware.

Here’s your confirmation. Augmented Reality applications can be built for the next DS. Who’s going to take up the challenge?

On a related note, Gizmondo is serious about making a comeback; like the phoenix, it’s rising from the ashes and promises to hit stores this winter.

It’s going to be a hot winter.

To BeAR or not to BeAR?

Thomas Baekdal put together a nice post about PlayStation Eye Camera Games.

These are the kind of games Diarmid Campbell demonstrated live on stage while delivering the unforgettable keynote at ISMAR 2008 last week.

Purists will rant that it’s not augmented reality: if a game takes place on screen, the fact that you interact with it using hand movements (even when you see yourself in the background) doesn’t qualify it as AR. It’s still “just” a camera game.

Yeah, I know – but if the experience is captivating and survives the novelty stage, does it matter what you call it?

The best kind of fun is when the technology becomes invisible.

What would you call these games?

Total Immersion Is Open to Seduce You with Augmented Reality

Bruno Uzzan from Total Immersion sent me this invitation and I thought you might be interested as well.

He is inviting you, avid augmented reality fans, to visit their new offices and have a treat: try some cool augmented reality installations such as transforming yourself into the Joker or a hip-hopping break dancer, virtually driving a luxury sedan, holding a beating heart – and he promises even more…

Whether you go or not – you will probably enjoy this clip…

WHEN: September 26th, 2008 from 3-7pm

WHERE: 900 Wilshire Blvd. Suite 255 Los Angeles, CA 90036

WHO:  Media & Event Planners, Brand & PR Strategists, Creative, Marketing & Industry Professionals

RSVP TO: RSVP@t-immersion.com by September 19th, 2008

For more information, please contact Kris Woods at 323.617.4843

If you happen to stop by – tell Bruno I invited you, and ask him how come he missed ISMAR ’08…

Live From ISMAR ’08: Augmented Reality Demo Round Up

ISMAR ’08 is the epicenter of the world’s best augmented reality demos. Here are the audience favorite picks:

The most beautiful demo

Markerless Magic Books

Created by the only artist at ISMAR ’08…
(on the left menu bar click Interaction/Haunted Book)

Our demonstration shows two artworks that rely on recent Computer Vision and Augmented Reality techniques to animate the illustrations of poetry books. Because we don’t need markers, we can achieve seamless integration of real and virtual elements to create the desired atmosphere. The visualization is done on a computer screen to avoid cumbersome Head-Mounted Displays. The camera is hidden in a desk lamp to further ease the spectator’s immersion. Our work is the result of a collaboration between an artist and Computer Vision researchers. It shows beautiful and poetic augmented reality. It is further described in our paper ‘The Haunted House’.

Camille Scherrer, Julien Pilet, Vincent Lepetit (EPFL)

The most invisible demo

Sensor-fusion Based Augmented Reality with off the Shelf Mobile Phone

OK, you see a Scandinavian guy standing in the middle of the yard with a cell phone held high in his hand. What’s the big deal? Exactly!

We demonstrate mobile augmented reality applications running on the newly released Nokia 6210 Navigator mobile phone. The device features an embedded 3D compass, 3D accelerometer, and assisted GPS unit – the fundamental ingredients for sensor-based pose estimation, in addition to smart-phone standards: forwards-pointing camera, high-resolution displays and internet connection. In our applications sensor based pose estimation is enhanced with computer vision methods and positioning error minimization techniques. Also the user interface solutions are designed to try to convey the relative uncertainty of the pose estimate to the user in intuitive ways.

Markus Kähäri, David J. Murphy (Nokia Research Center)
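The fusion idea here – blending an absolute but noisy sensor (the compass) with a smooth but drifting relative signal – is classically handled with a complementary filter. Here is a toy sketch of that general technique; it is not Nokia’s actual pipeline, and the function name, alpha value, and the assumption of a rotation-rate input are all illustrative:

```python
def complementary_filter(compass_headings, rotation_rates, dt, alpha=0.98):
    """Fuse noisy absolute headings (degrees) with a smooth but
    drifting rotation rate (degrees/second).

    alpha near 1 trusts the integrated rate for short-term smoothness;
    the small compass term slowly corrects the accumulated drift.
    """
    fused = compass_headings[0]
    estimates = []
    for compass, rate in zip(compass_headings, rotation_rates):
        fused = alpha * (fused + rate * dt) + (1 - alpha) * compass
        estimates.append(fused)
    return estimates
```

With a steady compass reading and zero rate the estimate stays put; starting from a biased estimate, it converges toward the compass heading at a speed set by alpha.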

The most 90’s demo

See-Through Vision for Mobile Outdoor Augmented Reality

(compare to the previous demo)

We have developed a system built on our mobile Augmented Reality platform that provides users with see-through vision, allowing visualization of occluded objects textured with real-time video information. The demo participants will be able to wear our lightweight, belt-mounted wearable computer and head mounted display. The display will render hidden locations captured from the University of South Australia. These locations consist of 3D models of buildings and courtyard areas that are textured with pre-recorded video images. The system includes a collection of visualizations and tools that assist with viewing these occluded real-world locations; e.g. digital zoom and texture highlighting.

Benjamin Avery, Bruce H. Thomas, Wayne Piekarski, Christian Sandor  (University of South Australia)

The most playful mixed-reality game demo

Mobile Phone Augmented Reality

In our demo booth we will show a compilation of recent developments created by the Handheld AR group at Graz University of Technology and Imagination Computer Services. None of these demos has been shown before at a scientific conference, making it a unique experience for every ISMAR attendee. All our demos are hands-on: during our demos we will hand out devices and let people experience our applications.

Daniel Wagner, Alessandro Mulloni, Tobias Langlotz  (TU Graz), Istvan Barakonyi (Imagination),  Dieter Schmalstieg (TU Graz)

The most crowded demo

Superimposing Dynamic Range

In a dark corner room the size of a closet, about 150 people are gathering around an artifact from the future….

We present a simple and low-cost method of superimposing high dynamic range visualizations on arbitrary reflective media, such as photographs, radiological paper prints, electronic paper, or even reflective three-dimensional items. Our technique is based on a secondary modulation of projected light when being surface reflected. This allows boosting contrast, perceivable tonal resolution, and color saturation beyond the possibility of projectors, or the capability of spatially uniform environment light when illuminating such media. It holds application potential for a variety of domains, such as radiology, astronomy, optical microscopy, conservation and restoration of historic art, modern art and entertainment installations.

Oliver Bimber (Bauhaus-University Weimar), Daisuke Iwai (Osaka University)
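The core trick – modulating the projected light per pixel so that, after reflecting off the printed image, it reproduces a higher-contrast target – can be pictured in a few lines. This is a toy illustration of the principle only, not the authors’ implementation; the names and the clamping behavior are my assumptions:

```python
def modulate_projection(target, reflectance, max_projector=1.0):
    """Per-pixel projector intensities for light reflected off a print.

    The surface reflects projected * reflectance, so to show `target`
    we project target / reflectance, clamped to the projector's range
    (dark regions on the print limit how far contrast can be boosted).
    """
    return [min(t / max(r, 1e-6), max_projector)
            for t, r in zip(target, reflectance)]
```

The reflected brightness is then `projected * reflectance`, which matches the target wherever the projector has enough headroom.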

The most iTouchy demo

Multimodal Mobile Augmented Reality on the iPhone

How do you spell ARToolKit in iPhonese? (Hype is a beautiful thing)

In this demonstration we show how the Apple iPhone can be used as a platform for interesting mobile phone based AR applications, especially because of its support for multimodal input. We have ported a version of the ARToolKit library to the iPhone and customized it for the unique input capabilities of this platform. The demo shows multimarker-based tracking, virtual object rendering and AR overlay, gesture-based interaction with shared virtual content, and accelerometer input. This demonstration shows some of the possibilities of AR when there is no hardware to configure, no interface to learn, and the interaction is natural and intuitive.

Philip Lamb (ARToolworks)

The most down-under demo

An Augmented Reality Weather System

You have to live down-under to conceive a machine that simulates bad weather…brilliant!

This demo presents ARWeather, a simulation application, which can simulate three types of precipitation: rain, snow, and hail. Our goal is to fully immerse the user in the simulated weather by multimodal rendering of audio and graphics, while preserving autonomous and free movement of the user. Therefore, ARWeather was developed and deployed on the Tinmith wearable computer system. Software highlights of this demo include: GPU-accelerated particle systems and video processing, spatial audio with OpenAL, and physics-based interaction of particles with the environment (e.g., hail bounces off the ground).

Marko Heinrich (U. Koblenz-Landau), Bruce H. Thomas (U. South Australia), Stefan Mueller (U.Koblenz-Landau), Christian Sandor (U. South Australia)
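The physics-based interaction mentioned at the end (hail bouncing off the ground) boils down to a standard particle update with a damped bounce. A minimal sketch of that idea – the gravity value, restitution coefficient, and ground plane at y = 0 are my assumptions, not details from the demo:

```python
def step_particles(particles, dt=0.1, gravity=-9.8, restitution=0.5):
    """Advance (height, vertical_velocity) hail particles one time step.

    Particles that fall through the ground plane (y = 0) bounce back up
    with their velocity damped by the restitution coefficient.
    """
    updated = []
    for y, vy in particles:
        vy += gravity * dt            # gravity accelerates the particle down
        y += vy * dt                  # integrate position
        if y < 0:                     # hit the ground: damped bounce
            y, vy = 0.0, -vy * restitution
        updated.append((y, vy))
    return updated
```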

The most highbrow demo

AR Museum Presentation Room

I never would have learned about this ancient plate’s history had AR not been invented. A classic.

The artwork to which the augmented reality technology is applied is a plate produced by a technique called metallic lustre. Around the exhibited real artwork, information is provided by multimedia tools, offering the visitor various approaches to the artwork. Adding information with augmented reality is intuitive and offers an illustration of something that cannot be seen by the naked eye, without turning the visitor’s eyes away from the real artwork. The system is currently in use at the Louvre – DNP Museum Lab (LDML), Tokyo, Japan.

T. Miyashita (Dai Nippon Printing), P. Meier (metaio), S. Orlic (Musée du Louvre), T. Eble, V. Scholz,  A. Gapel, O. Gerl, S. Arnaudov,  S. Lieberknecht (metaio)

The “I am waaaay ahead of you” demo

Mapping large environments using multiple maps for wearable augmented reality

video of last year's demo

A demonstration of a wearable robotic system that uses an extended version of the parallel tracking and mapping system by Klein and Murray from ISMAR 2007. This extended version allows multiple independent cameras to be used to build a map in unison, and to also create multiple independent maps around an environment. The user can explore an environment in a natural way, acquiring local maps in real-time. When revisiting those areas the system will select the correct local map and continue tracking and structural acquisition, while the user views relevant AR constructs registered to that map.

Robert Castle, Georg Klein & David W. Murray (University of Oxford)
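The “select the correct local map when revisiting” step can be pictured as nearest-neighbour matching of the current frame against a stored keyframe per map. This toy sketch scores tiny descriptors by sum of absolute differences; the real system’s relocalisation is far more sophisticated, and every name here is illustrative:

```python
def select_map(frame_descriptor, maps):
    """Pick the local map whose stored keyframe descriptor best matches
    the current camera frame (smallest sum of absolute differences)."""
    def sad(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(maps, key=lambda m: sad(m["keyframe"], frame_descriptor))
```

Once a map wins, tracking continues against that map’s structure, and the others stay dormant until the user wanders back into them.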

Additional great demos weren’t included, due to lack of space in this post and lack of sleep on the author’s part…

Live from ISMAR ’08: Awards, Winners, and Wrap up of the World’s Best Augmented Reality Event

They say every good thing must come to an end… and this event is no exception; ISMAR ’08, the world’s most important augmented reality event, is coming to a close on a high note and with fireworks (augmented, of course).

That’s the part where the event chairs recognize the organizers who made it possible, and thank the keynote speakers, paper submitters, demo exhibitors, poster presenters, competition contenders, and all participants for making it such a memorable event.

Cut to…flashback. It’s last night at King’s College; Ron Azuma is the MC for the best paper award ceremony…

King's College "Cafeteria"

And the honorable mention goes to Georg Klein and David Murray for “Compositing for Small Cameras”… winners of last year’s best paper… “this is excellent work… many other practitioners can use these results.”

Best student paper (tied with best paper): Sheng Liu, Dewen Cheng, Hong Hua, “An Optical See-Through Head Mounted Display with Addressable Focal Planes”… “it’s a breakthrough… this strikes me as a memorable and important step forward for HMD technology.”

Best paper: Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Tom Drummond, Dieter Schmalstieg, “Pose Tracking from Natural Features on Mobile Phones”… “WOW! SIFT and Ferns running in (almost) real-time on mobile phones… the performance is truly impressive and opens the door to amazing future applications.”

Congrats. You guys are the 800-pound gorillas of augmented reality.

Cut to…flash forward. It’s the present back in the Cambridge Engineering Department. The winners of the Tracking Competition are about to be announced by the competition team:

Tracking Competition setup

We defined the setup in a large room in the department, with reference points and coordinates, and installed 8 different stations with many different objects in them. We made it really hard on the competitors. We gave them time to prepare; they got the coordinates of 16 items which they had to pick using their AR tracking technology.

We started with 5 contenders: Metaio (Tobias Eble), Fraunhofer (Harald Whuest), University of Bristol (Sudeep Sundaram), Millennium 3 Engineering (Mark Fiala), and University of Oxford (Georg Klein). Mark Fiala unfortunately had to drop out due to lack of sufficient preparation time. Bristol thought the room was missing some features…

And here are the results: in second place came Metaio, with 15 items picked in a little more than 10 minutes… and in first place [the audience favorite] Georg Klein, who picked all 16 items in a record time of 8:48!

Hallelujah!

Georg (Le magnifique) will return to Oxford with an extra 1000 pounds in his pocket. And he’s humble and gracious:

Thanks to Robert Castle for providing the method (parallel tracking and mapping), which I adapted this morning – it was perfectly suited to the task.

And for those who wonder what kind of bug drove me to write more than 10,000 words in 17 posts within 4 days – I have one word for you: passion… plus the amazing support I got from ISMAR attendees and chairs, and mostly from you guys: the avid AR fans out there who weren’t as fortunate and couldn’t attend the event this year. THANK YOU!

They also say it’s never over till the fat lady sings… and in this case Christopher Stapleton plays the role: he’s the last to come up on stage, and his deep voice vibrates across the walls of the auditorium as he shouts into the mic:

ISMAR 2009 Experience starts right now!

If you want to be part of it, help or support it – just send a note to christopher@stapleton.net

Fade out. Credits. Slideshow. We’re outta here.

Live from ISMAR ’08 : Perfecting Augmented Reality

The last session of ISMAR ’08 is about to begin; it concentrates on perfecting rendering and scene acquisition in augmented reality and making it even more realistic.

First on stage is Yusaku Nishina, with a challenging talk: Photometric registration by adaptive high dynamic range image generation for augmented reality.

His goal: the development of photorealistic augmented reality using a High Dynamic Range (HDR) image.

Estimating the lighting environment around virtual objects is difficult with low dynamic range cameras. To overcome this problem, they propose a method that estimates the lighting environment from an HDR image and renders virtual objects using an HDR environment map. Virtual objects are overlaid in real time by adjusting the dynamic range of the rendered image with tone mapping according to the camera’s exposure time. The HDR image itself is generated from multiple images captured with various exposure times.

Now you are ready to watch the resulting effect. Incredible.

[youtube=http://www.youtube.com/v/M53Tqqdk9w0]
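The pipeline described above – merge differently exposed captures into a radiance map, then tone-map according to the current exposure – can be sketched as follows. This is a toy grayscale version under simplifying assumptions (linear camera response, a simple hat weighting, a Reinhard-style operator), not the authors’ method:

```python
def merge_exposures(images, exposure_times):
    """Merge LDR captures (pixel lists, values 0-255) into a radiance map.

    Each pixel is divided by its exposure time and averaged, weighting
    mid-range values most since they are the best exposed (hat weight).
    """
    def weight(z):
        return (z if z <= 127 else 255 - z) + 1e-6
    radiance = []
    for i in range(len(images[0])):
        num = sum(weight(img[i]) * img[i] / t
                  for img, t in zip(images, exposure_times))
        den = sum(weight(img[i]) for img in images)
        radiance.append(num / den)
    return radiance

def tone_map(radiance, exposure):
    """Reinhard-style global operator: compress to a displayable 0-255 range."""
    return [255.0 * (exposure * v) / (1.0 + exposure * v) for v in radiance]
```

A pixel captured at value 100 with exposure 1 and at 200 with exposure 2 is consistent with a radiance of 100, and the tone mapper then squeezes any radiance into the displayable range.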

~~~

Next on stage is the soon-to-be-hero-of-the-show Georg Klein (more on that later…) with Compositing for Small Cameras.

The problem: blending virtual items into real scenes captured by small cameras. Video from such cameras tends to be imperfect (blurring, oversaturation, radial distortion, etc.), so when you superimpose a pristine virtual item it tends to stick out in a bad way. Since we can’t improve the live video, the trick is to adapt the virtual item to match the video at hand. Simply put, Georg samples the background and applies matching blur, radial distortion, rotation, color saturation, etc. to the rendered image – and he does it in 5 milliseconds on a desktop… For details check the pdf paper; take a look for yourself and tell me if it works on Kartman:

Done! Georg is already working on the next challenge.
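The “degrade the virtual render to match the video” idea can be illustrated with its simplest ingredient: a blur pass over the rendered content before compositing. Here is a 1-D box-blur toy standing in for the paper’s estimated camera blur; in a real system the radius would come from analysing the live video, and this sketch is mine, not Georg’s code:

```python
def box_blur(rendered, radius):
    """Soften a crisply rendered signal so composited virtual content
    does not look unnaturally sharp next to blurry camera video."""
    blurred = []
    n = len(rendered)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        blurred.append(sum(rendered[lo:hi]) / (hi - lo))
    return blurred
```

A hard edge in the virtual render comes out as a gradual ramp, which is exactly what makes it sit convincingly in soft video.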

~~~

Next, Pished Bunnun introduces his work: OutlinAR: an assisted interactive model building system with reduced computational effort

Building 3D models interactively and in place (in situ), using a single camera and low computational effort – with a makeshift joystick (button and wheels).

In this case the video does a better job at explaining the concept than any number of words would…

Pished demonstrates that it’s fast and pretty robust. Judge for yourself.

If you absolutely need more words about this – start here.

The team’s next challenge: modeling curved lines…

~~~

In the very last talk of the event, Jason Wither courageously takes on another challenge in perfecting augmented reality with his talk: Fast Annotation and Automatic Model Construction with a Single-Point Laser Range Finder

Jason is using a laser range finder of the kind typically used by hunters (though he will not be shooting anything or anybody), mounted on the head or handheld, in conjunction with a parallel camera. First he wants to create an annotation – that part is totally trivial. But you can then orient the annotation according to a building, for example.

Next, he corrects the occlusion of virtual objects by real objects for improved augmented realism. Just click before and after the object and presto:

Finally, he creates a 3D model of an urban environment semi-automatically, by building a depth map courtesy of the laser. To achieve that he’s using a fusion process. You’ve got to see that video; the laser’s red line advancing across buildings reminds me of the blob swallowing the city in that quirky Steve McQueen movie.

In conclusion, this is a really low-cost and fast approach to modeling and annotating urban environments and objects. That capability will become extremely handy once Augmented Reality 2.0 picks up and everyone wants to annotate the environment (aka draw graffiti without breaking the law).

Next is the event wrap up and the results of the Tracking Competition. Stay tuned.

====================

From the ISMAR ’08 program:

Rendering and Scene Acquisition

  • Photometric registration by adaptive high dynamic range image generation for augmented reality
    Yusaku Nishina, Bunyo Okumura, Masayuki Kanbara, Naokazu Yokoya
  • Compositing for Small Cameras (pdf paper)
    Georg Klein, David Murray
  • OutlinAR: an assisted interactive model building system with reduced computational effort
    Pished Bunnun, Walterio Mayol-Cuevas
  • Fast Annotation and Automatic Model Construction with a Single-Point Laser Range Finder
    Jason Wither, Chris Coffin, Jonathan Ventura, Tobias Hollerer

Live from ISMAR ’08: Is Augmented Reality at Work Better than Reality Itself?

Bruce Thomas introduces the afternoon session at ISMAR ’08 focusing on user studies in industrial augmented reality.

First is Johannes Tuemler, who will talk about Mobile Augmented Reality in Industrial Applications: Approaches for Solution of User-Related Issues.

The study looks at psychological and ergonomic factors in augmented reality usage and creates a requirements catalog for mobile AR assistance systems in diverse scenarios. This was a collaboration with Volkswagen, the ergonomics department in Ott-von-Wolfsburg, perception psychology from Weimar University, and information technology from the Fraunhofer Institute.

The reference scenario chosen was “AR picking”, where subjects would work for a couple of hours picking items from shelves using a mobile AR device. The users reported no rise in stress levels with the AR system compared with working without AR (except for some visual discomfort). Since the AR system used was less than optimal, the research may suggest that with a better AR system, workers’ stress levels – compared with working without AR – could actually be reduced!

~~~

As a direct follow-up to the first study, Bjoern Schwerdtfeger comes on stage to describe the results of his work on supporting order picking with AR.

Traditionally the process relies on a printout with instructions on which items to pick from bins on shelves.

How can an AR system help improve the performance of such an activity?

Glasstron by Nomad

They looked at multiple visualization options: frame tunnel, rings tunnel, and 3D arrow.

The results showed that the frame visualization was more efficient than the arrow. It’s not clear whether the rings visualization is superior.

~~~

Final speaker for this session is Gerhard Schall from Graz University to discuss Virtual Redlining for Civil Engineering in Real Environments.

What is virtual redlining? Virtually annotating paper maps or 2D digital information systems (mostly for the utility sector). This process significantly helps the workflows associated with network planning and inspection.

The process involves mapping 2D geographical data onto 3D models of buildings and underground infrastructure. The tool developed allows for collaboration, inspection, and annotation.

The results of the usage study confirm that the AR system has a significant advantage in civil engineering – at least in this redlining scenario. Color coding was important, as was the digital terrain model.

Question from the audience: where do you get the 3D modeling of the piping?

Answer: Some utility companies have started to map their underground infrastructure, but in most cases we create it based on 2D maps, which yields only an approximation.

And that concludes the Industrial user studies session. See you next at the last session of the event: Rendering and Scene Acquisition, leading to the grand finale with the award ceremony for the winner of the Tracking Competition.

=============

From ISMAR Program:

User studies in Industrial AR

  • Mobile Augmented Reality in Industrial Applications: Approaches for Solution of User-Related Issues
    Johannes Tuemler, Ruediger Mecke, Michael Schenk, Anke Huckauf, Fabian Doil, Georg Paul, Eberhard A. Pfister, Irina Boeckelmann, Anja Roggentin
  • Supporting Order Picking with AR
    Bjoern Schwerdtfeger, Gudrun Klinker
  • Virtual Redlining for Civil Engineering in Real Environments
    Gerhard Schall, Erick Mendez, Dieter Schmalstieg