Who Should Attend The Augmented Reality Event in Santa Clara, CA June 2nd & 3rd, 2010

Over the last two years we have seen growing interest in Augmented Reality at various events – panels, dev camps, meetups, and many more. In response to the growing demand for knowledge and expertise in augmented reality (AR), a group of AR industry insiders, backed by the AR Consortium, has put together the first commercial event dedicated to advancing the business of augmented reality.

How is are2010 different from ISMAR…

…previously touted here as the “World’s best Augmented Reality event”?

Well, ISMAR is still the best AR event for the scientific community. If you want to learn about (or present) the latest advancements in AR research – you should be in Seoul this October for ISMAR 2010. However, for the rest of us, who wish to take advantage of AR in practice, in the commercial world, and build a business around it – there was a gaping hole.

That is, until now.

Meet the Augmented Reality Event.

Who’s this event for?

For established and startup AR companies –

For established and startup AR companies (such as Total Immersion, Metaio, Acrossair, Ogmento, Circ.us, Mobilizy, Layar, Zugara, Neogence, whurleyvision, Chaotic Moon Studios, and many more) – are2010 is a stage to showcase their products and services; a venue to form partnerships, learn about the latest innovations, and, most importantly, speak with clients. Bruno Uzzan, CEO of Total Immersion, will wow the audience with a cutting-edge augmented reality show; Peter Meier, CTO of Metaio, will speak about his company’s latest products. Early-stage startups and individual developers will receive guidance from Cole Van Nice (Chart Venture Partners) on how to build a successful company in the AR space, including raising funding (from VCs that actually invest in AR), licensing technology and IP, legal aspects, forging partnerships, etc. Christine Perey will speak about the scope of the mobile AR industry today and its growth trajectory.

For Developers –

For developers, are2010 is a window into the latest AR algorithms, engines, and programming tools. Learn from case studies and post-mortems delivered by experienced developers from the leading companies in the space. Blair MacIntyre, director of the GVU Center’s Augmented Environments Lab at Georgia Tech, will speak about his experience with tools and technologies while developing augmented reality games. Daniel Wagner, one of the leading mobile AR researchers in the world, will bring developers into the wonderful world of mobile AR. Patrick O’Shaughnessey, who has led the development of more webcam-based AR campaigns than anyone else I know, will share his knowledge of what works and what doesn’t. Mike Liebhold, Distinguished Fellow at the Institute for the Future, will speak about the technology foundations of an Open AR Web. Gene Becker, co-founder of AR DevCamp, will dive into augmented reality and ubiquitous computing, and Sean White, a pioneer in green tech AR, will suggest concrete examples of how AR can help save the planet.

For Mobile, Hardware, and Platform Companies –

For Mobile, Hardware, and Platform companies (such as Vuzix, Nokia, Qualcomm, Intel, QderoPateo, Microsoft, Google, Apple, etc.), are2010 offers a captive audience for launching and showcasing their latest devices, processors, AR glasses, sensors, and more. The best collective minds of the AR commercial world will be onsite to articulate the market demand characteristics and help influence the design of future hardware.

For Clients and Agencies –

For clients and agencies in entertainment, media, publishing, education, healthcare, government, tourism, and many more – are2010 offers everything you need to know about AR: how to leverage augmented reality to advance your brand, how to attract and keep your customers, and how to build successful campaigns and products that will delight users, including postmortems of landmark augmented reality projects.

Jarrell Pair, CTO and a founder of LP33.tv, will speak about “Augmented Reality in Music Entertainment: Then and Now”; Brian Selzer, co-founder and President of Ogmento, will deliver a crash course for clients and agencies on how to leverage AR in marketing campaigns. Marshall Kirkpatrick, lead blogger for ReadWriteWeb, will share the results of his AR survey collecting feedback from dozens of AR developers on their experience delivering AR campaigns and apps. Kent Demain, designer of the visual effects in Minority Report, will open our minds with the talk “Taking Hollywood visual effects spectacle out of the theatre and into your world”. And of course…

For any AR Enthusiast –

Are you an AR enthusiast? If so, you’re going to feel like a kid in a candy store at ARE, with a soon-to-be-unforgettable keynote by Bruce Sterling, a demo gallery, exhibitors from leading companies, installations from AR artists such as Eric Gradman and Helen Papagiannis, and many more surprises.

If you are into Augmented Reality – are2010 is the one event you should attend this year.

Want to join the event? Early registration is now open!

Open Letter to Apple: Let us Augment Reality with the iPhone!

A letter sent to Apple Developer Relations.

Dear Apple,

We are a collection of augmented reality (AR) enthusiasts and professionals (from business and academia), who have been working on a multitude of AR apps for the iPhone. These apps are poised to change the way people interact with the real world.

But here is the rub: we are currently unable to publish these apps on the app store because the iPhone SDK lacks public APIs for manipulating live video.

We are asking Apple to provide a public API to access live video in real time, on the iPhone.
We will be happy to offer additional technical details.

The impact of augmented reality (AR) on our lives could be as significant as the introduction of the PC.
In 10 years, we believe augmented reality will change the way everyone experiences travel, design, training, personal productivity, health care, entertainment, games, art, and advertising (videos).

Looking back just a few years, AR pioneers had to hack a slew of components into ridiculously large backpacks and HUDs, and be confined to rigged environments. Nowadays, it comes in friendly, affordable packages and the iPhone is one of the first devices to have it all – except for a public API.

The battle to determine the winning device has already begun; a public API to access live video will give the iPhone a lucrative ticket to compete.
We believe Apple has a window of opportunity of about 3 months before developers start looking elsewhere. If Apple publishes the API within that time frame, then over the next 10 years everyone might be using the iPhone as the preferred device to interact with the real world.

Here is how augmented reality could open up new opportunities for the iPhone this year:

Arf (Georgia Tech)

a virtual pet you take anywhere

ARghhhh (Georgia Tech)

first person table-top action game

Sekai Camera (Tonchidot)

AirTag the real world

Kweekies (int13)

a portal to creatures in a parallel world

Layar (SPRXmobile)

Browse the world with an AR browser

Artoolkit for the iPhone (Artoolworks)

the most popular AR kit now on the iPhone

StudierStube ES (Imagination, Graz TU)

the only AR engine designed for mobile devices, now on iPhone

PTAM on the iPhone (Oxford University)

next generation AR tracking with no markers or images

Wikitude (Mobilizy)

a travel guide that “tells you what you see”

Virtual Santa (Metaio)

interactive Christmas application using augmented reality

Augmented Reality Sightseeing (Fraunhofer IGD)

Historic photographs overlaid on your field of view while strolling in a street

These are apps that are practically ready to go. There are many more apps and games just waiting for the API to become available.

…And Apple, we know you can’t share your plans…so please surprise us soon!

Many many thanks for your consideration –
Sincerely,

Signed:
Michael Gervautz – Managing Director Imagination GesmbH
Robert Rice – CEO Neogence
Georg Klein – PhD PTAM creator from Oxford University
Stephane Cocquereaumont –  President & Lead Developer Int13 (Kweekies)
Maarten Lens-FitzGerald – Founder & Partner SPRXmobile, developer of Layar
Ori Inbar – Author of GamesAlfresco.com and CEO and founder – Ogmento (formerly Pookatak Games)
Philippe Breuss – Lead developer, Mobilizy
Philip R. Lamb – CTO, Artoolworks
Noora Guldemond – Metaio
Takahito Iguchi – CEO, Tonchidot
Blair MacIntyre – Associate Professor, Georgia Institute of Technology
Bruno Uzzan – CEO, Total Immersion
Michael Zoellner – Fraunhofer IGD
Andrea Carignano – CEO, Seac02

If you are developing an AR app for the iPhone and wish to join this effort – just let us know.

Blair MacIntyre on UgoTrade

Tish Shute continues with her enlightening series of interviews on UgoTrade. After previously interviewing Ori Inbar and Robert Rice, Blair MacIntyre was a natural choice.
MacIntyre discusses his work at Georgia Tech (which I briefly wrote about here), and shares his perspective on future directions for mobile augmented reality.

A lot of folks think it will be tourist applications where there’s models of Times Square and models of Central Park and models of Notre Dame and the big square around that area in Paris and along the river and so on, or the models of Italian and Greek history sites – the virtual Rome. As those things start happening and people start building onto the edges, and when Microsoft Photosynth and similar technologies become more pervasive, you can start building the models of the world in a semi-automated way from photographs and more structured, intentional drive-bys and so on. So I think it’ll just sort of happen. And as long as there’s a way to have the equivalent of Mosaic for AR, the original open source web browser, that allows you to aggregate all these things. It’s not going to be a Wikitude. It’s not going to be this thing that lets you get a certain kind of data from a specific source; rather it’s the browser that allows you to link through into these data sources.

Read it all over here (and check some of the interesting links featured in the interview).
Curiously enough, a video of one of the games mentioned in the article, “Art of Defense”, was uploaded to YouTube today. It’s interesting research into how people interact when playing a collaborative AR game (see Bragfish for similar research with a competitive game):

GDC 2009: More Augmented Reality Demos at Game Developer Conference

Reporting live from GDC 2009 in San Francisco: it’s just getting better!

From Blair’s team at GA Tech:

Zombie Attack on Nvidia Tegra

From Beyond Reality at the Dutch pavilion:

Pit Strategy

Stay tuned for more…

ISMAR 2009: The World’s Best Augmented Reality Event Wants You to Contribute!

This post is directed at you. Yes – you: the augmented reality aficionado.

For those who followed my coverage of ISMAR 2008 – prepare to be blown away by ISMAR 2009.

If you are an AR researcher – ISMAR 2009 is as always the best event to learn about the latest and greatest in AR technology.

If you work in the AR industry – congratulations! Unlike previous years – at ISMAR 2009 you’ll see AR breaking out into commercial success.

If you work in interactive entertainment – come to ISMAR 2009 to experience the phenomenon that will revolutionize interactive entertainment forever.

If you are an artist – at ISMAR 2009 you’ll have an opportunity to join this emerging industry and change the way people experience the world, literally.

Interested? Good.

Because all of the above will only transpire – if YOU contribute.

ISMAR 2009 is now officially calling for proposals.

Here are excerpts from the call for proposals. For details check out the official call.

The veteran Science and Technology track will be complemented this year with new Arts, Media and Humanities tracks. ISMAR 2009 will introduce expanded Tutorials, Workshops, Demonstrations and Competitions.

Topics of the Technical Track:

Sensing – Tracking technologies, calibration methods, sensor fusion, vision-based registration and tracking, acquisition of 3D scene descriptions
Information presentation – Object overlay and spatial layout techniques, handling of occlusions or x-ray vision, photorealistic augmentation, real-time augmentation, optical display technologies (HWDs, HMDs, HUDs, mobile projectors), aural or haptic augmentation, combined presentation across several displays (combining mobile and stationary devices), display and view management
User interaction – Interaction techniques and metaphors for MR/AR, collaborative MR/AR, multimodal input and output, tangible interaction, combined interaction with virtual and real objects
Human factors – Usability studies and experiments of MR/AR-based interaction and presentation concepts, acceptance of MR/AR technology, social implications
System architecture – Wearable and mobile computing, distributed and collaborative MR/AR, display hardware, performance issues (real-time approaches), embedded computing for MR/AR, integration of MR/AR technologies into wide-area pervasive computing environments
MR/AR applications – across all areas of personal and professional activities, such as: Personal MR/AR information systems, games, applications in industry, military, medicine, science, entertainment, architecture, tourism, art, cultural heritage, education, training etc.

Topics of the Arts, Media & Humanities Tracks:

Compelling applications of Mixed & Augmented Reality. Applications include artistic expression, experiential-media or interpretive pieces that reflect the study of the human condition.
• Art – The Art program is looking for notable artists who have stretched the boundaries of expression with the use of Mixed and Augmented Reality.  Creative interactions between real, virtual, and imaginary realities that create provoking experiences are highly encouraged.  Written papers and posters are to include a position statement with notes and images on approach, implementation, and the technology used.  A gallery show will be mounted to support physical entries (see call for demonstrations).
• Media – submissions from media practitioners who have stretched the boundaries of the creative impact of Mixed and Augmented Reality.  This venue seeks innovative uses of creative techniques for communication and entertainment to enhance the experience of MR/AR through novel applications of head-mounted, embedded projection, or mobile displays.  Submissions may include new tools, conventions, or taxonomies for developing these new media.
• Humanities – academic submissions that relate to Mixed and Augmented Reality content and that allow for innovative analytical, critical, or speculative approaches to reflect the study of the human condition and digital media.

Call for Innovation Workshops

ISMAR 2009 will have a series of workshops the day before the conference (Monday) to cover the innovative application of Mixed and Augmented Reality to specific industry domains.  Three workshops have been defined.  We invite you to submit papers or panel discussions that will address topics of transferring MR/AR to solve critical real world problems (see web site for details).  If your paper or panel does not fit the existing workshop, you may submit a proposal for a new workshop.
Designing the Future (Design and Manufacturing Workshop):  This workshop will continue the ISMAR legacy of showcasing the pioneering efforts of the auto and other industries’ use of Mixed and Augmented Reality as design tools.
Falling in Love with Learning (Entertainment & Education Workshop):  Entertainment and Education converge in Mixed Reality Experiential Learning Landscapes for museums, libraries, schools and parks.
Transforming Lives (Medical and Military Training Workshop):  Extreme Mixed Reality needs to meet high-risk, high-performance training to enhance human preparation for life and death scenarios.

Call for Pioneering Tutorials

To mark ten years of ISMAR, the 2009 conference will include a three-day comprehensive tutorial program that covers a wide spectrum of topics in Mixed and Augmented Reality.  We are looking for submissions from pioneers to share their experiences and insights.  Formats can be from 30 to 90 minutes.  These tutorials will be video captured and distributed as a series along with special features covering the work of pioneering laboratories worldwide.

Call for Demonstrations

There will be four formats to submit demonstrations.  Demonstrations can be related to papers, posters or panels, but that is not required.  A proposed abstract, floor plan and list of requirements are requested with the letter of intent (see websites for details).  Accepted Participants will receive notification by June 30th, 2009 to start coordination with the ISMAR 2009 Planning Committee.  To accommodate late breaking discoveries, we will accept “Laboratory Demonstrations” until the last minute, pending committee discretion and conference  accommodations (submissions received after August 15th, 2009 will not be included in the publications). Letters of intent are recommended for all submissions to assist in the planning.
Laboratory Demonstrations will provide the opportunity for “late breaking” research teams to informally demonstrate their latest inventions and allow for interaction with attendees and other pioneers.
Research Showcase will be a more formal presentation of innovative Mixed and Augmented Reality content that involve more production support and exhibit design considerations.
Art Gallery will present innovative Mixed and Augmented Reality artwork within a unique gallery format based on a combination of invited and submitted work.
Innovation Exhibitions (See industry/Sponsor relations) will feature the latest commercially available products and services for use in Mixed and Augmented Reality applications.  The exhibition will be available for rental to industry buyers from the entertainment, medical, military and educational markets.  A special “Start-up Park” will be available for small, first time commercial exhibitors at more affordable prices.  Early registration is recommended for the expected increased participation and limited space.

Tracking Competition

The first event of this kind at ISMAR 08 (http://ismar08.org/wiki/doku.php?id=program-competition) caught much attention. A sequel will be organized at ISMAR 2009. Details regarding the tracking task and the rules of competition will be made available on the web site. It is to be expected that only a limited number of teams can participate.

***********************

Don’t be wary.

Even if you do not have a rigorous research paper – you still have an important role in ISMAR 2009.

If you have developed a cool AR tool, app or game, or –

if you have conceptualized an interesting AR idea, or even-

if you just had a vision of the killer AR app  – we want to hear from you!

Don’t wait for the May 16th deadline – you can’t afford to. Submit your proposal today.

**********************

Send your proposals to:

Gudrun Klinker, Blair MacIntyre and Hideo Saito

Science and Technology Program Chairs (Science@ismar09.org)

Blair MacIntyre

Art and Humanities Program Co-Chair (Humanities@ismar09.org, Art@ismar09.org)

Jay Bolter

Humanities Program Chair (Humanities@ismar09.org)

Jarrell Pair

Media Program Chair (Media@ismar09.org)

Charlie Hughes

Tutorial Chair (tutorials@ismar09.org)

Christian Sandor

Laboratory Demonstration Chair (Demos@ismar09.org)

Sean White,

Research Showcase Chair (Showcase@ismar09.org)

Larry Davis

Innovation Exhibition Chair (Exhibits@ismar09.org)

Christopher Stapleton

Interim Workshop Chair (Workshops@ismar09.org)

Daniel Pustka

Tracking Contest Chair (Tracking@ismar09.org)

We want to hear from you!

As a member of ISMAR’s Media track committee – I hereby vow to emphatically review every single proposal in the media track. Your voice must be heard.

The Making of ARf: Me, My Dog and iPhone

Blair MacIntyre sent me a nice proof of concept of an augmented reality virtual pet running on an iPhone.

So I thought, why not write about “the making of ARf”?

I shot a couple of questions to Blair and he conveniently turned them into a well-structured interview. Thanks Blair!

Here it is for your edutainment.

games alfresco: Hey Blair, I’d like to write about ARf in my blog.

Blair: Great! :)

games alfresco: Is there anything beyond the video that I could share?

Blair: We (my student Kimberly Spreen, really) did this relatively quickly.  She figured out how to get video [on an iPhone], and we’d been thinking about doing a virtual pet game for quite a while, so we decided to implement some of the ideas to test out the iPhone.

games alfresco: Could you share a description of the current features?

Blair: Right now, you can interact via the touch screen, and by moving the markers.  Kim did a nice little implementation of multi-marker tracking where you can just add new markers as you feel like and don’t need to preconfigure the multi-marker layout.  You can interact with the dog by touching it (touch its nose and it jumps up to lick, its tail and it chases it, rub its back and it rolls over to let you rub its tummy) or by touching the ground to send it somewhere.  If it gets near its water it drinks, near the other dog it plays, or near a smudge (that you put on the ground by rubbing the ground) it sniffs it (alas, the smudge looks like a little “pile”, which works, but wasn’t the intent).
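(Editor’s aside: for readers wondering how “just add new markers as you feel like” can work without preconfiguring a layout, the usual trick is pose chaining – when a new marker shows up in the same frame as one already in the map, its pose is chained through the current camera pose and stored in a shared world frame. Here is a minimal sketch of that idea in Python; it is not Kim’s code, and the 4×4 poses are assumed to come from whatever fiducial tracker you use.)

```python
import numpy as np

# Map from marker id to its 4x4 pose in the shared world frame.
# The first marker ever seen defines the world origin.
world_from_marker = {}

def update_marker_map(detections):
    """detections: {marker_id: 4x4 camera_from_marker pose} reported by the
    fiducial tracker for the current frame. Returns world_from_camera or None."""
    if not detections:
        return None

    if not world_from_marker:
        # Bootstrap: the first marker seen becomes the world origin.
        world_from_marker[next(iter(detections))] = np.eye(4)

    known = [m for m in detections if m in world_from_marker]
    if not known:
        return None  # no overlap with the map this frame; nothing to chain from

    # world_from_camera = world_from_marker @ inv(camera_from_marker)
    ref = known[0]
    world_from_camera = world_from_marker[ref] @ np.linalg.inv(detections[ref])

    # Any marker not yet in the map gets registered via the same camera pose.
    for m, camera_from_marker in detections.items():
        if m not in world_from_marker:
            world_from_marker[m] = world_from_camera @ camera_from_marker

    return world_from_camera  # the pose you would use to render the dog this frame
```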

games alfresco: Plans for a full game?

Blair: This is a project we’ve been thinking about for a few years, going back to our “Dart the Dog” project that we did in Director.  The goal is to explore what it means to let everyone have a virtual pet they can take with them, and interact with through different interfaces (desktop, handheld, handheld AR, etc).  Most importantly, we want the location (bedroom, living room, work, bus, bar, etc) and activity (sound level, light levels, etc) and presence of other pets to impact how the pet develops.

To handle the development, we are talking to some folks at an AI company, who are creating an engine for doing creature AI based on reinforcement learning.  They hope to have something we can use next year.  If we can get that, we will be able to really have pets that grow, change, evolve, etc.

A few companies who are funding us are interested in this, so I hope we can devote some energy to it next year.  We’ll probably target a few platforms, but obviously the iPhone has a lot of appeal.  From a research perspective, I’m interested in it because there is the potential to release a research game and (with permission of the people who download it, of course) collect a lot of usage data.  Ironically, since the creature AI engine is server-based, I don’t know if we could handle a big success and provide the AI service to everyone who gets the game, but I’ll worry about that if we ever get there.

games alfresco: Can you share more details about the software? Is it a Jailbroken iPhone?

Blair: Official iPhone SDK, unhacked phones.  I have no interest in working with jailbroken phones;  the appeal of the iPhone is the potential for mass distribution to support broad evaluation and feedback.

Obviously, we have hacked the API to get at the camera, so we can’t release this until Apple creates an official API.

We are using StbTracker for tracking.  The rest of the software was written by us.

games alfresco: Cool. Thanks for showing us “under the hood” of ARf.

For a doggie game, the name ARf works nicely in English.

It could get weird when translated into:

  • Spanish – jau, jau
  • Afrikaans – blaf
  • Albanian – ham, ham
  • Arabic – how, how
  • Armenian – haf, haf
  • Basque – zaunk-zaunk
  • Bulgarian – jaff, jaff
  • Catalan – bau, bau
  • Chinese, Cantonese – wow, wow
  • Chinese, Mandarin – wang, wang
  • Croatian – vau, vau
  • Danish – vov, vov
  • Dutch – waf, waf
  • Esperanto – boj, boj
  • French – ouaf, ouaf
  • German – wuff, wuff
  • Greek – ghav, ghav
  • Hebrew – hav, hav
  • Hindi – bho, bho
  • Icelandic – voff, voff
  • Indonesian – guk, guk
  • Irish – amh-amh
  • Japanese – wan, wan
  • Korean – mung, mung
  • Latvian – vau, vau
  • Persian – vogh, vogh
  • Portuguese – béu-béu
  • Russian – gav, gav
  • Serbian – av, av
  • Slovenian – hov, hov
  • Thai – hoang, hoang

Live from ISMAR ’08: The dARk side of Physical Gaming

Welcome to the late evening keynote of the second day of ISMAR ’08 in Cambridge.

The keynote speaker is Diarmid Campbell from Sony Computer Entertainment Europe (London), who heads its research on camera gaming. And we are covering it in real time.

Diarmid comes on stage. The crowd is going crazy…

The talk: Out of the lab and into the living room

What is a camera game? Simply put, you see yourself in the camera with graphics added on top.

The trouble with the brain: it corrects what you see (the classic checkerboard example: a black square in the light has the same measured color as a white square in the shadow, yet we perceive them differently).

Background subtraction is the first thing you try to do. Using this technique, Diarmid superimposes himself, in real time, on top of… the ’70s super band ABBA…
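(Editor’s aside: the ABBA trick is classic background subtraction – grab a shot of the empty scene, then for each live frame keep only the pixels that differ enough from it and paste them over the other footage. A minimal sketch using OpenCV in Python, assuming a webcam and a placeholder clip; it is not Sony’s code.)

```python
import cv2
import numpy as np

# Placeholder sources: a webcam and a pre-recorded clip to composite onto.
cam = cv2.VideoCapture(0)
clip_src = cv2.VideoCapture("background_clip.mp4")   # hypothetical file name

# Capture the "empty" scene once, before the player steps in.
_, background = cam.read()
background = cv2.GaussianBlur(background, (5, 5), 0)

while True:
    ok1, frame = cam.read()
    ok2, clip = clip_src.read()
    if not (ok1 and ok2):
        break
    clip = cv2.resize(clip, (frame.shape[1], frame.shape[0]))

    # Per-pixel difference against the stored background.
    diff = cv2.absdiff(cv2.GaussianBlur(frame, (5, 5), 0), background)
    mask = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(mask, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 7)               # knock out speckle noise

    # Foreground (the player) goes on top of the clip.
    composite = np.where(mask[..., None] > 0, frame, clip)
    cv2.imshow("camera game", composite)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
```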

User interface motion buttons – virtual buttons that the user activates by moving over them. The response is not as robust, but it’s more responsive.

Example of EyeToy Kinetic

Next is a demonstration of vector buttons and optical flow.
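(Editor’s aside: a “vector button” of this kind can be sketched as dense optical flow averaged over the button’s rectangle – fire when the motion is strong enough and, for a directional button, roughly pointing the right way. A hedged sketch using OpenCV’s Farneback flow; the thresholds are made-up values you would tune, and this is not Sony’s implementation.)

```python
import cv2
import numpy as np

def button_activation(prev_gray, gray, rect, direction=None):
    """Return True if there is enough motion inside `rect` (x, y, w, h).
    If `direction` is a unit 2-vector, the motion must also roughly match it."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    x, y, w, h = rect
    region = flow[y:y + h, x:x + w]          # (h, w, 2) flow vectors
    mean_flow = region.reshape(-1, 2).mean(axis=0)
    speed = float(np.linalg.norm(mean_flow))

    if speed < 2.0:                          # tune: pixels of motion per frame
        return False
    if direction is None:                    # plain "motion button"
        return True
    # "Vector button": motion must point (roughly) along `direction`.
    return float(np.dot(mean_flow / speed, direction)) > 0.7
```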

You have to keep the control on the side – otherwise the player’s body will activate it unintentionally.

It turns out Sony decided not to use this control…not just yet.

A similar control was actually published in Creature Adventures, available online. Diarmid struggles with it. The crowd goes wild. Diarmid: “You get the idea…”

Good input device characteristics: Many degrees of freedom, non-abstract (player action=game action), robust and responsive.

Camera games have been accused in the past of lacking depth (being too repetitive). There are two game mechanics: skill-based (shoot the bad guy) and puzzle-based. Either can become shallow – unless you deliver on responsiveness and robustness.

To demonstrate color tracking, Diarmid dives into the next demo (to the delight of the audience…). For this demo he holds two cheerleader pompoms…

“It’s like a Dance Dance Revolution game, so I also have to sing and occasionally shout out party…”

The crowd is on the floor.
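(Editor’s aside: pompom tracking like this usually reduces to thresholding a distinctive color in HSV space and taking the centroid of the largest blob. A minimal sketch follows; the HSV range is a placeholder you would tune to your own props and lighting, not Sony’s numbers.)

```python
import cv2
import numpy as np

# Placeholder HSV range for a bright pink pompom; tune to your prop and lighting.
LOWER = np.array([150, 80, 80])
UPPER = np.array([179, 255, 255])

def track_pompom(frame_bgr):
    """Return the (x, y) centroid of the largest matching blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```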

See for yourself –

We are on to drawing games, Sketch Tech. He draws a cow that is supposed to land on a banana shaped moon. He succeeds!

Using a face detector from Japan, here is a head-tracking game: a green ball hangs from his mouth like a pendulum, and with circular movements of his head he swings it around while trying to keep it balanced…
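(Editor’s aside: the balancing act is essentially a pendulum whose pivot is the tracked head position. A toy sketch of the per-frame update, with the face detector abstracted away – you would feed in the head’s horizontal acceleration, estimated by finite-differencing its detected position; the constants and units here are illustrative only.)

```python
import math

G = 9.81       # gravity
LENGTH = 0.3   # pendulum (string) length, in the same units as the acceleration
DT = 1 / 30.0  # one video frame

def step_pendulum(theta, omega, pivot_accel_x):
    """Advance the pendulum one frame.
    theta: angle from straight-down, omega: angular velocity,
    pivot_accel_x: horizontal acceleration of the head (the pivot)."""
    # Accelerating the pivot adds a pseudo-force term in the pivot's frame.
    alpha = (-G * math.sin(theta) - pivot_accel_x * math.cos(theta)) / LENGTH
    omega += alpha * DT
    omega *= 0.995            # a little damping so the ball eventually settles
    theta += omega * DT
    return theta, omega
```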

Eye of Judgment, a game that came out last year (bought out by Sony), relied on marker-based augmented reality technology. It is similar to a memory game, played with cards, a camera, and a computer.

We are starting to wrap up and Diarmid summarizes, credits Pierre for setting up all the hardware, and opens the floor for questions.

Question: How do you make the game interesting when you’re doing similar gestures over and over again…

Diarmid: When the game is robust and responsive – you’ll be surprised how long you can play the game and try to be better.

Blair MacIntyre (from the audience): Robust and learnable is what makes the game fun over time.

Question: Is there anything more you can tell us about the depth camera? Will it be available soon to consumers?

Diarmid: No.

The crowd bursts into laughter.

Blair (jumps in from the audience): There is a company called 3DV in Israel which offers such a camera. It isn’t cheap, nor as good as discussed before, but you can get it.

Q: What’s special about camera games beyond novelty?

Diarmid: The two novel aspects of camera games are that they let you see yourself and that you can do away with the controller. Camera games are also great for multiplayer.

Q: Is there a dream game you’d like to see?

Diarmid: Wow, that’s hard… Before Sony, I worked on a game called The Thing, based on Carpenter’s movie. It was all about trust. The camera suddenly opens up the ability to play with that. When people see each other, the person-to-person interaction is very interesting and hasn’t been explored in games.

Q: Will we see camera games on PSP?

Diarmid: There is a game in development, and I don’t know if I can talk about it…

Q: When I look in the mirror I am not so comfortable with what I see… how do you handle that?

Diarmid: We flip the image. It’s hard to handle a ball when just looking at the mirror.

And that’s a wrap! Standing ovation.

~~~

After party shots…


Live from ISMAR ’08: Augmented Reality – What Users Are Saying

Everyone is back from lunch and the afternoon session is on: User studies in augmented reality.

First on stage is Benjamin Avery to talk (with an animated Australian accent) about User Evaluation of See-Through Vision for Mobile Outdoor Augmented Reality.

The study took users outdoors in various scenarios to test the performance of AR vision using see-through displays. They then compared it with a second group that watched the video on a desktop computer.

[link to paper, videos, images to come]

The results demonstrate a complex trade-off between AR and desktop visualizations. The AR system provided increased accuracy in locating specific points in the scene; even though the AR visualization was quite simple, it beat the desktop in tracking and in overall visualization.

Stay tuned for the demo (which was hauled all the way from Australia to Cambridge)!

~~~

Next on stage is Cindy Robertson from Georgia Tech (honorable mention at ISMAR 2007) to discuss An Evaluation of Graphical Context in Registered AR, Non-Registered AR, and Heads-Up Displays.

How are users affected when there are many registration errors – in other words, when tracking is not perfect? Can the user handle it better if graphical context is provided?

They tested it with a set of tasks involving placing virtual Lego blocks, with groups using registered AR, non-registered AR, and heads-up displays.

Following an exhaustive analysis of the resulting data, they uncovered the following insights:

  • Head movement and memorization increased performance
  • Head movement affected perceived mental workload and frustration
  • Graphics obstructing your view, and having to switch between them and the real world, is frustrating
  • The HUD-visible case was surprisingly faster than the other cases – but people hated it…

Final conclusion: registered AR outperformed both non-registered AR and graphics displayed on a HUD; non-registered AR does not offer any significant improvement.

Future plans are to test home-like scenarios and impose more complex tasks.

~~~

On stage Mark Livingston is getting ready to talk about The Effect of Registration Error on Tracking Distant Augmented Objects.

A basic assumption is that registration errors limit the performance of users in AR. “We wanted to measure the sources (such errors include noise, latency, and position and orientation error) and see the effect on the user – and then be able to write requirements for future systems.”

For this study, they used the nVisorST.

The tasks were designed to measure the user’s ability to understand behaviors and maintain situational awareness in the AR application: following a target (a car) when buildings stand in between.

The conclusions are straightforward, though somewhat surprising:

  • Latency had a significant effect on performance and response time – it was the worst offender
  • Noise was disliked but did not have a significant impact on performance
  • Orientation error didn’t have a significant effect
  • Weather had a significant impact on results: darker weather delivered improved performance; brightness was a major distraction

===============

From the ISMAR Program

User Studies (from ISMAR ’08 program)

  • User Evaluation of See-Through Vision for Mobile Outdoor Augmented Reality
    Benjamin Avery, Bruce H. Thomas, Wayne Piekarski
  • An Evaluation of Graphical Context in Registered AR, Non-Registered AR, and Heads-Up Displays
    Cindy Robertson, Blair MacIntyre, Bruce Walker
  • The Effect of Registration Error on Tracking Distant Augmented Objects
    Mark A. Livingston, Zhuming Ai

Live from ISMAR ’08 in Cambridge: Enjoy the Weather…

“Enjoy the weather,” a kindhearted British witch (a.k.a. air hostess) uttered sarcastically as we were leaving the aircraft; surprisingly, we did on the first day. We were then promised this was accidental and surely the last day of summer. Splendid.

Venice, Italy? Nope, Cambridge, UK!

I have landed in Cambridge, UK (where people go to augment their reality), and everything I ever heard about it is true: British meadow green, majestic 600-year-old buildings, cosmopolitan young folks, fish cakes… a combination that gives this university city its unique aura, and a great setting for the event starting tomorrow – reality, only better, at ISMAR ’08.

St. Catherine College – can’t ask for a nicer place to stay…

For those who couldn’t make it, stay tuned for a live coverage of ISMAR ’08, the world’s best augmented reality event.

Featuring AR pioneers such as: Tom Drummond, Paul McIlroy, Mark Billinghurst, Blair MacIntyre, Daniel Wagner, Wayne Piekarski, Uli Bockholt (Fraunhofer IGD), Peter Meier, Mark A. Livingston, Diarmid Campbell, David Murray, Rolf R. Hainich, Oliver Bimber, Hideo Saito and many more —

— covering topics such as: industrial augmented reality, hand-held augmented reality, displays, user studies, applications, layouts, demos, state-of-the-art AR, and don’t miss the highly anticipated tracking competition.

Welcome all speakers and attendees to the event, and don’t forget: look right first!

If you are at the event (or not) and want to chat, share thoughts, or ask questions – leave a comment here or send a message on facebook.

Augmented Reality Panel at Virtual Worlds Conference

For those of you who missed Virtual Worlds last week in LA, and therefore found it hard to attend the Augmented Reality Panel – David Orban was there for you.

Here are all 59 minutes of the discussion he recorded and uploaded. Thanks David!

Roo, the panel host, brilliantly captured a blow-by-blow account of the discussion and included the slides that were used to illustrate the panel. Kudos to Roo!

From the event program:

Augmented Reality: Virtual Interfaces to Tangible Spaces
Marc Goodman, Director, Alcatel-Lucent
Eric Rice, Producer, Slackstreet Studios
Blair MacIntyre, Associate Professor, School of Interactive Computing, Director, GVU Center Augmented Environments Lab, Georgia Institute of Technology
David Orban, Founder & Chief Evangelist, WideTag, Inc.
Andrew (Roo) Reynolds, Portfolio Executive for Social Media, BBC Vision (moderator)

Now repeat after me this tongue twister: I virtually participated in the Virtual Worlds panel about augmented reality and I really augmented my world.

Next, we are off to ISMAR ’08 September 15-18th in Cambridge, UK!