Augmented Tattoos

Last spring I worked with a friend of mine on a webcomic about augmented reality.  The artistic quality wasn’t out of this world, but it was fun to do for a few months.  Unfortunately, she had other obligations and couldn’t continue the series.  But seeing the video below reminded me of a mini story arc we did involving a tattoo marker.  I thought this was a good time to bring back that panel series alongside the actual video.

And now for the real augmented tattoo video.

If anyone is a decent artist, I’d love to resurrect the web comic.  Drop me a line if you’re interested.

7 Ways Augmented Reality Will Change Your Brain

Sarah adjusted her new Oakley TrueVisions and slipped between the subway doors as they closed.  They pinched her nose, but she knew she’d get used to them like she had her previous pair.  Her ongoing chess game with Malcolm blinked, so she activated it and surveyed the board.  He’d taken her pawn, but she knew it’d been right to sacrifice.   She just didn’t know the next move.

The train car shimmied and rattled as it bent around the curve, but Sarah was focused on the board hovering before her that only she could see.  Finally, she decided to be aggressive and put her knight in an attacking position.  With her decision made, the program scored her play as a novice move and showed her four other possible maneuvers she could have made.  She wasn’t a good chess player yet, but she knew if she kept at it, she’d be a Master in half the time it took her father.

Off the train and back at street level, she brought up the free way-finder layer for the best route to the interview.  A smattering of advertisements appeared, but so did a bright yellow line leading her to the Centurion Building.  Follow the yellow brick road, she thought.

A barrage of messages came flooding through and Sarah started reading them.  She’d gotten up early for the interview and all her friends were just waking up.  Her mind busy with the replies and following the yellow line, she almost stepped out into traffic when a burly gentleman in a navy woolen sweater grabbed her arm.

“Thank you,” she mumbled and waited for the light to turn.

Sarah stopped at the coffee shop and reviewed the menu.  Her coffee-helper app flooded her screen with information about calories and cost per ounce and fat content and the effect on the environment.  The wealth of information made her head swim, so she ordered a double chocolate latte with extra cream.  She regretted it as soon as she paid.  Her drink was half her calorie count for the day and she wasn’t at her weight goal yet.

Halfway up the elevator in the Centurion Building, Sarah brought up her notes to review for the interview.  She was led right in and sat down.  Sarah had prepared all night for the interview with Centurion Capital and she just knew she’d ace it.  Centurion was a forward-thinking company, so they actually wanted you to interview with your AR specs on.

With her notes hanging mid-air to her left and the facial-expression-reader app overlaid on her interviewer’s face, she knew she out-gunned him.  The interview started well, but Sarah got distracted by each change in his face.  Often she would respond to the change in the interviewer, he would respond to her, and she got lost in the emotional feedback loop.  The notes helped some, but they were organized helter-skelter, so at times she reacted slowly as she searched them and spoke without feeling.

By the end of the interview, she knew she’d blown it.  Dejected and with a migraine coming on, Sarah found herself back at the coffee shop ordering the other half of her calorie load.  At least tomorrow she had two new interviews to make up for the one she’d blown today.

***************************************************************************************

Augmented reality pundits, myself included, purport that the nascent technology will change our lives.  But really, that’s what technology is all about.  Even the lowly vacuum cleaner was sold as a way to free the housewife from her oppressive chores.

A better question might be to ask how augmented reality will change our lives, and more importantly, how it will change our brains.  The brief snapshot in the story above illustrates the effects of a ubiquitous computing environment.  But to truly understand, we have to go deeper into the actual brain matter and watch how AR might change it.

1. Learning Optimum Strategies

The brain learns by failure.  In the 1970s, Wolfram Schultz, a neuroscientist at Cambridge University, ran experiments on monkeys, measuring their dopamine levels while they received rewards for behavior.  Monkeys were given a shot of juice as a reward for moving a joystick in a particular direction.  Once the behavior was established, dopamine in the monkeys’ brains rose before they actually got the juice, right after they’d pulled the joystick in the correct direction, further reinforcing the behavior.  The dopamine is a predictor of success, but if the brain doesn’t actually receive the reward, it becomes frustrated and weakens the link to the behavior.

This is the essential strategy for learning, or as Niels Bohr once put it, “An expert is a man who has made all the mistakes which can be made, in a narrow field.”  The brain is constantly trying to reduce this “error-signal” by finding the optimum strategy.

What AR has to offer in the future is a constant computing environment that can help us detect our failures more quickly, so we can reduce that error-signal.  World-class game players constantly and ruthlessly evaluate their performance, looking for their failures so they can adjust.  An always-on computer that understands your world and acts as a conduit for your habits can, with the right algorithms, help optimize your performance.
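
To make the idea concrete, here is a minimal Python sketch of error-signal learning, the same reward-prediction-error principle behind Schultz’s dopamine findings.  The AR-coach scenario and the reward numbers are hypothetical; this illustrates the update rule, not any real app’s algorithm.

    # Minimal sketch of error-signal learning (reward prediction error).
    # Hypothetical AR-coach scenario: score each attempt and keep a running
    # estimate of how well a strategy is working.

    def update_value(expected, actual_reward, learning_rate=0.1):
        """Nudge the expected payoff of a strategy toward the reward it produced."""
        prediction_error = actual_reward - expected   # the "error-signal"
        return expected + learning_rate * prediction_error

    value = 0.5                                  # initial guess for a strategy's payoff
    for reward in [1.0, 0.0, 1.0, 1.0, 0.0]:     # feedback from five attempts
        value = update_value(value, reward)
    print(round(value, 3))                       # drifts toward the true success rate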

2. Overthinking the Pitch

Now imagine a minor league baseball player has read the first item on the list and decided he’s going to use that knowledge to become a better hitter.  He has a whiz-kid younger brother who can program like the wind, and the kid builds him a hitting app to use with his conspicuous AR specs.  When he steps up to the plate, he’s presented with a ton of angles and information, based on the pitcher’s body language, telling him what kind of pitch to expect.  As the arm comes around, lines draw in where the ball is expected to go, including a predictive MPH meter hanging over home plate.  The hitter calculates where he thinks he needs to swing and starts the follow-through.

What happens?  I propose that he swings and misses, and over time his batting average goes down.  Why would this be?  More information helped my chess girl learn.  But in a fluid game where reactions happen in milliseconds, you can overthink the pitch.

One of the most famous chokes in sports history came on the last hole of the 1999 British Open, courtesy of Jean Van de Velde.  His mistake?  He started thinking about how he was going to swing and stopped letting it happen.  Analytics can help us learn from our mistakes after the fact, but in the heat of battle, our gut knows best.

3. Don’t AR and Walk

We all know that it’s dangerous to drink and drive.  And most of us probably even know that it’s dangerous to text and drive (as this article points out, it’s even more dangerous than drunk driving.)

But what about walking and using AR?  We’ll be living in an information-rich environment and interfacing with our computers in ways that will tie up our brains.  The recent bestseller SuperFreakonomics points out that, mile for mile, walking drunk is eight times more likely to get you killed than driving drunk (of course, you also put other lives in danger when you drive drunk, so don’t do it.)  If distracted AR walking is anything like drunk walking, it could become a hazard more dangerous than drunk driving.

If AR becomes the norm, will we see laws enacted prohibiting accessing our AR specs while walking?  There have been stranger prohibitions.

4. Too Much Information Leads to Poor Decisions

Information is a blessing and a curse.  The key to making it work is finding the “right” information.  I often teach people struggling with information overload at work to find the “actionable” information and not to get lost in the “vanity” information.

A simple experiment conducted by Paul Andreassen on a group of lowly MIT business students illustrated this point.  He split the subjects into two groups and asked them to pick stocks to maximize their profits.  One group was given only the changes in the prices of their stocks; the other was given access to cable channels and expert analysis.  The low-information group made double the money of the information-rich group.  The wrong information is more dangerous than no information at all.

Augmented reality proposes to be an information rich environment.  So we’d better be careful if we try to drink from the fire hose.

5. Increasing our Emotional Quotient

Reading people is one of the skills within the Emotional Quotient (EQ) set.  One of the basics is the ability to understand the emotional state of other people.  This makes us more capable of utilizing the right methods for getting what we want.

Paul Ekman pioneered the rules for reading and interpreting facial expressions in a five-hundred-page document called the Facial Action Coding System (FACS.)  The purpose of FACS is to reveal to researchers the inner mind of their subjects during experiments.  Another researcher, John Gottman, uses this system to tell whether a marriage will last just by watching a couple for five minutes.  He has an amazing track record of correct predictions.

But what if we use computer recognition to apply the FACS live?  Then we can create training systems that help us understand what people are thinking.  This can lead to a better understanding of people and increase our EQ.  Or it could become distracting and impair our ability to hold a conversation, leading to people not wanting to talk to us.
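
As a toy illustration of how such a training system might work, here is a short Python sketch that maps detected FACS action units (AUs) to a likely emotion.  The AU combinations are commonly cited pairings (for example, AU6 plus AU12 for happiness), and the face tracker that would supply the detected AUs is assumed, not provided.

    # Toy sketch: map detected FACS action units (AUs) to a likely emotion.
    # A real AR system would get the AU set from a face tracker; here it is
    # supplied by hand for illustration.

    EMOTION_AUS = {
        "happiness": {6, 12},        # cheek raiser + lip corner puller
        "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
        "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
        "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
    }

    def guess_emotion(detected_aus):
        """Return the emotion whose AU pattern is fully present in the detected set."""
        best, best_size = "neutral", 0
        for emotion, aus in EMOTION_AUS.items():
            if aus <= detected_aus and len(aus) > best_size:
                best, best_size = emotion, len(aus)
        return best

    # e.g. the (hypothetical) tracker reports AUs 6 and 12 on the interviewer:
    print(guess_emotion({6, 12}))   # -> happiness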

6. More Uptime of Learning

The book Outliers by Malcolm Gladwell talks about the magic number of 10,000 hours needed to become an expert.  An always-on computing environment with a screen hanging before our eyes can give us more opportunity to hit that number as we continue our work in the blank spaces between other events (meeting surfing will become even worse.)

Combined with the rapid feedback of error-signal learning, we can become experts in the field of our choice in less time than ever.  Unless your field is firing arrows into apples from a speeding horse while riding backwards.  Then, I’m afraid, you’re going to have to do it the old-fashioned way.

7. Information Makes You Fat

If I asked you to memorize a seven-digit number and then, while your mind was tied up with the task, offered you a piece of cake or a bowl of fruit, you’d be twenty-two percent more likely to choose the cake.  The human mind is only capable of holding about seven pieces of data at any one moment (from psychologist George Miller’s essay “The Magical Number Seven, Plus or Minus Two.”)

When the cognitive brain is tied up memorizing the seven digits, our impulse control is reduced.  This happens because working memory and rational self-control both rely on the prefrontal cortex.  It would also help explain why millions of World of Warcraft players gain so much weight, and it’s not just because they’re sitting for hours at a time.

As we gorge on an information-rich diet in our augmented world, we should be careful that we’re not so mentally busy that we gorge on our impulses as well.

**********************************************************************************************

Augmented reality is really about information floating before our eyes, always on, always there.  Like many things in life, moderation is the key to getting the most out of it.  And while this forward look at the dangers and opportunities for our brains assumes that we’ll be wearing glasses and watching the latest Christina Aguilera video while picking our daily stocks and making our grocery list, it doesn’t have to be about augmented reality at all.  Information is both a blessing and a curse, and it’s around us all the time, even now.  Augmented reality will just make us a bit more hyper-aware of it.

If you liked the little vignettes of thought about information, you might like the following books that most of this research came from:  Blink by Malcolm Gladwell, Outliers by Malcolm Gladwell, SuperFreakonomics by Steven D. Levitt and Stephen J. Dubner, and How We Decide by Jonah Lehrer.

Your Face Is A Social Business Card

Last July TAT (“The Astonishing Tribe”) posted a concept video of their augmented social face-card system (okay, I made that term up; what else should we call it?).  The video tickled the imagination, racking up over 400,000 views.

TAT has since teamed up with Polar Rose, a leading computer vision services company, to turn that concept into a reality.  The TAT Cascades system combined with Polar Rose’s FaceLib gives us this prototype called Recognizr.

It’s nice to see that the technology is coming together, but I wonder how the social and ethical repercussions will play out.  My guess is the only way this will truly work is by pairing it with one of the big social networks like Facebook or LinkedIn.  That way your privacy settings are already built into your social information.

But then it becomes less useful for business or conference settings, which is where I see the biggest use (and which is what they demonstrated in the concept video.)  So if access to the information connected to your face is customizable and controlled by you, then it probably won’t cause too much heartache.  Hopefully, we’ll see a real product from them soon.  Having the Recognizr system available at ARE2010 in June would be fantastic.

Metaio Releases Unifeye SDK

Metaio released their Unifeye Mobile Augmented Reality SDK at Mobile World Congress 2010.

The Unifeye® Mobile SDK is the world’s first and only software development kit for creating mobile augmented reality (AR) applications. The professional toolbox is supporting all major mobile platforms and features the latest image recognition technologies, 3D rendering for animations with real time interaction and optimized components for mobile hardware. With the Unifeye® Mobile SDK software it is possible to create fascinating marketing experiences, intuitive information design, mobile augmented reality games or innovative retail solutions. Based on the proven AR platform Unifeye® by metaio it is possible to easily develop and deploy solutions at the interface between the real and virtual world.

Having used their beta Unifeye software last year, I can attest to the ease of use.  However, I have not used their mobile software development version so there may be some differences.

Fancouver at the Winter Olympics

Yahoo! and augmented reality leader Total Immersion have come up with some nifty ways to bring consumers into the action at the world’s largest winter sporting event.  Yahoo!’s “Fancouver” exhibit enables passers-by to insert themselves into the festivities in a host of guises.  Kicking off yesterday, Feb. 12, Fancouver features an entertaining and versatile digital out-of-home display, with dual windows that use augmented reality (AR) face tracking and brochure tracking, respectively, to give fans a distinctly different view of the proceedings.

Metaio at Mobile World Congress 2010

In addition to the other AR happenings at Mobile World Congress 2010, Metaio will be featured at “Creation Day” with Sony Ericsson.  They’ll be showing off their latest feature tracking technology on a new Sony Ericsson device.

Peter Meier, the CTO of Metaio, will also be giving a speech on Wednesday at 4pm within the session “Mobile Innovation — A Vision of 2020.”  This session will “take a visionary look into the services and applications that mobile communication will provide in 10 years time and the impact they will have on the way we live and communicate in 2020.”  The latter half of the session will look at augmented reality.

Thanks to Jan from Augmented Blog for the update.  He promises some exciting releases and a movie after MWC2010 has concluded.

And once again, we won’t be able to attend – so if you’re there – keep us updated about your experience.


Augmented Reality at the Mobile World Congress

Next week, February 15-18, the Mobile World Congress takes place in Barcelona.  There will be a variety of AR-related events during the MWC.

AR Showcase

Christine Perey has organized an AR Showcase on Wednesday, February 17th, from 5:00 to 7:00, so AR companies can demonstrate their services and products to customers.  Designers will also have a chance to compare and contrast their products with the competition.  The following companies have confirmed their attendance:

You can find the Showcase in the northeast corner of the courtyard.  Announcements for the AR showcase can be tweeted to #arshow (changed for length.)

The Mobile AR Summit is an invitation-only event.  If you’re interested in joining, please contact Christine Perey at cperey@perey.com.  More information can be found here.

Other Related Events

Navteq Challenge
Sunday, 14.2.2010
Poble Espanyol, City Hall
Wikitude Drive is an Augmented Reality navigation system.  They are one of the 10 finalists at this year’s Navteq Challenge.
http://www.nn4d.com/site/global/market/lbs_challenge/about/emea/home/p_home.jsp

Mobile Premier Award in Innovation
Monday, 15.2.2010, 15:00 to 20:00
Petit Palau of Palau de la Musica
Mobilizy, with Wikitude, is one of the 20 finalists for the “Mobile Premier Award in Innovation”.  Martin Lechner, CTO of Mobilizy, will present.
http://www.mobilepremierawards.com/

AR Summit

Wednesday, 17.2.2010, 13:00 to 19:00
Location: to be announced.
Mobilizy CTO Martin Lechner will present a position paper, “ARML: an Augmented Reality Standard”.  ARML is currently being reviewed by the W3C (World Wide Web Consortium).  At 17:00 there will be a Wikitude Showcase presentation.

We won’t be able to attend – so if you’re there – keep us updated about your experience.


The Multi-Sensor Problem

Sensor systems like cameras, markers, RFID, and QR codes are usually used on their own to align our augments.  One challenge for a ubiquitous computing environment will be meshing the various available sensors together so computers have a seamless understanding of the world.

This video from the Technical University of Munich (TUM) shows us how a multi-sensor system works.  It appears to be from 2008 (or at least the linked paper is.)  Here’s the description of the project:

TUM-FAR 2008: Dynamic fusion of several sensors (Gyro, UWB Ubisense, flat marker, ART).  A user walks down a hallway and enters a room, seeing different augmentations as he walks on: a sign at the door and a sheep on a table.

In the process, he is tracked by different devices, some planted in the environment (UWB, ART, paper marker), and some carried along with a mobile camera pack (gyro, UWB marker, ART marker).  Our Ubitrack system automatically switches between different fusion modes depending on which sensors are currently delivering valid data.  In consequence, the stability of the augmentations varies a lot: when high-precision ART-based optical tracking is lost (outside ART tracking range, or ART marker covered by a bag), the sheep moves off the table.  As soon as ART is back, the sheep is back in its original place on the table.

Note that the user does not have to reconfigure the fusion setup at any point in time.  An independent Ubitrack client continuously watches the current position of the user and associates it with the known range of individual trackers, reconfiguring the fusion arrangements on the fly while the user moves about.

The project brings up an interesting question: is anyone else working with multi-sensor systems?  We know we’ll need a mix of GPS, local image recognition, and markers to achieve our goals, but is anyone working on this complex problem for a real product?  We’ve seen good image recognition with Google Goggles or SREngine, and GPS/accelerometer-based AR is popular, but I’d like to see an app use both to achieve its aims.
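
To make the switching idea concrete, here is a small Python sketch in the spirit of the fallback behavior described in the video (it is not the actual Ubitrack code): prefer the most precise tracker that is currently delivering valid data, and fall back gracefully when it drops out.  The sensor names and the priority order are assumptions for illustration.

    # Toy sketch of multi-sensor fallback: use the most precise tracker that
    # currently has valid data. Sensor names and priorities are hypothetical.

    SENSOR_PRIORITY = ["optical_marker", "uwb", "gps"]   # most to least precise

    def pick_pose(readings):
        """readings maps sensor name -> pose estimate, or None if no valid data."""
        for sensor in SENSOR_PRIORITY:
            pose = readings.get(sensor)
            if pose is not None:
                return sensor, pose
        return None, None   # nothing tracking: freeze or hide the augmentation

    # Example: the optical marker is covered, so the system falls back to UWB.
    readings = {"optical_marker": None, "uwb": (2.1, 0.4, 1.7), "gps": (2.0, 0.5, 1.8)}
    print(pick_pose(readings))   # -> ('uwb', (2.1, 0.4, 1.7))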

If you’re working on a multi-sensor project, we’d love to hear about it at Games Alfresco.

What the Movie Avatar Can Teach Augmented Reality

The biggest news about the movie Avatar has been the 3D experience and the way it’s blown the doors off previous box-office records.  The movie has garnered huge success because it pushed the boundaries of technology and told an interesting story.

I loved the movie and the way 3D helped give more perspective to the enviroment. My own Star Trek loving mother didn’t even realize the Na’vi were CGI. She thought they were people in blue suits (really… I’m not joking.) And though storytelling will become important to later advanced augmented reality applications, it’s not what I wanted to point out.

James Cameron is part art-dude and part tech-geek.  He waited years for the technology to ripen enough to make the movie the way he wanted.  One of the innovations he created for the movie was the Fusion camera system for the live-action sequences.  Normally, scenes are filmed in front of a green screen and the CGI is added afterwards.  The actors play a game of make-believe and the director has to guess at how the environment will unfold around them.  CGI-heavy movies tend to feel flat because the emotions are added later by the special-effects artists, not by the actors in the scene.  Cameron has changed all that.

The Fusion camera system is an augmented reality viewport into the CGI world.  When Cameron was filming the actors, he was able to direct them and see the results.  Looking through his camera, he could see them interacting with the world of Pandora as nine-foot Na’vi and help them tell the story.  The camera itself wasn’t even a real camera in the sense that it filmed the action; it allowed Cameron to see the action being recorded by multiple sensors and cameras.  Once the action was recorded, he could go back and reshoot it from a different perspective, even with the actors gone.

Facial expression was another hurdle they had to jump to make the movie work.  So they hung little cameras off the actors’ heads to capture their range of facial expressions and then tweaked algorithms to get the CGI faces to react correctly.  Even now, we can pull off this trick.

Together these systems are similar to an immersive augmented reality world. While we don’t have the HMDs, complete camera access and processing power to pull off the world of Pandora now, time and continued improvement will make lesser versions possible.

If you look at the Fusion camera system, the camera is essentially the HMD display, albeit a large and bulky one. Multiple cameras, RFIDs and tracking markers help the computer understand the world, and complex and powerful computers put all the pieces together. I can only imagine that this system could be turned into a mind-blowing game in an empty warehouse with the proper HMDs.

Essentially, the movie Avatar teaches us that augmented reality has sky-high practical possibilities.  All the components of Cameron’s Fusion system can be ported to the commercial world (not now, but in three or four years) and used to make complex and believable environments overlaid on our own world.

In the future, you too can be a nine-foot tall blue Na’vi and you won’t even have to have your soul sucked through a fiber-optic tree.

Insights Into Augmented Reality from Total Immersion

Total Immersion leads the augmented reality industry in total projects (around 125 last year and they’re expecting over 250 in 2010.)  They’ve successfully created world-wide campaigns like Coke Zero and the Avatar i-Tag game line.  So when they talk about augmented reality, I want to make sure I’m taking notes.  Iriny Kuznetsova from 2Nova interviewed Nicolas Bapst about the company and their current activities.  The interview was short, but had a few interesting insights.

Total Immersion has done work for the military in creating augmented reality solutions that put simulated objects on the battlefield.  This is a much cheaper alternative to war-gaming with real equipment.  Hopefully this encourages the military to fund more see-through AR HMDs. 

Total Immersion expects that AR mobile marketing will be the big trend of the coming year and showed off a brief demonstration.  They’re converting their PC software to mobile to take advantage of smartphone growth.  I found interesting Nicolas’ observation about how augmented reality marketing applications give companies direct access to their customers.  By moving people from static newspapers to the computer (and especially the smartphone), marketers can find out exactly who is interested in their product and then leverage social media to spread the word.  Nicolas explains that adding augmented reality content doubled the time visitors spent on websites.  I’m curious whether this increase will hold up as the novelty of augmented reality wears off.

Nothing ground-breaking here, but the interview is worth a few minutes if you’re not familiar with the company.