Super Ventures launches first fund and incubator dedicated to augmented reality


For the past 20 years, our team has been building an ecosystem around technologies we believe are bringing superpowers to the people: augmented reality, virtual reality and wearable tech. We believe these technologies are making us better at everything we do and will overtake personal computing as the next platform. To help breed these superpowers, we are launching Super Ventures, the first incubator and fund dedicated to augmented reality.

“Establishing an investment fund and incubator furthers our influence in the AR space by allowing us to invest in passionate entrepreneurs and incubate technologies that will become the foundation for the next wave of computing,” said Super Ventures Founder and GM, Ori Inbar.

Today we are announcing our inaugural fund of $10 million along with initial investments in several AR companies including Waygo, a visual translation app specializing in Asian languages, and Fringefy, a visual search engine that helps people discover local content and services. Our fund will also invest in technologies enabling AR, and we are separately backing an unannounced company developing an intelligence system to crowdsource visual data.

“I am extremely excited to be working with Super Ventures,” said Ryan Rogowski, CEO of Waygo. “The Super Ventures team has an impressive amount of operational experience and deep connections in the AR community that are sure to help Waygo reach its next milestones!”

“We feel each and every Super Ventures team member provides substantial advantages to Fringefy from technical aspects, to the user experience, and business strategy,” said Assif Ziv, Co-founder of Fringefy. “We are excited to tap into the Super Ventures team’s immense network and experience.”

“Super Ventures is building an ecosystem of companies enabling the technology that will change how computers — and humans — see and interact with the world,” said the CEO & co-founder of the unannounced company. “We are truly excited to be part of this brain trust, and to leverage the network and expertise of the Super Ventures team.”

The Super Ventures partnership is made up of Ori Inbar, Matt Miesnieks, Tom Emrich and Professor Mark Billinghurst.

Ori Inbar has devoted the past 9 years to fostering the AR ecosystem. Prior to Super Ventures he was the co-founder and CEO of AR mobile gaming startup Ogmento (now Flyby Media — acquired by Apple). He is also the founder of AugmentedReality.org, a non-profit organization on a mission to inspire 1 billion active users of augmented reality by 2020, and the producer of the world’s largest AR, VR and wearable tech event in Silicon Valley and Asia, AWE. Ori advises dozens of startups, corporations, and funds about the AR industry.

Matt Miesnieks most recently led an AR R&D team at Samsung and previously was co-founder & CEO of Dekko, the company that invented 3D AR holograms on iOS. Prior to this, Matt led worldwide customer development for mobile AR browser Layar (acquired by Blippar), and held executive leadership roles at Openwave and other mobile companies.

Tom Emrich is the founder of We Are Wearables, the largest wearable tech community of its kind, which has become the launch pad for a number of startups. He is also the co-producer of AWE. Tom is well known for his analysis and blogging on wearable tech as one of the top influencers in this space. He has used this insight to advise VCs, startups and corporations on opportunities within the wearable tech market.

Professor Mark Billinghurst is one of the most recognized researchers in AR. His team developed the first mobile AR advertising experience, the first collaborative AR game on mobile devices, and early visual authoring tools for AR, among many other innovations. He founded the HIT Lab NZ, a leading AR research laboratory, and is the co-founder of ARToolworks (acquired by DAQRI), which produced ARToolKit, the most popular open source AR tracking library. In the past he has conducted innovative interface research for Nokia, the MIT Media Lab, Google, HP, Samsung, British Telecom and other companies.

“The next 12–18 months will be seen as the best opportunity to invest in augmented reality,” said Matt Miesnieks. “Our goal is to identify early-stage startups developing enabling platforms and provide them with necessary capital and mentorship. Our guiding light for our investment decisions is our proprietary industry roadmap we’ve developed from our combined domain expertise. The companies we select will be the early winning platforms in the next mega-wave of mobile disruption.”

“Our team will take a hands-on approach with the startups, leveraging our industry experience, research knowledge and networks to help them meet their goals while part of the incubator,” added Professor Mark Billinghurst.

“Unique to Super Ventures is the community we have fostered which we will put to work to help support our startups,” said Tom Emrich. “Our community of over 150,000 professionals not only gives us access to investment opportunities early on, it also offers a vast network of mentors, corporate partners and investors which our startups can rely on to succeed.”

Join us in bringing superpowers to the people.

Optinvent Unveils Consumer-Oriented ORA-X Smart Glass

Optinvent Unveils the Smart Glass that doesn’t make you look like a Creep


The ORA-X is a revolutionary mobile device in a disruptive form factor – Forget “Smart Glasses” that look geeky. Enter a new category: Smart Headphones – running Android – with high quality audio – and an adjustable see-through display.

 

January 05, 2014, Las Vegas, NV. – Optinvent, a world leader in smart digital eyewear, unveils today, for the first time anywhere, the design of its revolutionary ORA-X smart headphones.

Targeting music lovers on the go, the ORA-X is a brand new category that is part smart glass, part high-end wireless audio headphones. It runs Android and will feature high-quality audio as well as a disruptive see-through retinal projection technology.

“Smart Glasses have been plagued by what I call a ‘paradigm prison’. You can’t make a fashion accessory out of them no matter how hard you try. Consumers are just not ready to embrace looking geeky for the added functionality,” says Kayvan Mirza, CEO and Co-Founder of Optinvent. “The ORA-X is a clean break from this paradigm. Now not only can you hear music, but you can ‘see music’. And that’s just the tip of the iceberg. It’s also about hands-free mobile computing without looking like a cyborg. It’s about ‘augmenting your senses’ while still looking stylish.”

Imagine watching music videos and clips on the go, searching the web, video conferencing, taking pictures, sharing on social networks, GPS, and all the other uses that smart glasses promise… without looking like a cyborg. Traditional headphones can give you high-quality audio, but their functionality hasn’t evolved much beyond that. However, they have become a fashion statement, and people can be seen flaunting these colorful head-worn accessories. The ORA-X wants to go further by adding vision. This is a new and compelling user experience, and this revolutionary device could mean the end of the “regular” headset. Why just hear when you can see?


The ORA-X is scheduled to be released in 2015. It will run standard Android apps – just like any standalone smartphone or tablet device. It is based on Optinvent’s cutting edge display technology and includes high end acoustics.

On the hardware side, specifications include:

  • Large, transparent virtual display
  • High fidelity speakers w/ active noise cancellation
  • Microphone (for calls and voice commands)
  • Front-facing camera
  • 9 axis motion sensor
  • Wireless connectivity (Bluetooth, Wi-Fi, GPS)
  • Trackpad (mouse and swipe) for tactile interactions with the device
  • High capacity Li-Ion rechargeable battery
  • Powerful microprocessor and GPU with enough memory to support complex applications

For more information, please visit http://www.optinvent.com

About Optinvent

Optinvent is a world leader in digital eyewear and see-through retinal projection technology. Optinvent’s team has 20+ years of experience in the field of consumer electronics and is recognized in the industry for developing cutting edge patented technologies and products.

 

Press contact:

Maiwenn Regnault

maiwenn@oxygen-pr.com

(415) 609-0140

Glass Explore and Explorer


Google I/O kicked off today with not much fanfare around Glass. From a pure awareness standpoint, Glass is the best thing that happened to Augmented Reality since the iPhone. And as a champion of the Augmented Reality industry from way back in 2007 – I am an avid supporter.

But Glass Explorers* make me angry (*users of the Google Glass prototype.)

I am not angry at Explorers because they love to walk on the street with Glass so that passersby stop and ask them about it (although passersby just want to take selfies with Glass).


I am not angry at Explorers because they love getting into bars just to be denied service. Nor am I angry because they drive cars with Glass just to annoy highway patrol officers.

And you know what, I am not even angry at their eagerness to pay an exorbitant amount of money to be testers in the most expensive beta program ever.

All that doesn’t bother me so much.

As Jon Stewart says: Intolerance shouldn’t be tolerated.

You know why I am angry at Glass Explorers?

Because they totally mistake the purpose of wearing it.

In the Daily Show’s “Glass Half Empty” segment a Glass Explorer explains: “it’s basically a cell phone on your face.”

Ugh!


“Make calls, get email, surf the internet…accessibility to everything on your cell phone” but now “right there on your eye”.

This is a bad case of skeuomorphism. Arrrgh!


Skeuomorphism: a dial phone on a touch screen!?

Can’t escape the comparison to Dumb and Dumber.

The “Glass Half Empty” Explorer argues: “With Glass you maintain in the here and now…”

So far – that’s brilliant. With Augmented Reality you Play in the Now.

But then he continues: “when I check messages I am looking in your general direction – I am not distracted.”

Just when I thought you couldn’t possibly be any more explorer. Or dumber.


My friend (and I mean it from the bottom of my heart), if you are reading a text message while talking to me – you ARE distracted. And looking in my GENERAL direction is like farting in my general direction.

Maybe these are just run-of-the-mill explorers regurgitating talking points.

So I asked a [very] senior [and very smart] member of the Google Glass team what’s the most compelling Glass app he’s seen so far. He didn’t flinch when he answered: “Texting.”

Wah-What!?

This makes me mad!!!

The second Law of Augmented Reality design clearly states “Augmented Reality must not distract from reality”.

Second law of AR design

If it does distract you – it ventures into virtual reality which is an escape from the real world. The fundamental purpose of Augmented reality is to make you more aware of the real world and make things around you more interactive. Because in an interactive world everything you do is more engaging, productive, and fun.

The Simpsons’ Days of Future Future episode warns us about the consequences of not paying attention to the real world:

Epilogue

An incident that brought my anger to a head: a senior member of the Glass team who recently participated in a Glass Class at AWE 2014 didn’t agree to be videotaped or mentioned by name, while at the same time he was wearing Glass and [could have] recorded us all…

Aaarrrggggh!!

When I calm down, I’ll show what I consider good uses of Augmented Reality.

In the meantime check out over a hundred videos from AWE 2014 – the world’s largest event focused on Augmented Reality, Wearables, and the Internet of Things.

Guest Post: Harnessing the Power of Human Vision

Harnessing the Power of Human Vision

By Mike Nichols, VP Content and Applications at SoftKinetic

For some time now, we’ve been in the midst of a transition away from computing on a single screen. Advances in technology, combined with the accessibility of the touch-based Human Machine Interface (HMI), have enabled mobile computing to explode. This trend will undoubtedly continue as we segue into more wearable Augmented Reality (AR) and Virtual Reality (VR) technologies. While AR and VR may provide substantially different experiences from their flat-screen contemporaries, both face similar issues of usability. Specifically, what are the most accessible ways to interact with these devices?

The history of HMI development for both AR and VR has iterated along similar paths of using physical controllers for user navigation. Although physical controls have been a necessity in the past, if they remain the primary input, these tethered devices will only serve as shackles that prevent AR and VR from reaching their full potential as wearable devices. While physical control devices can and do add a feeling of immersion to an experience, particularly in gaming, you would no more want a smartphone that was only controllable via a special glove than you would want to control your smart glasses through a tethered controller. As the technology for AR and VR continues to evolve, it will eventually need embedded visual and audio sensors to support the HMI. In particular, it will need visual sensors to support a full suite of human interactions that integrate with our daily activities in a more natural and seamless way than our mobile devices do today.

In Depth

A depth sensor is the single most transformative technology for AR and VR displays because it is able to see the environment as 3-dimensional data, much like you or I do with our own eyes. It’s the key piece of technology that provides us with the building blocks needed to interact with our environments – virtual or otherwise. The depth sensor allows us to reach out and manipulate virtual objects and UI by tracking our hands and fingers.

A scene’s depth information can be used for surface and object detection; graphics can then be overlaid relative to any surface, at the correct perspective for our head’s position and angle. Depth recognition combined with AR and VR presents a profound change from the way we receive and interact with our 2D digital sources today. To simulate this effect, the video below shows an example of how a process known as projection mapping can transform even simple white cards in astonishing ways.

“Box” Bot & Dolly
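
To make the surface-detection idea above more concrete, here is a minimal sketch in Python. It is not SoftKinetic’s implementation; the function names and the pinhole-camera intrinsics are assumptions for illustration. It back-projects a depth image into a 3D point cloud and finds the dominant planar surface with a small RANSAC loop, the kind of building block an AR system can anchor graphics to:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-space 3D points
    using a pinhole model; zero-depth pixels are treated as invalid."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0
    u, v, z = u.ravel()[valid], v.ravel()[valid], z[valid]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def fit_dominant_plane(points, iters=200, tol=0.01, seed=0):
    """Tiny RANSAC: find the plane (unit normal n, offset d) with the most
    inliers, i.e. the dominant surface (floor, wall, tabletop) in the cloud."""
    rng = np.random.default_rng(seed)
    best_count, best_plane = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample, skip it
            continue
        n /= norm
        d = -n.dot(sample[0])
        count = int((np.abs(points @ n + d) < tol).sum())
        if count > best_count:
            best_count, best_plane = count, (n, d)
    return best_plane, best_count

# Example with a synthetic flat scene 1.5 m in front of the sensor.
depth = np.full((240, 320), 1.5)
cloud = depth_to_points(depth, fx=300, fy=300, cx=160, cy=120)
plane, inliers = fit_dominant_plane(cloud[::50])   # subsample for speed
```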

It’s not hard to imagine how AR combined with depth can be used to transform our view of the world around us – not only augmenting our world view with information, but even transforming live entertainment such as theater, concerts, sporting events, and even photography.

Take a more common example like navigation. Today, when we use our smartphones or GPS devices to navigate, our brain has to translate the 2D information on the screen into the real world. Transference of information from one context to another is a learned activity and often confusing for many people. We’ve all missed a turn from time to time and blamed the GPS for confusing directions. In contrast, when navigating with depth-enabled AR glasses, the path will be displayed as if projected into the environment, not overlaid on a flat simulated screen. Displaying projected graphics mapped to our environment creates more context-aware interactions and makes it easier to parse relevant information based on distance and viewing angle.
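
As a rough illustration of that last point, the sketch below shows how a world-space navigation waypoint could be projected into the pixel coordinates of head-worn glasses so that a turn marker is drawn where the path actually lies. The function and parameter names, the pinhole display model, and the known head pose are all assumptions for illustration, not a description of any particular product:

```python
import numpy as np

def project_waypoint(p_world, R_wc, t_wc, fx, fy, cx, cy):
    """Project a world-space waypoint into display pixel coordinates.

    R_wc, t_wc: head pose as a world-to-camera rotation and translation.
    Returns (u, v) pixels plus the distance to the waypoint, or None
    when the point lies behind the viewer and nothing should be drawn."""
    p_cam = R_wc @ p_world + t_wc
    if p_cam[2] <= 0:
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v), float(np.linalg.norm(p_cam))

# Example: a turn marker 5 m ahead and 1 m to the right of the wearer.
result = project_waypoint(np.array([1.0, 0.0, 5.0]),
                          np.eye(3), np.zeros(3),
                          fx=500, fy=500, cx=320, cy=240)
if result is not None:
    (u, v), dist = result
    print(f"draw turn arrow at ({u:.0f}, {v:.0f}), {dist:.1f} m away")
```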

Bridge the gap

As we look to the future of AR and VR, both will certainly require new approaches to enable an accessible HMI. But that won’t happen overnight. With commercialized VR products from the likes of Oculus, Sony and more coming soon, we’ll have to support an interactive bridge to a new HMI through existing controllers. Both Sony and Microsoft already offer depth cameras for their systems that support depth recognition and human tracking. The new Oculus development kit includes a camera for tracking head position.

Over the next few years we’re going to learn a lot about which interactions work well and which do not. With the technology advances needed to make commercial AR glasses feasible as a mass-market option still some way off, it’s even more important to learn from VR. Everything done to make VR more accessible will make AR better.

Stay tuned for our next guest post, where we’ll take a closer look at how depth will provide a deeper and more connected experience.

4 years at ARNY: Augmented Reality meetup celebrates 1500 members, 200 demos, new startups.

I founded the Augmented Reality New York meetup (ARNY) exactly 4 years ago as a labor of love, and it developed a life of its own: attracting nearly 1500 members, introducing 200 Augmented Reality demos, helping advance AR in NYC, creating partnerships, helping AR enthusiasts find jobs, and spurring some fantastic AR startups.

How did we celebrate last night’s ARNY?

With a fantastic collection of speakers and demos from all over the world: Israel, Canada, Colombia and New York City.

Huge shout out for our wonderful host Mark Skwarek at NYU Poly!

1) Brendan Scully – Metaio – first-ever SLAM demo on Google Glass from the Augmented Reality Company

2) Niv Borenstein – Kazooloo – A truly fun to play Augmented Reality game-toy combination

3) Keiichi Matsuda – Hyper-Reality – A new vision for the future with plenty of Augmented Reality goodness. Back him on Kickstarter!

4) Dhan Balachand – Founder and CEO, Sulon Technologies – a new head mounted console that will fully immerse players into games where the real world and the virtual world become one.

5) Ori Inbar – The latest and greatest from around the world of augmented reality

AR on Sony Playstation 4 featuring Augmented Reality on mainstream TV

MIT Tangible Interfaces: inForm – Interacting with a dynamic shape display

8Tree – fastCHECK is a revolutionary surface inspection system that is amazingly easy to use, combining a laser scanner with a projector to help inspect aircraft.

2013 DEMO Gods winner review – Pristine – Delivering the next generation of telemedicine and process control solutions at the point of care through Google Glass.

Moto X box opening AR experience – Augmenting paper popup for storytelling

One Fat Sheep’s Hell Pizza Zombie game (New Zealand)

TWNKLS cool Augmented Reality SLAM demo  for maintenance and repair at Europort

Re+Public new AR mural app – resurrecting murals with Augmented Reality

Nikola Tesla app IndieGoGo launch by Brian Yetzer

Stay tuned for a full video of the entire event

Press Release: Sulon Technologies announces a New Head-Mounted Video Game Console that 1ups Traditional Consoles

Introducing GVX, an Xtreme Reality head-mounted gaming console that adapts the game to your surroundings, allowing you to Live The Game.

Richmond Hill-based game technology company Sulon Technologies, Inc. (Sulon) announces its revolutionary new product, GVX, a head-mounted gaming console that offers avid gamers the freedom to play anywhere, whether indoors or outdoors. Sulon’s proprietary technology involves applying advanced Augmented Reality (AR) and Virtual Reality (VR) technology to create the most realistic and immersive gaming experience available, enabling players to have a Star Trek ‘holodeck’ experience in their own living space and bringing to life traditional tabletop games in 3D on any flat surface.


Sulon’s solution to a tired console market is a brand new gaming device that introduces innovative gameplay by combining the benefits of console-quality gaming with a mobile platform. Sulon is a team of experienced engineers, product development specialists, researchers and science fiction and gaming enthusiasts who have discovered and proven how to make any physical environment into a “holodeck” zone. “I’ve always been interested in new technologies and how society is quick to absorb them into their everyday lives,” said Dhanushan Balachandreswaran, Founder and CEO of Sulon Technologies. “We are excited about creating technology that we originally thought to be fiction and turning it into a reality.”

The GVX system introduces the concept of Xtreme Reality (XR) defined as the one-to-one integration of the real world and the virtual world with the ability to scale the whole spectrum of AR to full VR. XR blurs the lines between the real and virtual worlds to actually place the player into their game by adapting their entire physical environment into the game world. GVX uses complex and adaptive algorithms, high end graphics processing, motion tracking and position tracking to achieve the XR experience. The system has the unique ability to map the actual environment as the physical characteristics and boundaries of the game environment. Unlike any other device available, it applies AR and VR to transform any space (even outdoors!) into a completely new game environment in real-time. It is a “wear and play” experience that expands a player’s gaming space from the area in front of their TVs and PCs to their entire home. GVX runs on the Android Operating System where applications can be developed quickly and games can span the entire spectrum of casual to hard core genres. XR games on GVX are classified as Active or Surface gaming.


Active games are adrenaline inducing and interactive, where the system conducts an accurate and rapid scan of the player’s entire environment to adapt it into the game world using sophisticated SLAM algorithms. GVX is also completely wireless with all devices communicating via Bluetooth or Wi-Fi, giving users complete freedom of movement. Players can now live their video game by physically exploring and interacting with the virtual environment. With Active gaming, space limitations are not an issue as the system is able to generate new graphics and scenarios to the same space multiple times during one gaming session, allowing for an endless number of new gaming experiences.

Surface games, on the other hand, are a creative throwback to traditional gaming where games are augmented in 3D onto any flat surface. Players can watch their cities actively grow, wage virtual wars against other players around the world or enjoy a tabletop game with friends and family. Surface gaming provides a new reach for social gameplay and a new avenue and meaning for social interaction. Family game nights can be made possible even when family members are not physically present. “GVX really gives game developers creative freedom to take advantage of all the functions and capabilities of the system when designing games,” said Dhanushan Balachandreswaran.

What also makes GVX a game-changer is that Active and Surface gaming are not mutually exclusive and can be combined to create innovative gaming experiences. Jumanji is a great example of how surface and active gaming can be combined. The game board is augmented in 3D on a flat surface (surface gaming aspect) and game events that occur would require players to interact with the virtual environment (active gaming aspect). In addition to the XR experience of Active and Surface gaming, GVX is also highly flexible and capable of playing existing games such as PC games, games specialized for stereo VR or mobile games like Angry Birds.

“The concept of mixed reality has been around for years but nobody has succeeded in making it real,” said Jackie Zhang, Vice President of Research and Development at Sulon Technologies. “With the latest technologies and innovation, we have made this concept possible. It’s a disruptive product whose limitation is the bounds of your imagination.”

GVX also features a removable component (the GVX Player) that offers players a variety of gaming options. The GVX Player can be used on its own to play existing mobile games from the Google Play store as well as connect to an HDTV and a Bluetooth controller for those who enjoy traditional gaming on their television. These gaming options on the GVX Player can be combined with the XR experience on GVX (i.e. Active and Surface gaming) to create a multitude of unique one-of-a-kind gaming experiences. A game could feature as many or as few of these gaming combinations as desired. For example, a game could begin as a simple mobile game, but an in-game event may prompt the player to switch to Active gaming in order to fully experience the event. These flexible gaming options allow players to easily pick and choose their preferred gaming experiences.

GVX is also a consumer-friendly gaming device that eliminates many of the problems associated with adopting AR and VR technology. Surface games (AR application) on GVX are hassle-free and do not require physical markers. VR gaming on GVX is safe and does not induce motion sickness. GVX features an option where players can adjust the level of graphics opacity. Lowering the graphics opacity allows more of the player’s physical surroundings to appear in the game environment, ensuring safe gameplay and providing an alternative for players who are not ready for a full virtual environment. The opacity option, as well as the wand controller’s one-to-one true motion tracking, prevents motion sickness because the player’s movement in the physical world matches their movement in the game world.
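
For the curious, here is a hedged sketch of how such an opacity control could work in a video-passthrough setting. It is not Sulon’s implementation; the function and parameter names are made up for illustration. Rendered game graphics are alpha-composited over the camera feed, and lowering the opacity lets more of the player’s real surroundings show through:

```python
import numpy as np

def composite(passthrough, rendered, mask, opacity):
    """Blend rendered game graphics over a camera passthrough frame.

    passthrough, rendered: HxWx3 float images in [0, 1].
    mask: HxW coverage in [0, 1], 1 where game graphics were drawn.
    opacity: the user's slider in [0, 1]; lower values reveal more of
    the real room behind the graphics."""
    alpha = (mask * opacity)[..., None]
    return alpha * rendered + (1.0 - alpha) * passthrough

# Example: at opacity 0.5 the room and the game world are equally visible
# wherever graphics are drawn.
h, w = 480, 640
frame = composite(np.zeros((h, w, 3)), np.ones((h, w, 3)),
                  np.ones((h, w)), opacity=0.5)
```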

“GVX is truly a game-changer in the gaming and mobile space. Never before has a gaming platform been able to successfully bring to life that Star Trek “holodeck” experience in a simple and straight-forward product, while freeing the user from the constraints of a wired environment,” comments Ken Wawrew, the Chief Operating Officer at Sulon Technologies.

GVX developer kits are available for pre-order on the Sulon Technologies website.

For more information, visit www.sulontechnologies.com

Augmented World Expo™ (Formerly ARE) Opens Call For Proposals

Press release just announced:


Computing is changing inside out. The world is now the platform. ARE is now AWE.

Augmented World Expo™ (AWE) announced today that the world’s largest gathering of designers, engineers and business leaders dedicated to solving real-world problems in Augmented Reality is now accepting presentation proposals for the 2013 event. The deadline for proposal submissions is February 26, 2013.

The way we experience the world will never be the same. We no longer interact with computers. We interact with the world. A set of emerging interrelated technologies such as augmented reality, gesture interaction, eyewear, wearables, smart things, cloud computing, big data, and 3D printing are completely changing the way we interact with people, places and things. These technologies create a digital layer that empowers humans to experience the world in a more advanced, engaging, and productive way.

Augmented World Expo will showcase the best in augmented experiences covering all aspects of life: health, education, emergency response, art, media and entertainment, retail, manufacturing, brand engagement, travel, automotive, urban design, and more. It will be the largest exposition to bring together technologies for augmented humans in an augmented world.

Augmented Reality.ORG, the producer of AWE, is proud to announce the maturing of an edgy conference about augmented reality into the world’s first expo dedicated to the augmented world. If you loved ARE, you are going to find Augmented World Expo the most important event of 2013.

AWE 2013 invites today’s technology leaders to learn, network, and share their expertise in technologies that change the way we interact with the world.  Presentations should cover a range of emerging technologies that relate to augmented reality such as gesture interaction, eyewear, wearables, smart things, cloud computing, big data, and 3D printing. Presenters can choose from a set of topics and industries that address the latest trends, strategies and business growth opportunities of the augmented world.

The AWE 2013 open call for proposals covers:

  • The Talks – 30 hours of sessions in 3 tracks: Business, Technology, and Production
  • The Auggies –  a series of Best Demo Awards
  • The ARt Gala – Augmented Reality art displays
  • The Startup Launch Pad – Showcase for innovative startups and competition
  • The Conference Expo – booths and displays on the exhibition floor
  • Augmented Future – Groundbreaking ideas that will change the augmented world

AWE 2013 will be held at the Santa Clara Convention Center, on June 4-5, 2013 and is expected to draw 1,000 attendees. In addition to the presentation tracks, AWE 2013 will include mind-blowing keynotes by industry leaders on the main stage.  For more information and to submit your proposal, visit http://www.AugmentedWorldExpo.com/cfp 

###

Augmented Reality.ORG is a global not-for-profit organization dedicated to advancing augmented reality (AR).

It unites hundreds of companies, and thousands of entrepreneurs, engineers and designers – committed to promoting the true potential of Augmented Reality (AR) – an emerging technology that digitizes interaction with the physical world. Acting as the trusted partner of its members and supporters, AR.ORG facilitates and catalyzes the global & regional transformation of the AR Industry.

AR.ORG also owns and produces the largest international event for augmented reality and the global stage for AR innovation – Augmented World Expo (formerly ARE). All profits from the event are reinvested into AR.ORG’s industry services.

Seems Like an Accident Waiting to Happen

It’s called OutRun, the brainchild of one Garnet Hertz. Although it’s an obvious example of mixed reality, rather than augmented reality, I believe it still falls under the jurisdiction of this blog.

Augmented Reality Metal Slug on Wii U?

If it’s real, then wow, just wow.

But it’s probably a fake, uploaded to YouTube by a user who joined the video-sharing site just yesterday and has managed to upload leaked videos of Mario Galaxy DS and StarCraft 2. The F word at the end of the clip above is also a telling clue. But, an AR fan can be hopeful, no? As a bonus, in case this really turns out to be a fake, here’s how an AR version of Metal Slug should look:

Weekly Linkfest

Quite an interesting batch of links we have today:

It seems that I have never posted here about Greg Tran’s work on augmented architecture, which is a pity. I had the following video open as a tab in my browser for a long time, just waiting to be published, and somehow I forgot about it. Luckily, Yanko Design posted another video of Tran’s work, which served as a necessary reminder:

Mediating Mediums – The Digital 3d from Greg Tran on Vimeo.

Have an excellent week!