Designing Augmented Reality Part 2

By Niklas Björkén, Director Innovations, Accedo

Something I get asked a lot is: "What sets augmented reality apart from virtual reality?" There are many definitions of AR, but the general idea is that you take what you see in the real world and add to or modify that reality with real-time computer graphics.

AR is different from VR in that you still see the real world around you, whereas in VR the real world is blacked out and you are presented with a world that is fully computer generated.

The two major platforms for AR development today are Apple’s ARKit and Google’s ARCore. These frameworks use the camera and other sensors in smartphones and tablets to provide information about the real world around the user, allowing developers to create and place virtual objects realistically in the user’s real-life environment.
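As an illustrative sketch (not from the article), the snippet below shows roughly what this looks like with ARKit on iOS: the session is configured to detect horizontal surfaces, and a simple virtual box is attached to the first plane ARKit reports. The class and outlet names are assumptions made for the example.

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: detect horizontal planes and place a virtual box on them.
class ARPlaneViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]   // ask ARKit to detect flat surfaces
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit anchors new content to the real world,
    // e.g. when a horizontal plane is detected.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        node.addChildNode(box)   // the virtual object now sits on the detected surface
    }
}
```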

ARKit and ARCore require us to show our AR apps through a handheld screen. Obviously the end goal for AR is not to be viewed through a small screen like this, but rather to be ubiquitous and part of your natural visual field. However, devices such as glasses/goggles that can achieve this are not really on the market yet, so while we wait for the hardware to mature, the smartphone gives us a chance to experiment with AR and offers a small window into the future of this technology.

The way we work with ARKit and ARCore at Accedo is true to our nature as a cross-platform UX company. We’ve constructed a custom, proprietary bridge between ARKit and ARCore which allows us to develop applications that run seamlessly on both platforms. We add an abstraction layer for things such as surface/plane detection, input methods and other platform specifics, so that our developers can work from a unified source when deploying apps to both Apple and Android AR devices.
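To give an idea of what such an abstraction layer might look like (the protocol and type names below are hypothetical, not Accedo’s actual bridge), app code could depend on a small platform-neutral interface, with one implementation wrapping ARKit on iOS and another wrapping ARCore on Android:

```swift
import Foundation
import CoreGraphics
import simd

// Hypothetical sketch of a platform-neutral AR abstraction; the names are
// illustrative assumptions, not Accedo's actual bridge. Application code
// depends only on this contract, while platform-specific implementations
// wrap ARKit (iOS) or ARCore (Android) behind it.

struct DetectedSurface {
    let id: UUID
    let center: simd_float3   // world-space position of the surface
    let extent: simd_float2   // approximate width and depth in metres
}

protocol ARSurfaceTracker: AnyObject {
    /// Callback invoked whenever the underlying framework detects a new surface/plane.
    var onSurfaceDetected: ((DetectedSurface) -> Void)? { get set }

    /// Start the underlying AR session (ARKit or ARCore).
    func start()

    /// Map a screen tap to a detected surface, if the tap hits one.
    func hitTest(screenPoint: CGPoint) -> DetectedSurface?
}
```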

In addition to Apple and Google, the secretive company MagicLeap has been working on an AR project for years now, taking in billions of dollars in funding. Their AR glasses, expected to launch later this year, are widely seen as the start of the consumer AR device market. The glasses may not look great, and they probably won’t be everything we would want them to be when they first launch, but they could allow us to take a first step from using the smartphone as a window into the AR world towards a more immersive AR experience. This is the kind of device we should imagine for our users when we design an AR experience.

MagicLeap’s secret sauce is what they call “Digital Lightfield” technology, a mix of software, hardware and optics that generates lifelike digital objects that blend seamlessly with the real world. Using their wearable AR glasses, users will experience objects at different depths of field, tricking the brain into perceiving them as real-world objects. While MagicLeap’s AR glasses are not yet available, an early version of their software SDK is. It allows us to get an understanding of the MagicLeap AR universe, and how it differs from those of Apple and Google.

The MagicLeap hardware is wearable, as opposed to the phones and tablets of ARKit and ARCore, meaning there’s a lot more at play to make it a compelling user experience. MagicLeap’s SDK hints at features like eye tracking and gesture recognition, capabilities that will be necessary to make the AR experience truly immersive. So it shouldn’t be long before we can stop peering into the world of AR through a rectangular piece of glass, and instead enjoy it as an immersive, lifelike experience. In my mind, that will be the true game changer for AR.

If you want a glimpse of how AR will transform the video experience, come and check out our prototype at BroadcastAsia.

 
