
Designing Augmented Reality

José Somolinos

Senior UX Product Manager

April 9, 2018


It couldn't be just any sport either, but something that requires context to understand and follow the action. If you have watched cycling races such as the Tour de France, Vuelta a España or Giro d'Italia, then you know what I am talking about. The TV broadcast isn't always enough to convey where the riders are on the course, so we wanted to create a way to enhance that experience using Augmented Reality (AR) and the second screen.

Second screen applications in AR

There are of course already dozens of applications for following a wide range of sports, including Formula 1, rally driving, basketball and football. A great deal of investment is going into enhancing the customer experience around sports.

However, second screen AR apps have the potential to be a game changer here. Firstly, because there is very little difference between augmented reality devices and a second screen on a smartphone or a tablet, mostly because the consumer augmented reality devices of today are smartphones and tablets! AR glasses like Magic Leap One will mark a clearer separation, with the UI suddenly happening in the living room itself. Secondly, because a second screen AR app is more visual, more natural and more engaging: realistic 3D models that viewers perceive in three dimensions change the experience entirely.

Designing for Augmented Reality is easier than for Virtual Reality

We all know that AR and VR have their similarities, but there are a few differences too.

Firstly, in VR there is a great deal of work involved in providing context and presence. Not only do you have to create the screen, the navigation and the video controls, you also need to carefully consider the exact environment the user will experience, along with associated factors such as lighting conditions.

In augmented reality, the context is already provided by the viewer's own environment. This means we can focus on designing the interactions, with the option of applying filters and effects to the real world to enhance the scene should we choose. This is even easier for Mixed Reality, as you don't need any awareness of the real-world context in those scenarios.

Information in AR or HUD

This type of experience contains a huge volume of information: the map with the locations of the riders, metadata about each rider, and video from helicopters, for example. Rather than putting all of that in 3D panels around the user, we can place relevant information in the Heads-Up Display (HUD), a virtual screen that is always set in front of the user's eyes. Seen through tablets and smartphones, this UI appears stuck to the frame of the device; once we have AR glasses, the virtual screen will float in front of the user's eyes. HUD and AR are sometimes seen as opposing concepts, but I believe they complement each other, and knowing when to use each of them will help to create more vibrant AR applications.
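
To make the distinction concrete, here is a minimal sketch (not the article's actual code) assuming an ARKit/SceneKit app on iOS: the rider list lives in ordinary screen-space UI, which plays the role of the HUD, while a flag is a node anchored in the 3D scene. Names such as RaceViewController and riderListLabel are invented for illustration.

```swift
import ARKit
import UIKit

final class RaceViewController: UIViewController {
    let sceneView = ARSCNView()          // camera view plus 3D content
    let riderListLabel = UILabel()       // hypothetical rider-list widget

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // HUD: plain screen-space UI layered over the camera view.
        // It stays "stuck" to the frame of the device, independent of the 3D scene.
        riderListLabel.frame = CGRect(x: 16, y: 40, width: 240, height: 200)
        riderListLabel.numberOfLines = 0
        view.addSubview(riderListLabel)

        // AR: content attached to the 3D scene itself, e.g. a flag placed on
        // the race map (session configuration and anchoring omitted here).
        let flagNode = SCNNode(geometry: SCNPlane(width: 0.1, height: 0.06))
        flagNode.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(flagNode)
    }
}
```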

When delivering information in these types of UI, ensuring readability is key. We read better when text is close to us, yet a long list of information is not ideal for visibility and interaction within the AR view, and the scrolling it requires is not yet precise in the AR frameworks available today. With limitations on both sides, we decided to display some information, such as the list of riders, in the HUD view. Other panels are better placed in the context of the 3D model: in our case, the flags marking the location of the different groups during the race. These flags contain very few pieces of information and use clear typography at a large font size. The panels are always oriented towards the viewer, both vertically and horizontally, making the information accessible from any position. They do a "funny" dance, but the advantage in readability is unquestionable.
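
For those curious how such viewer-facing flags might be built, here is a rough SceneKit sketch using a billboard constraint; the function name, font and sizes are assumptions rather than the prototype's actual values.

```swift
import SceneKit
import UIKit

// Builds a flag panel that always turns to face the viewer.
func makeGroupFlag(text: String) -> SCNNode {
    // Few pieces of information, clear typography, large font size.
    let textGeometry = SCNText(string: text, extrusionDepth: 0.1)
    textGeometry.font = UIFont.boldSystemFont(ofSize: 14)

    let flagNode = SCNNode(geometry: textGeometry)
    flagNode.scale = SCNVector3(0.005, 0.005, 0.005)

    // Billboard constraint: keep the panel oriented towards the camera on
    // every axis, so it stays readable from any position around the model.
    // (This is what produces the "funny" dance mentioned above.)
    let billboard = SCNBillboardConstraint()
    billboard.freeAxes = .all
    flagNode.constraints = [billboard]
    return flagNode
}
```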

A new native interaction for AR: proximity interaction

AR and VR, like any new class of device, bring new ways of interacting. Raycasting and grabbing are two of the most common ways to interact in VR, and I believe a "proximity interaction" will be just as present in AR.

In our case, we wanted to show the video broadcast from a helicopter. First, we placed the helicopter model in the approximate position of the real vehicle during the race.

We could have opened the video simply by tapping on the helicopter, which for us was the simplest way to perform this action. However, we thought that a more "natural" way to show this video was to get closer to the helicopter. To make this feature more discoverable, we added a subtle helicopter sound that increases in volume as you approach, guiding the user towards the helicopter.
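
As a rough sketch of how such a proximity interaction could work in ARKit/SceneKit (not the prototype's actual code): on every rendered frame, the distance between the device camera and the helicopter node drives the rotor volume and, below a threshold, opens the video. The callback names and distance values are assumptions.

```swift
import ARKit
import SceneKit
import simd

// Call this once per frame, e.g. from SCNSceneRendererDelegate's
// renderer(_:updateAtTime:). "setRotorVolume" and "presentHelicopterVideo"
// are hypothetical callbacks, not part of the article's prototype.
func updateHelicopterProximity(sceneView: ARSCNView,
                               helicopterNode: SCNNode,
                               setRotorVolume: (Float) -> Void,
                               presentHelicopterVideo: () -> Void) {
    // The point of view is the node tracking the device camera.
    guard let cameraNode = sceneView.pointOfView else { return }

    // Distance between the viewer (device camera) and the helicopter model.
    let distance = simd_distance(cameraNode.simdWorldPosition,
                                 helicopterNode.simdWorldPosition)

    // The rotor sound grows louder as the viewer approaches,
    // hinting that there is something to discover.
    let audibleRange: Float = 3.0                      // metres, an assumed value
    setRotorVolume(max(0, 1 - distance / audibleRange))

    // Close enough: open the helicopter's video broadcast.
    if distance < 0.3 {
        presentHelicopterVideo()
    }
}
```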

At the beginning of something

AR design is still new, which means we are only beginning to test what it can do, as well as what it should do to enhance the experience rather than adding technology for technology's sake. As we move further into this era, stay tuned to see how it progresses.

If you want a glimpse of how AR will transform the video experience, come and check out our prototype at BroadcastAsia.

