By Niklas Björkén, Director Innovations, Accedo
According to a report by Global Market Insights, Inc., the smart lighting market is set to be worth more than a staggering US$24 billion by 2024. We are also seeing a trend towards linking those smart lights to what is happening on the TV screen.
Traditional light-syncing systems read everything on screen and reproduce the TV's colour scheme in the living-room lights. I don't know about you, but I personally find it really distracting when the lights change every time my screen colour does. If it were done more selectively, so that the lights only changed during genuinely intense moments where they add to the general ambience, it could be really attractive.
That is precisely what the innovations team at Accedo set out to achieve, and we realised that Artificial Intelligence (AI) had the potential to make it happen. We knew from the outset that it should not be about mimicking Philips Ambilight. The idea is to identify places where the lights can enhance and extend what happens in the video: explosions, thunder and lightning, sudden bright lights such as car headlights, or other sudden and drastic changes in light and colour conditions that would benefit from having smart lights extend the effect off-screen.
With this project we set out to create a web service that produces instructions, or light cues, for smart lights by identifying specific types of scene in a video. We then needed to define a format for the light cues and a delivery mechanism for a video player to consume them. Finally, we would need a video player plugin that interprets the light cues, connects to a smart light hub, and pushes the instructions to the smart lights in sync with the video.
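To make the player-side pieces concrete, here is a minimal sketch of what a light-cues format and interpreter could look like. The article does not publish Accedo's actual schema, so the JSON layout, field names, and the `load_cues`/`due_cues` helpers below are all assumptions for illustration.

```python
import json

# Hypothetical cues file: a list of timed cues, each with a start time in
# seconds, a duration, an RGB colour, and a brightness level. This is an
# assumed format, not Accedo's published one.
cues_json = """
[
  {"time": 12.4, "duration": 0.8, "color": [255, 200, 80], "brightness": 0.9},
  {"time": 47.1, "duration": 1.5, "color": [180, 190, 255], "brightness": 1.0}
]
"""

def load_cues(raw):
    """Parse a cues file and sort cues by their start time."""
    return sorted(json.loads(raw), key=lambda c: c["time"])

def due_cues(cues, last_position, position):
    """Return cues whose start time falls inside the playback window
    (last_position, position] -- a player plugin would poll this on each
    time update and forward the results to the smart light hub."""
    return [c for c in cues if last_position < c["time"] <= position]

cues = load_cues(cues_json)
print(due_cues(cues, 12.0, 13.0))  # the 12.4s cue fires in this window
```

In a real plugin the polling loop would be driven by the player's time-update events, and pushing the cue to the hub would replace the `print`.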
We tried a number of different methods, but the one that worked best involved training Google's Inception-v3 image recognition model with extra labels to recognize extremes of light, using images taken from a variety of trailers. Detections with more than 60% confidence are then fed through the Google Cloud Vision API to extract the dominant colours, and this information is used to generate the final cues file sent to the smart lights hub.
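The thresholding step can be sketched as follows. This assumes a list of (timestamp, label, confidence) detections already produced by the retrained Inception-v3 model, and stubs the dominant-colour extraction (the Google Cloud Vision image-properties call) with a fixed colour so the logic runs standalone; `generate_cues` and `CONFIDENCE_THRESHOLD` are illustrative names, not Accedo's code.

```python
CONFIDENCE_THRESHOLD = 0.60  # only strong detections become light cues

def generate_cues(detections, colour_for):
    """Keep detections above the confidence threshold and attach the
    dominant colour for that moment in the video."""
    return [
        {"time": t, "label": label, "colour": colour_for(t)}
        for t, label, confidence in detections
        if confidence > CONFIDENCE_THRESHOLD
    ]

# Toy detections: an explosion at 3.2s (0.87), a weak candidate at 9.0s
# (0.41, discarded by the threshold), and lightning at 15.6s (0.72).
detections = [(3.2, "explosion", 0.87),
              (9.0, "explosion", 0.41),
              (15.6, "lightning", 0.72)]
cues = generate_cues(detections, lambda t: (255, 255, 255))
print([c["time"] for c in cues])  # -> [3.2, 15.6]
```

Filtering at the model-confidence stage, before any colour extraction, is what keeps the lights quiet during ordinary scenes and reserves them for the dramatic moments described above.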
The result is a really cool demo showing just what is possible. There is already demand for smart lights that react to video, but making them react only where it adds to the experience means viewers won't be annoyed by constantly flickering lights. And it only scratches the surface of what is possible when it comes to enhancing the video experience with AI.
Come and visit us at NAB to find out more.