The future of the auto industry, it seems, rides on electric motors and self-driving sensors. But the windshields could prove to be just as impressive.
Envisics, a 10-year-old British company, is quietly turning the head-up display into a wonderland of holograms – three-dimensional projected images a la “Help me, Obi-Wan” or a posthumous Tupac performance at Coachella. Instead of a simple speedometer and some crude navigation arrows, Envisics promises a layer of augmented reality stretching from the hood to the horizon.
At first, it will tell a driver more clearly where to go; once the self-driving robots take over, it may offer Pokémon-style games or your Instagram feed. Zoom calls are impressive; wait until you take one at the wheel and you’re talking to a hologram.
Founder and CEO Jamieson Christmas, who has a Ph.D. in holography from Cambridge University, says the venture has been a lifelong dream that started with a Star Wars fascination and progressed to a seminal deal with Jaguar Land Rover. Most recently, Envisics landed a US$50 million investment round led by General Motors’ venture capital unit and Hyundai Mobis, a giant South Korean tech supplier. We caught up with Christmas (via standard 2D Zoom) to learn more about the technology and when we can expect to see it on the road. (The interview has been edited.)
Science fiction tells us they’re possible; so I wondered, why aren’t they happening? It’s because it’s really hard, that’s the simple answer. Light doesn’t work the way science fiction tells us it does. But we were able to make some really significant breakthroughs. We designed a holographic modulator that enabled us to electronically change the speed of light and that’s really one of the core elements, along with mathematics, that makes what we do possible.
We realised very quickly that a very good place to make this happen was in the field of automotive head-up displays. We designed a head-up display that worked in seven Jaguar Land Rover vehicles. Basically, we put our heads down for four years, and the first unit shipped in February 2015.
What excited carmakers about the technology?
Active safety was a major thing, the ability to really orchestrate what the driver should be looking at, what hazards there were in an urban environment. But other really significant use-cases were things like navigation, the ability to take navigation as we experience it today, where you look at a map and look at the world around you and try to link that information together. The ability to actually overlay that information upon reality – to put the arrows on the place where you’re supposed to turn – is incredibly compelling.
In January 2019, we revealed our first augmented reality head-up display at CES. It demonstrated the ability not only to deliver a traditional display experience but, on top of that, to paint information on three lanes of a motorway from about 20 metres in front of the car all the way out to the horizon. Almost every OEM on the planet came to visit us. It was the springboard for the company, the realisation that this was real and tangible.
Are autonomous systems a positive for you or a threat?
For level 2 and level 3, which require you to pay attention, there’s a really compelling use of augmented reality. The car can visually tell you what it’s aware of, what it’s doing and why it’s doing it. That makes it much more comfortable, and it also means the car can [shift] responsibility for actually driving from itself to you almost instantaneously. Whereas if you’re only reacting to prompts, you don’t have the information you need to make informed decisions.
With level 4 and level 5, you are very much a passenger. In which case, the ability to create future revenue streams for the car companies is a really interesting opportunity – adverts, or useful information about what’s going on in the world around you. The ability to take your everyday interaction with your cell phone and overlay it on reality. Those things all become possible.
What’s the hardest part in developing this?
The car gives us information that we then overlay on the world in three dimensions. Generally, holography is incredibly complex. The number of calculations we perform every second is astronomical. We are manipulating the speed of light hundreds of times per second. The material science involved in that, the custom silicon devices in that, the custom processing architecture in that – they’re all problems that we have taken and addressed and resolved. Our first production units will be available from 2023 onwards. We are engaged with numerous car companies worldwide at different levels of discussion.
I imagine there’s a gap between what you can do and what economically will be profitable. How big is that gap and how quickly does it close?
Our initial target is the more premium part of the market and that’s what we’ll launch in 2023, but we fully expect through advancements in our technology that we’ll be able to unlock all of the automotive market. We work with each [carmaker] individually to understand their vision of the future of an in-car experience and we help them to realise that using our technology.
What is it about your technology that people might not think about?
The automotive world is undergoing a revolution and most people assume that revolution is just the powertrain and the move to autonomous systems, but it’s not. If you think about what the future of cars looks like, we are invariably moving towards a world where there’s going to be consolidation of the market, there’s likely to be homogenisation of the car designs to maximise range and efficiency, and it’s very likely that one of the most distinguishing features that is going to help people decide which car to buy is going to be the in-car experience. Platforms such as ours are going to differentiate in that respect. The ability to deliver a truly personalised driving experience, that’s really what’s on the horizon. – Bloomberg