CES 2019: Nissan Unveils its Invisible-to-Visible Technology Concept

By Rob Clymo

Car tech is everywhere at CES these days. Alongside the robots, giant TVs and depressingly vast mountains of pointless plastic phone cases you’ll find lots of four-wheeled innovation on offer. Japanese car-making giant Nissan, for example, has used the event to reveal its latest party trick – the impressive-sounding Invisible-to-Visible, or I2V.

However, rather than being a headline-grabbing novelty item, we found that I2V is a very cool idea that could make motoring not only smarter, but safer too. According to Tetsuro Ueda, from the Nissan Research Centre, Invisible-to-Visible technology is effectively the interface that merges the data world and the real world. “It brings a new level of connectivity that connects cars to the metaverse,” he says.

Breaking that down into layman’s terms, the I2V technology calls upon augmented reality to keep an eye on what’s going on as you drive along. It works for autonomous vehicles too, with the objective on that front being to soothe your senses and offer reassurance as you head down the highway while the technology, rather than your hands, turns the wheel and operates the controls.

Nissan had the system on show at CES as an interactive immersion experience at its booth, so we gave it a try. You don a pair of augmented-reality goggles and slip into a demonstration cockpit fitted with three-dimensional interfaces and displays. Once you’re in and sitting comfortably, I2V takes you on a guided tour of a city.

The platform has been developed by Unity Technologies, which has its roots firmly planted in the gaming marketplace. That connection becomes obvious as soon as you try the demo because it’s just like being dropped headlong inside a video game.

Along the way there are examples of everyday driving scenarios, such as searching for a parking space in a packed shopping mall. More impressively, you get to see through buildings and around blind corners. Of course, in a simulator there’s no risk of anything bad happening, but the vast amounts of data coming from the Metaverse leave you feeling pretty confident this could work in the real world too.

The Metaverse, by the way, is effectively a virtual cloud-based environment that not only lets you benefit from data shared by other users but also lets you return the favour and warn them if there’s a potential traffic irritation on the horizon. The whole thing revolves around a system called Omni-Sensing, which presents the collected data as a series of graphics. Even people can appear as avatars inside your car. Again, being dropped into a video game springs to mind as you cruise on down the highway.
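To make that two-way sharing idea a little more concrete, here’s a minimal Python sketch. All of the class and method names are hypothetical illustrations rather than anything from Nissan’s actual system; the point is simply how one driver could publish a hazard report to a shared cloud feed and another could pull the reports near their own location.

```python
# A minimal sketch (hypothetical names, not Nissan's API) of the two-way sharing
# the article describes: one car publishes a hazard report to the cloud, and
# another car pulls the reports filed near its current position.
from dataclasses import dataclass


@dataclass
class HazardReport:
    lat: float
    lon: float
    kind: str          # e.g. "congestion", "blocked lane"
    reported_by: str   # anonymised vehicle id


class CloudHazardFeed:
    """Stand-in for a cloud-based 'Metaverse' service shared by many drivers."""

    def __init__(self):
        self._reports: list[HazardReport] = []

    def publish(self, report: HazardReport) -> None:
        # Tell other users about a traffic irritation on the horizon.
        self._reports.append(report)

    def nearby(self, lat: float, lon: float, radius_deg: float = 0.05) -> list[HazardReport]:
        # Return reports close to the requesting car (crude bounding-box check).
        return [r for r in self._reports
                if abs(r.lat - lat) < radius_deg and abs(r.lon - lon) < radius_deg]


feed = CloudHazardFeed()
feed.publish(HazardReport(36.17, -115.14, "congestion", "car-042"))  # another driver shares
print(feed.nearby(36.18, -115.15))                                   # our car benefits
```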

The system works by using not only information gathered by sensors inside and outside the car but also data from the cloud. More importantly, this allows I2V to give you a better idea of what’s around the corner, quite literally, by highlighting things that are out of sight behind the surrounding landscape, buildings or other obstructions blocking your view of the road ahead.
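As a rough illustration of that fusion step, the snippet below sketches, in hypothetical Python rather than anything resembling Nissan’s production Omni-Sensing code, how detections from the car’s own sensors could be merged with cloud-shared reports and filtered down to the obscured items worth highlighting on an augmented-reality display.

```python
# A rough sketch (hypothetical data structures, not the production system) of the
# fusion the article describes: combine what the car's own sensors can see with
# cloud-shared data, and flag the items hidden from the driver so the AR display
# can highlight them.
from dataclasses import dataclass


@dataclass
class DetectedObject:
    label: str        # "pedestrian", "stopped car", ...
    distance_m: float
    occluded: bool    # True if a building or blind corner hides it from the driver
    source: str       # "onboard" or "cloud"


def overlay_candidates(onboard, cloud, max_range_m=250.0):
    """Merge sensor and cloud detections, keep the ones worth drawing as AR graphics."""
    merged = list(onboard) + list(cloud)
    # Highlight anything the driver cannot see directly but that is close enough to matter.
    return [obj for obj in merged if obj.occluded and obj.distance_m <= max_range_m]


onboard = [DetectedObject("cyclist", 40.0, False, "onboard")]
cloud = [DetectedObject("stopped car", 120.0, True, "cloud"),   # around a blind corner
         DetectedObject("congestion", 900.0, True, "cloud")]    # too far away to overlay yet
for obj in overlay_candidates(onboard, cloud):
    print(f"Highlight {obj.label} {obj.distance_m:.0f} m ahead (reported via {obj.source})")
```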

Of course, that’s all fine and dandy in a demonstration, but it’ll be more interesting to see how this latest innovation fits into the real world. Just where, and when, will it be seen in a production vehicle? Watch this space, it seems.

The idea might well have legs, and there’s an additional plus point that comes with the concept. As well as offering obvious benefits on the safety front, I2V is designed to offer passengers in autonomous vehicles a much more pleasing travel experience. For example, Nissan wants to brighten your autonomous motoring day by projecting scenery from a sunny day inside the car when outside it might be cloudy, grey and pouring with rain.

While that might not seem necessary in a location like Las Vegas with its big skies and seemingly endless sunshine, for anyone living in the UK it might make I2V seem like a very good idea indeed.