Intel's VR Keynote Was Actually Pretty Cool

By Gizmodo Australia

Press conferences are typically a dull affair filled with speeches. Until Intel took (mostly) everyone into a room at CES 2017 and told them to put on an Oculus Rift headset.

The virtual reality keynote was an opportunity for Intel to showcase three areas in which they expect virtual reality to become increasingly prevalent: sports and live entertainment, tourism and, of course, video games. And that meant telling everyone - or at least the press and industry folk who lined up early enough to get inside the conference - to pop on the Oculus Rift in front of them.

As far as Intel is concerned, VR can play a major role in three things: travel, work and play. The travel pitch was pretty simple: imagine you could look and move around your next holiday spot, the hotel room you're going to book, the monument you'd like to go visit. So to demonstrate, Intel fired up a VR experience of flying over the desert, complete with jumping out of a helicopter while wearing a wingsuit.

The people in the conference weren't wearing wingsuits, but it was pretty good to see a few queasy reactions.

The next part featured the co-founder of Hype VR, Ted Schilowitz. While video has enjoyed jumps in fidelity, resolution and quality over the years, it's still predominantly a moving image shot from a single camera, a single perspective.

Hype VR and Intel's answer was a high-def video of a waterfall in Vietnam, viewed through the Rift. The kicker with Hype VR's technology was the ability to move around the physical space in the video, without distorting the perspective or point of view.

That theme recurred throughout the conference: being able to control the perspective, to become the cameraman. Perhaps the strongest theme of the keynote was Intel's alignment with real-world sports, both through a deal with La Liga to introduce their 360-degree video technology to three stadiums, and a live broadcast of a college basketball match.

The trick with the live basketball match in VR was the ability to look around and choose different camera perspectives. It wasn't a free-roaming camera like one would hope, and when Intel first cut to the game it was half-time (leaving viewers with bugger all to see beyond warmups). The live feed - which took about 40 seconds to beam simultaneously to all the headsets in the conference - was truly live courtesy of a platform called Voke.

Voke's currently available for the Samsung Gear VR, and is scheduled for release for the Oculus Rift at the end of the year. Intel CEO Brian Krzanich added that the content available won't just be limited to sports, but music and other forms of live entertainment as well. It's hard not to see it as a future route for pay-per-view content, especially when broadcasters have the ability to overlay statistics and information throughout.

As far as work is concerned, Krzanich pitched the idea that virtual reality could be a way to reduce the danger associated with hazardous jobs. Intel then streamed 4K, 360-degree live video from a drone inspecting solar panels, a job that would typically be done by a human. While it wasn't the smoothest feed, it was functional. The idea is that VR could be used in these situations to scope out hazardous work, or for use in search and rescue missions where conditions make it too problematic for a helicopter and crew to investigate.

Intel also showed off the work they've been doing on Project Alloy, their open-source HMD first announced at the Intel Developers Conference last year. All of the sensors and processing are built into the headset itself, using Intel's RealSense technology and a Kaby Lake Core i7 CPU. There's no word on how it matches up to a Rift or Vive, which rely on far beefier external hardware - but Intel said they would begin productising Project Alloy by the end of the year.

Apart from the promise of a VR headset that doesn't have wires and is hooked into something with more processing power than your smartphone, Intel also pulled off a neat trick with "mixed reality". It's basically a fancy term for scanning the space around you and incorporating that environment into the headset, which gets around VR's current problem of not having enough space in your living room to move around.

As an example, the stage opened up and showcased two people wearing Alloy prototypes in a faux living room. The sofas were transformed into bunkers that could be used for cover in-game, which is wonderful encouragement to leave large objects lying around every now and again.

Nobody in the crowd got to go hands-on with Project Alloy, but that was beside the point. The demo, much like the whole keynote, showcased ways in which VR might actually have a role in our day to day lives.

The technology behind VR has come an awful long way. But from a gaming perspective, developers are very much still figuring things out. Many titles are little more than rail shooters and brief experiments lasting a few hours at most.

It's still very disposable. But being able to watch live sport, or a concert, or the live feed of a drone, isn't. It's something we already do every day, and will continue to do going forward.

The conference helped give VR a place in our lives that made sense, and that was the cool part. The technology is almost there, but what we don't have yet is purpose. It probably still won't happen by the time 2017 comes to a close, especially when you factor in the capabilities of Australian internet.


But it's not far off.
