Though the cover shooter felt revelatory back in the days of the original Gears of War, it's now a tried-and-tested formula that's started to feel a bit rote. Not only has it become a go-to basis for most third-person action games, but the actual mechanic has barely evolved, and has never really mimicked the reality of ducking gunfire. If an alien grunt or Nazi trooper were gunning for me in real life, I'd hope for a more elegant safety procedure than having to use two sticks and a button on a control pad to turn and move my body towards somewhere I could tuck my squishy head away out of danger.
Ubisoft's new open-world RPG shooter The Division offers an alternative. Teaming up with eye-tracking specialists Tobii and PC hardware manufacturers MSI, you'll be able to take cover across bullet-ridden New York using just your eyes – providing you've got the right gear, of course.
Tobii's partnership with MSI has resulted in the MSI GT72, a souped-up gaming laptop with a sensor array below the screen that projects an imperceptible light pattern across your eyes. This is able to track the position of your pupils, even compensating for depth, and translate that into computing input comparable to moving a mouse cursor with your hands. It's startlingly accurate, verging on witchcraft – even the calibration program (which has you pop tiny onscreen bubbles with your death-stare) impresses.
In The Division, it allows for a cover control system that's intuitively, seamlessly integrated. It is literally as simple as looking at a cover point and pushing the "take cover" button. That's it. No turning your aiming reticle. No running towards a wall, rubbing your knees and chest against it as you bash into it, trying to activate the ducking animation. Take a glance, the game/laptop combo will let you know if any area makes for suitable cover, hit the button and BOOM. Safe.
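To make the mechanic concrete, here's a minimal sketch of how a gaze-plus-button cover system like the one described might work under the hood. All names (`try_take_cover`, `snap_radius` and so on) are my own illustrative assumptions, not Tobii's or Ubisoft's actual code:

```python
# Hypothetical sketch of the gaze-to-cover flow: on the "take cover" button
# press, snap the player to the valid cover point nearest their gaze.

def distance(a, b):
    """Straight-line distance between two 2D points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def try_take_cover(gaze_point, cover_points, snap_radius=1.5):
    """Return the cover point closest to the player's gaze, or None
    if nothing usable falls within the snap radius."""
    candidates = [c for c in cover_points
                  if distance(c, gaze_point) <= snap_radius]
    if not candidates:
        return None  # glance didn't land near any valid cover
    return min(candidates, key=lambda c: distance(c, gaze_point))
```

The snap radius is the interesting design choice: too tight and glances miss their target, too loose and the game second-guesses which wall you meant.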
I was surprised at how quickly the eye-controlled elements of The Division became practically second nature. Though I hadn't mastered the new control scheme in the 30 or so minutes of demo time I had, I definitely missed it every time the Tobii rep turned it off for comparison's sake. It was particularly useful for two reasons – firstly it allowed me to keep the camera in a position that meant I could spot any aggressors while still finding cover, and secondly it literally allowed me to spot usable cover at a glance (handy in The Division's cluttered environments, in which it can sometimes be difficult to predict which surfaces will throw out a symbol indicating that they are bulletproof). In a multiplayer match, I'd also have been able to tag enemies at a glance to co-ordinate team strikes.
It had some aesthetic and time-saving benefits too. The HUD (though relatively inoffensive in The Division by default) could intelligently make itself more transparent until the laptop spotted that my gaze had lingered over a particular element, giving me a clearer view of the battlefield without having to manually switch off the minimap or waypoint markers. In addition, when viewing the fullscreen map, I could just look at where I wanted to place a GPS waypoint and hit the A button to set it, without needing to scroll over to where I intended it to go.
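The gaze-dwell HUD fading described above boils down to a simple threshold. This sketch is purely illustrative – the dwell time and opacity values are assumptions, not figures from the game:

```python
# Illustrative gaze-dwell HUD fade: an element stays mostly transparent
# until the player's gaze has lingered on it past a threshold.

DWELL_THRESHOLD = 0.3  # assumed: seconds of lingering gaze before fade-in

def hud_opacity(dwell_time, base=0.2, full=1.0):
    """Return the opacity for a HUD element given how long the
    gaze has dwelt on it."""
    if dwell_time < DWELL_THRESHOLD:
        return base  # mostly see-through while unattended
    return full      # fully visible once the player actually looks at it
```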
It's lightweight software too – switching on the Tobii features has no noticeable detrimental effect on processor load or frame rates.
There's a whole bunch of other eye-tracking features that will be available to players at launch that I wasn't able to test during the demo – they're covered in the explainer video below:
The Tobii tech has obvious benefits for those with physical disabilities – through a combination of glances, pauses and (where possible) button presses, you can conceivably use the Tobii tech to control any application or game, even if a user's physical movement is limited. Tobii is currently working on having its eye-tracking tech built into Windows at a fundamental level – Microsoft is interested (the MSI laptop is the only one to use both eye-tracking and facial recognition in tandem for Windows Hello unlocking, for starters), but has yet to take the plunge completely. As such, Tobii has built a serviceable but unrefined third-party Windows overlay that lets some elements of the OS be controlled through eye-tracking.
But perhaps the most interesting thing about Tobii's eye-tracking technology sits not with what it allows us to do at the moment, but what it may make possible in the future, particularly when it comes to virtual reality.
Take this quick, interactive example, for instance: look at your immediate surroundings. Let's assume you are at a desk. Take a glance at what's on your desk, and focus on one thing. Now check your movements – did you turn your head to look at what was on your desk? Or just twitch your eyes over to what was interesting? If you were wearing a virtual reality headset to look at your virtual desk, you would have had no choice but to turn your entire head to take in even this micro-glance at the detritus on your table. But with Tobii eye-tracking built into a headset, the view from the VR screens could adjust to match the movements of your eyes. It's a far more natural way to view digital worlds, making them more immersive to inhabit.
The VR applications go beyond input, too, extending to storytelling and the sorts of experiences that can be directed. If a game knows where your gaze lingers, it can play with that information. Imagine a horror game that knows exactly where to hide a monster for the biggest jump scare, or an RPG quest giver that can tell you're not paying attention and react appropriately.
Eye-tracking could also up the fidelity of virtual reality visuals by allowing for selective detail rendering. Though it's a bit reductive to think of the human eye in terms of resolution, you can roughly correlate 20/20 human vision to around 576MP. Even the most advanced PC and virtual reality headset set-ups currently struggle to smoothly render relatively lowly 4K visuals. But eye-tracking technology could be used to create foveated imaging. This replicates how the human eye works – focussing the definition of your field of view on a point of interest or subject, while your peripheral vision blurs out slightly to stop your brain exploding from information overload. With eye-tracking, a headset could identify a focal point, ramp up the detail in that area, and dial it down elsewhere to minimise processor load and maximise visual impact.
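The core of foveated rendering can be sketched in a few lines: render detail falls off with distance from the tracked gaze point. The radii and the detail floor below are invented for illustration, not values from any real headset:

```python
# Minimal foveated-rendering sketch: a 0..1 detail factor that is full
# resolution in the foveal region and drops off in the periphery.

def detail_level(pixel, gaze, fovea_radius=100, falloff=400):
    """Return a detail factor for a pixel: 1.0 at the gaze point,
    falling linearly to a 0.25 floor in the far periphery.
    (fovea_radius and falloff in screen pixels; assumed values.)"""
    d = ((pixel[0] - gaze[0]) ** 2 + (pixel[1] - gaze[1]) ** 2) ** 0.5
    if d <= fovea_radius:
        return 1.0  # full resolution where the eye is actually looking
    # linear falloff outside the fovea, clamped to a minimum detail floor
    return max(0.25, 1.0 - (d - fovea_radius) / falloff)
```

A renderer could feed this factor into its level-of-detail or shading-rate choices, spending the GPU budget where the eye can actually perceive it.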
Here's how that concept could work in motion:
Tobii's representatives told me that the company is in regular contact with all of the major virtual reality headset manufacturers. But in terms of solidified partnerships, the only headset to currently employ Tobii technology is StarVR. Perhaps next generation, then, for Oculus, Vive and PlayStation VR.
For now, Tobii eye-tracking will be a niche luxury for a tiny subset of gamers – those specifically using the MSI GT72, or the few with Tobii's separate eye-tracking accessory, the EyeX. But I – every pun intended – look forward to seeing its future application. As virtual reality moves us further and further away from traditional controllers, eye-tracking this precise will be exactly what we need to interact with virtual worlds with accuracy.
The Division is out on PC, PS4 and Xbox One on March 8th. The MSI GT72 with Tobii eye-tracking built in is out now, as is the EyeX accessory.