Self-Driving Cars Will Be Looking at You as Much as the Road

By Jamie Condliffe

Self-driving cars spend a lot of time looking at their surroundings to know how they should respond to the road. But autonomous cars will likely spend some time looking at you to work out how they should behave, too.

A team of researchers from Cornell University has been developing a new system that trains a camera on the driver to understand what they're up to, predicting their actions around three seconds before they happen. In the first instance, the system could supplement the driver aids that already appear on cars, but in the future it could help train autonomous cars by providing them with a rich seam of extra data.

The team has built a recurrent neural network that takes in several streams of data to predict what the person behind the wheel might do next. It combines measurements of the car's speed and GPS position with the orientation of the driver's head to estimate the probability that the car will change lanes in the next few seconds.
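For a rough flavour of how that might look in code, here's a minimal sketch: an LSTM consumes one fused feature vector per video frame and scores each manoeuvre class. This isn't the paper's actual architecture, and every name, dimension, and feature choice here is illustrative.

```python
import torch
import torch.nn as nn

# Illustrative label set; the paper distinguishes lane changes,
# turns, and driving straight.
MANOEUVRES = ["left lane change", "right lane change",
              "left turn", "right turn", "driving straight"]

class ManoeuvrePredictor(nn.Module):
    def __init__(self, feature_dim=16, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, len(MANOEUVRES))

    def forward(self, features):
        # features: (batch, time, feature_dim) -- one fused vector per
        # frame: speed, GPS-derived road context, driver head pose
        out, _ = self.lstm(features)
        # Score every timestep, so a prediction is available
        # continuously rather than only after the event.
        return self.head(out)

model = ManoeuvrePredictor()
window = torch.randn(1, 90, 16)               # roughly 3 seconds at 30fps
probs = torch.softmax(model(window), dim=-1)
print(probs[0, -1])                           # current manoeuvre probabilities
```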

After collecting data from 10 different drivers who covered 1,000 miles of freeway over two months, the team annotated the footage in order to train their neural network. Once it had chewed through 700 events (300 lane changes, 130 turns, and almost 300 randomly chosen stretches of driving straight), it was let loose on live data to guess what might happen next. The team write in a paper published on arXiv that the software “can anticipate [manoeuvres] 3.5 seconds before they occur in realtime”.
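Turning those annotated events into a trained model could look something like the sketch below, which reuses the hypothetical ManoeuvrePredictor from earlier. The windowing and supervision details here are assumptions for illustration, not taken from the paper.

```python
import torch
import torch.nn as nn

def train_epoch(model, events, optimiser):
    """events: list of (features, label) pairs -- features is a
    (time, feature_dim) tensor for one annotated clip, and label
    indexes MANOEUVRES from the sketch above."""
    loss_fn = nn.CrossEntropyLoss()
    for features, label in events:
        optimiser.zero_grad()
        logits = model(features.unsqueeze(0))     # (1, time, classes)
        # Supervise the final frame of the window: the moment just
        # before the annotated manoeuvre begins.
        loss = loss_fn(logits[:, -1], torch.tensor([label]))
        loss.backward()
        optimiser.step()
```

On live data, the same model would presumably run over a sliding window of recent frames, flagging a manoeuvre whenever its probability climbs above some threshold.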

That’s pretty impressive, though lane changes on freeways are probably among the easiest manoeuvres to predict. It will be interesting to see how the system fares in current and forthcoming cars, and whether it can be tweaked to anticipate the subtler manoeuvres of busy city streets, too. [arXiv via arXiv Blog]

Image by HomeArt/Shutterstock