One of the clear automotive technology trends at CES this year is cars that drive themselves. From Audi to Lexus to Ford, the world's largest carmakers are beginning to follow Google's lead in an effort to produce vehicles smart enough to handle the road on their own.
The thinking is that autonomous cars will reduce the number of traffic deaths while simultaneously freeing car owners to do more productive things on their trips, like work or read. All of this sounds magical, especially to a traffic-jammed Angeleno like me, but let's get real: How soon do we actually think legislators are going to cotton to the idea of robot cars all over the roads?
To be sure, self-driving consumer cars will initially be prohibitively expensive for the vast majority of drivers (by some guesses, costlier than a Ferrari), but if that price point comes down, expect them to be as prevalent as hybrids are today. The one thing standing in the way of that prevalence, of course, could be legal hang-ups.
Optimists will tell you that robot cars have already breezed into street-legality in Nevada, Florida, and California. And that's true — but in each case they did so conditionally, with a whole lot of uncertainty still lingering over their futures. For instance, in all three states, a self-driving car must, at all times, be operated by an in-car driver, but how vigilant that driver must be is still a gray area. Under Nevada's law, anyone operating a "driverless" car is, unlike regular drivers, allowed to text. They are not allowed to drink alcohol, however, meaning that Nevada thinks self-driving cars should give people the freedom to not pay attention — but only up to a point.
Further complicating things is the question of who will be at fault in the likely event that autonomous cars are imperfect and get into accidents. Say a self-driving car on its way to pick up its owner were to blow through a red light — who would pay the ticket for that violation? The owner? The car manufacturer? The people who wrote the navigation software? These questions get even thornier as the potential problems get more harmful: Who is liable if an autonomous car rear-ends a standard car with a driver? Will the robot always be implicated? Worse still, what if a malfunctioning robot car veers momentarily onto a sidewalk and kills a child? Who will pay for that tragedy?
Unfortunately, despite the many serious questions about what sort of laws will eventually govern autonomous cars, some politicians refuse to even acknowledge that such issues exist. In October of last year, when California Governor Jerry Brown signed the bill paving the way for legal self-driving cars in California, a reporter asked who would be held responsible if a robot car did run a red light. The governor dismissed the question as trivial. "I don't know — whoever owns the car, I would think. But we will work that out," he said. "That will be the easiest thing to work out."
I don't know what kind of government bureaucracy Governor Brown is used to, but from what I've seen of the political system, very few policy questions are "easy" anymore, especially not ones involving emerging technologies and public safety. To be sure, cars that can drive themselves will be amazing, and I look forward to one day programming a Prius to come pick me up from the airport or a bar at last call. But it's probably best to hold off on celebrating the coming fleets of robot cars until we have a serious conversation about the rules and regulations that will almost surely slow their public adoption, if not halt it entirely for years.