From A to B: Why It's Such a Bumpy Road to Driverless Cars

By Rich Wordsworth

In an ironic sort of way, it sometimes feels like we’re cruising toward driverless cars on auto-pilot.

Science fiction is good at imagining the effects of self-piloting cars on a small, personal scale. Wouldn’t it be great to be shuttled to work in your driverless car while watching TV on your laptop (or home from the pub at 1am belting out karaoke highlights)? Of course it would. But oh, wait, what will happen to the cab drivers? What happens if my driverless car goes wrong and crashes into your house? Do I buy flowers if my driverless car kills somebody, or just a new front bumper?

But questions like these are just the tip of a gargantuan legal, ethical and practical iceberg. Driverless cars will up-end whole industries (and not just transport); they will present new opportunities for employment in some areas while causing mass unemployment in others; they will drastically change whole nations’ relationships with the road. Before we strap ourselves - or our children - into anything that weighs a tonne and propels itself around without our hand on the wheel, these are the sorts of issues we need to debate - now, and publicly.

Dude, where’s my (driverless) car?

First, we need to address perhaps the biggest misconception about our incoming robotic chauffeurs: for all the hype, you may never actually own a driverless car at all.

“I think the main change that will happen in the future is that we’ll have fleets of driverless cars that basically provide our mobility,” says Dr. Alexander Hars, author of multiple driverless car papers and founder of Driverless Future, which tracks breakthroughs in the industry. “Most people won’t have their own cars anymore. We’ll have seamless mobility wherever we are, wherever we want to go. The whole way of getting around will change very significantly.”

Hars’ idea is that, basically, driverless cars will make owning your own vehicle old hat. If getting to and from work is as easy as using a scheduling app on your phone, which will then dispatch a short-term rental to your front door, deliver you to your office and then pootle off to pick up someone else, why bother with the expense (and the faff) of shelling out for your own car, organising its insurance, finding somewhere to park it, taking it in for maintenance and so on?
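To make the fleet idea concrete, here’s a minimal sketch of the kind of dispatch logic such a service might run - every name and number below is invented for illustration, not taken from any real operator. The app’s job boils down to something like ‘find the nearest idle car and send it to the rider’:

```python
import math
from dataclasses import dataclass

@dataclass
class Car:
    car_id: str
    lat: float
    lon: float
    idle: bool = True

def distance_km(lat1, lon1, lat2, lon2):
    """Very rough straight-line distance; fine for a toy example."""
    return 111 * math.hypot(lat1 - lat2, (lon1 - lon2) * math.cos(math.radians(lat1)))

def dispatch(fleet, rider_lat, rider_lon):
    """Send the nearest idle car to the rider, or return None if the whole fleet is busy."""
    idle_cars = [car for car in fleet if car.idle]
    if not idle_cars:
        return None
    nearest = min(idle_cars, key=lambda car: distance_km(car.lat, car.lon, rider_lat, rider_lon))
    nearest.idle = False  # the car is now committed to this trip
    return nearest

# Illustrative usage: two cars in central London, one rider requesting a pickup.
fleet = [Car("AV-1", 51.51, -0.12), Car("AV-2", 51.49, -0.18)]
print(dispatch(fleet, 51.50, -0.13).car_id)  # -> AV-1, the closer of the two
```

A real operator would also be juggling traffic, charging, pooling and demand prediction, but the economics Hars describes rest on exactly this pattern: one car serving many riders in sequence, so far fewer cars are needed overall.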

But whether you’re renting or owning, driverless cars also throw up a raft of juicy philosophical challenges. Some of them have become familiar to the point of triteness. You know the sort of thing: you’re sprawled out groggily in the back of your driverless car, speeding off to work, when suddenly you feel a bump. Sensing something’s wrong, your car slows, pulls over and stops. Oh God. You’ve run over a child. And now its friends are swarming around you furiously banging their satchels against your windows.

This is an example of your driverless car having to ‘think’ its way to the better of two terrible outcomes. You come round a corner and there are schoolchildren in the road. A swerve to the left hits just one child; a swerve to the right saves that child, but knocks down a larger group like bowling pins. But whatever decision your car ‘makes’, who is responsible for the death of the one child? Is it you, now scrabbling about in the back seat? The company that built the car? The software engineer who programmed it? What sort of authority should a piece of software have over life and death?
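It helps to be concrete about what that ‘thinking’ amounts to. At its crudest, you can picture it as a cost function: the software scores each possible manoeuvre by the harm it predicts and picks the minimum. The sketch below is deliberately simplistic and entirely hypothetical - no manufacturer has published anything like it - but it shows where the moral weight ends up living: in whoever assigns the numbers.

```python
# A deliberately crude, hypothetical 'least bad outcome' chooser, mirroring the
# thought experiment above. The manoeuvres and harm scores are invented for illustration.

def choose_manoeuvre(options):
    """Pick the manoeuvre with the lowest predicted harm score."""
    return min(options, key=lambda option: option["predicted_harm"])

options = [
    {"name": "swerve_left",  "predicted_harm": 1},  # one person in the car's path
    {"name": "swerve_right", "predicted_harm": 4},  # a larger group in the path
]

print(choose_manoeuvre(options)["name"])  # -> swerve_left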

Self-navigating the legal minefield

Hars believes he has the answer: futuristic as these technologies may be, the culpability chain in an accident will stay pretty much unchanged from how it works today.

“When a self-driving car crashes… this would be fairly simple, legally,” says Hars. “The car crashes, in the first case we might not know who the culprit is, but the car is insured [so] by default the owner of the car is liable and will have to pay [damages].

“Now, the reason why the accident happened may be different. For example, the car may have been programmed in such a way that, [if it had been programmed differently], it could have prevented the accident. There may be a software defect. All the information is available, because there’s a black box with sensor information etc. So, if the car owner is held liable, he first has to pay, his insurance pays him, then the insurance might say, ‘Wait a minute, what really happened when the car crashed? We think there was a software defect.’ So the insurance can then go to the manufacturer and say, ‘You know, there was a software defect. This is a product defect.’ So, the manufacturer would have to pay.”
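Stripped of the legal language, the chain Hars describes is a short, almost mechanical flow: the owner’s insurer pays by default, and the bill only moves to the manufacturer if the black-box data points to a product defect. Here is a hypothetical sketch of that flow (none of these categories are drawn from actual legislation):

```python
# Hypothetical sketch of the liability chain Hars describes. The owner's insurer pays
# by default; if the black-box data shows a software defect, the claim is passed on
# to the manufacturer as a product-liability case. Categories are illustrative only.

def liable_party(black_box_data):
    """Walk the chain: owner's insurer by default, manufacturer if a defect is found."""
    if black_box_data.get("software_defect"):
        return "manufacturer - product defect, so the insurer recovers its payout"
    return "owner - their insurer pays the damages"

print(liable_party({"software_defect": False}))  # ordinary crash: owner's insurer pays
print(liable_party({"software_defect": True}))   # defect found: manufacturer foots the bill
```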

Automation also throws up more interesting, complex problems to think about when it comes to assigning responsibility for an accident. For example, what happens if you’re driving your driverless car yourself (in ‘manual mode’, if you like), and you run over a pedestrian? Could the family make a convincing legal case that your opting to drive yourself, rather than let the car take the wheel, directly resulted in the death of their loved one? A kind of death-by-overconfidence in your own driving ability?

“I think it won’t take too long until, for [exactly] this reason, driving your own car will be forbidden,” Hars predicts. “We won’t allow the fun of driving anymore, because there’s a big downside: people are likely to make very grave mistakes while driving. I think that will happen. It may happen in a way that you do still drive, but the car will monitor you. It may be that the car will counter your moves if you do something wrong. It will slow down, or take over. That may be a variant of this. But I don’t think that once we have ten or fifteen years’ experience with self-driving cars and we understand that they are significantly safer, we will accept people driving just for the fun of it and risking other people’s lives.”

But what if, in extremis, you wanted your car to break the law? The example I give to Hars goes like this: you are sitting at home with your wife, when suddenly she goes into labour. She urgently needs to go to the hospital, but you’re afraid it’s too far away for you to make it obeying the speed limit. Would your car ignore your desperate pleas to push the needle beyond what’s legal? Or worse, what if there was a bad storm blowing, and your car deemed conditions too dangerous to drive at all?

“[It’s] an interesting question,” Hars says. “My brother had this case, actually. His wife went into labour and had dangerous bleeding. So he went to the hospital with her right away, and the first thing they did was tell him that he should have never done something like that. He should have called an ambulance. Driving in that condition, it was dangerous for him, it was dangerous for his wife.

“[But it’s] a good point: where is the point at which humans would want to take over? Where they really feel constrained by the car and maybe it should let them take over instead? And I think we will come to these situations, where we feel that the autonomy of the human is limited by the car.”

We don’t like the idea of machines not doing as they’re told. But the answer to all these questions might come down to a simple cost-benefit analysis. Driverless cars might make mistakes - even fatal ones - but so long as they make fewer mistakes than people, the right thing to do might be to let them wrench control away from everybody, regardless of how much a minority of individuals thump the dashboard or give birth all over the back seat. The humbling truth is that, simply put, we’re not as good at driving as we like to think we are.
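That cost-benefit case is easy to put into rough numbers. Every figure below is invented purely for illustration - the argument only needs the gap between the two rates to exist, not any particular values:

```python
# Entirely illustrative figures: the point is the comparison, not the numbers themselves.
human_crash_rate = 4.0              # hypothetical serious crashes per million miles, human drivers
driverless_crash_rate = 1.0         # hypothetical rate for a mature driverless fleet
annual_miles = 300_000_000_000      # a made-up national annual mileage

human_crashes = human_crash_rate * annual_miles / 1_000_000
driverless_crashes = driverless_crash_rate * annual_miles / 1_000_000

print(f"Hypothetical serious crashes avoided per year: {human_crashes - driverless_crashes:,.0f}")
# With these invented inputs the gap is 900,000 - and a gap of any size is the whole
# of the utilitarian argument for taking the wheel away from us.
```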

“There is one thing I think about often with regard to self-driving cars,” says Hars. “If you look at movies, we have this view of ourselves that we take over from the machine in dangerous situations. [You can be] looking at a situation in science fiction where things get rough, and the machine is in control before things get rough, and when they get rougher the pilot takes over at the last instant. You know, meteorites are coming from all directions and at breakneck speed he races his spaceship through the most complicated situations. We have this vision of the human at the last minute taking over and performing better than the machine, and I think that’s fundamentally flawed. Yeah, we like this view of ourselves, but it’s just wrong. It’s not something that we’re good at.”

Victims of the robot uprising

While these what-if scenarios are fun to think about, in a grisly, Black Mirror kind of way, there’s another, more practical element to the driverless revolution that bears more serious consideration: just how wide-ranging the disruption to our lives and livelihoods will be when they arrive.

We know driverless cars are going to cost jobs. With cab companies like Uber already investing in driverless technologies, and London Underground (with the introduction of the 24-hour service) arguably creeping towards the DLR-ification of the whole tube system, tomorrow’s public transport system is going to employ a lot fewer people than today’s.

But cab, tube and bus drivers will be only the frontline of the transport industry purge. Driverless technologies will also decimate any industry that relies on the human element in transportation. With human error removed, driverless cars will require less maintenance - meaning fewer mechanics. Like the factories that build them, driverless trucks will run more cheaply with no human at the wheel, meaning an end to truckers. Rubbish collection will be automated. Fewer cars mean fewer roads, lowering the need for construction workers. And if the public makes the switch from owning to effectively renting cars, what would be the point of car showrooms or expensive marketing campaigns? If we no longer drive for fun or see cars as status symbols, what would be the point of designing sleek new models at all?

“We need to ask what the economic landscape of the future is going to be,” says Hars, of the coming disruption. “It is very obvious that today you should really avoid going into anything that has anything to do with transportation, and definitely avoid the auto industry - they will have big problems over the next [few] decades.

“But it's not just the auto industry. Cars are very expensive devices and there are whole industries centred around the auto industry that build all kinds of things. [And] I am not convinced by the old saying that every new technology, if it destroys some jobs, will create many other jobs.”

Driverless cars are not just going to change an industry; they’re going to change the lives of millions in terms of work, leisure and mobility. For the majority, this will be a good thing: convenience, freedom and time, all boosted by a swarm of self-piloting, environmentally friendlier hatchbacks. But behind the utopian vision lurks a serious and pressing question: how can we protect the millions who rely on the status quo of today from the automation wave of tomorrow?

Dr. Alexander Hars is the founder of Driverless Future and a managing director of Inventivio, which develops software, services and business models for future businesses.