In September of last year, we ran a piece on the UK’s use of armed drones. That same month, David Cameron announced that he had personally authorised the targeted killing of a British citizen, Reyaad Khan, by a British Reaper drone in Syria, with defence secretary Michael Fallon telling the BBC that Britain “wouldn’t hesitate to take similar action again”.
A month later, the government announced in the October Strategic Defence and Security Review that the UK would be replacing its fleet of ten Reapers with more than 20 of its successor, the then-codenamed ‘Protector’, also built by US defence contractor General Atomics.
For better or worse, the future of the armed drone seemed to be secure. But for the most controversial military technology of the past decade, just what will that future look like?
What Is the Problem With Drones Today?
Drone strikes have a notorious PR problem. A week after Fallon’s comments to the BBC, The Intercept published an eight-part series on the US’s use of armed drones around the world, based on a cache of leaked documents it had been provided. The documents purported to show that nine out of every ten people killed in a US ‘targeted killing’ (drone assassination) between May 1st and September 15th, 2012, had not been the intended targets of the strike. Stories like that of Mohammed Tuaiman, a Yemeni 13-year-old whose death in such a strike was reported by the Guardian, damaged the weapons’ reputation further.
These stories and statistics jar with what the public is told about the use of ‘precision weapons’ (in the case of the Reaper, either the AGM-114 Hellfire missile, or the GBU-12 Paveway II laser-guided bomb). You might reasonably expect, then, that the next generation of drones will boast new levels of accuracy. But that assumption glosses over an important question: are the innocent people killed by drones dying as a result of the hardware, or the people controlling it?
“I don't think [collateral damage is] a failure of technology,” says James Rogers, associate lecturer in international politics at the University of York, who researches the repercussions of drone warfare. “In terms of the technological aspect, you can hit a football on the ground when a drone is flying at 30 or 40,000 feet. The technology and the capacity to hit the target is there. It's the intelligence issues that mean you hit the wrong target.”
Rogers draws an important distinction: there is a difference between hitting a target with precision, and hitting the right target. It doesn’t matter if you can reliably put a missile through a skylight if it then turns out your target was a school, rather than an insurgent safehouse.
Mistakes like this colour public perception, not just among those whose governments carry out drone strikes, but also – critically – among those living underneath the drones in the conflict zones. The omnipresent threat of drones torments communities living in their shadow.
“We see, in Afghanistan and Pakistan, high rates of depression,” says Caroline Kennedy, professor of War Studies at the University of Hull and author of multiple research papers on drones. “Suicide rates are worrying in those areas… How do we correlate the presence of drones with these reported rates of depression? That's a tricky question. But another would be this feeling of living constantly with the noise and the threat of a strike. But [there is] also the idea that, in what are quite private communities, privacy has been violated... The idea [is] that in these essentially very religious societies, very private societies, the constant surveillance is an intrusion.”
The combination of inaccurate targeting and ever-present danger creates a serious hearts-and-minds problem on the ground. As King’s Dr. Jack McDonald told us previously, drones do not end wars, but “make endless conflict manageable”. So if we can’t – or won’t – get rid of them, what can we do to make them better?
The problem with precision isn’t a problem with the machine. A precision weapon, whether guided by GPS or a laser, generally goes where you point it; it’s the human pulling the trigger – or, more accurately, the intelligence available to whoever told them to pull it – that is at fault. So if drones are good and people are bad, what is the solution?
“Better precision could happen with better communications,” says David Galbreath, professor of International Security at the University of Bath. “[Particularly] communications between different types of drone. This is where the idea of ‘swarming’ comes in. I'm not talking about swarming as in a flock of birds, but swarming in relation to the way that different types of drone are engaging with each other to establish a view [of the battlefield].”
The idea behind a drone swarm is to saturate the area of operations with fleets of UAVs that all share information with each other constantly. So, instead of having one drone high up being flown by a ground crew, you have multiple drones autonomously scouring an area for information and feeding off each other’s findings. The result, it is proposed, will be to give military personnel a vastly more comprehensive view of any operation, leading to fewer mistakes.
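The information-sharing idea can be sketched in miniature. In this toy Python model (all names and the grid-cell scheme are hypothetical illustrations, not any real system), each drone records per-cell confidence readings from its own sensors, and the swarm’s fused picture keeps the best reading any member holds – so the combined view is always at least as complete as any single drone’s.

```python
from dataclasses import dataclass, field


@dataclass
class Drone:
    """Toy model of one swarm member holding local sensor readings."""
    name: str
    # Maps a grid cell (x, y) to the drone's confidence in its reading there.
    observations: dict = field(default_factory=dict)

    def observe(self, cell, confidence):
        # Keep the highest-confidence reading seen so far for each cell.
        self.observations[cell] = max(confidence, self.observations.get(cell, 0.0))


def shared_picture(swarm):
    """Fuse every drone's observations into one view of the area.

    Each grid cell keeps the best confidence reported by any member,
    so the swarm's picture covers everything any single drone has seen.
    """
    fused = {}
    for drone in swarm:
        for cell, conf in drone.observations.items():
            fused[cell] = max(conf, fused.get(cell, 0.0))
    return fused


# Two drones covering overlapping sectors:
alpha, bravo = Drone("alpha"), Drone("bravo")
alpha.observe((3, 4), 0.9)   # alpha gets a clear look at cell (3, 4)
bravo.observe((3, 4), 0.4)   # bravo saw the same cell, less clearly
bravo.observe((7, 1), 0.8)   # ...and a cell alpha never covered

picture = shared_picture([alpha, bravo])
```

The fused picture contains both cells, with the (3, 4) reading taken from alpha’s clearer look – a crude stand-in for the claim that multiple drones “feeding off each other’s findings” produce a more comprehensive view than one platform alone.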
But the word ‘autonomous’ has another, more sinister connotation in the debate over the future of armed drones. If drones can gather and process tactical information faster than their human operators, why include a human pilot in the kill-chain at all? As with driverless cars, if robots make fewer mistakes overall, why not just hand over responsibility for life and death to them?
The idea of taking humans ‘out of the loop’ has become a political line in the sand. In July 2015, more than 1,000 artificial intelligence and robotics researchers – joined by prominent figures including Tesla’s Elon Musk, Stephen Hawking and Apple co-founder Steve Wozniak – signed an open letter supporting a ban on weapons that could kill autonomously. How much their concerns will resonate with future militaries, however, is still an open question.
“I think there is something about the nature of war itself that is the problem [with autonomous killing],” says Galbreath. “We expect someone to be on the trigger; we expect someone to be a victim; we expect someone to be a hero; we expect someone to be actively involved in fighting our enemies in some way or another. It is politically unpalatable to say that we have systems that are taking care of ‘the problem’ for us.
“[But] let's imagine a human wasn't in the loop. Imagine that we had drones that were essentially just sitting and listening to someone for a year, processing their language and their relationships and so on, and they are convinced by their algorithms that this person is a threat to our national interests, and report that to a Reaper that’s circling high up above. You can understand how that could happen. We're not in that place right now, but you can understand how it could happen.”
But increasing the accuracy of drone strikes is only one half of the battle for hearts and minds on the ground. A second challenge is in proving to a local population that drones can do more than just deliver death.
“If you look at the fight against [improvised explosive devices] (IED), for example – ordinary folk going about their business are having legs blown off – drones are very very good at intelligence gathering on things like IED fields,” says Kennedy. “So could it be that rather than [being] simply machines of killing, drones [could] also become means of helping [the] civilian population? We know this happens with drops of food and drops of medicine. There must be a virtuous path, going forward.”
The Next Ten Years
But if you look at the next generation of drones – the General Atomics Avenger or BAE Systems’ Taranis prototype (seen above) – you’ll notice something odd. These next-gen drones look a lot more like fighter jets than surveillance platforms. Their bodies are streamlined for stealth. Their weapons don’t hang off their wings, but are tucked away in internal bays, like those on fifth-generation fighters like the F-22 Raptor and F-35 Lightning II. These drones are not being designed to fight insurgents on the ground armed with machine guns and four-by-fours.
“Essentially, people are developing weapons to fight a war with China,” says Galbreath. “[And] China is developing weapons to fight a war with the United States.
“At the moment the United States still has a technological supremacy. It largely has to do with area-denial and area access. China is distinctly interested in creating area-denial technologies – anti-air missiles, EMPs, and things like that. And what the United States is trying to do is use UAVs to project force into an area where area-denial had been enacted by the Chinese. How can you penetrate that? How can you establish bridgeheads? So [China is developing] electromagnetic pulses, alternatives to circuit boards, stealth – [all] so they could bring down American UAVs.”
But while China is developing high-tech platforms for war with the US, the other major concern is the drones it is producing for export. The United States still produces the most advanced drones in the world, but the list of countries it will sell them to is short. China appears to have no such qualms, and has already sold its armed drones to countries including Iraq and Saudi Arabia. Chinese drones are not the gold standard, but its CH-3 and CH-4 drones – which look suspiciously similar to the US Predator and Reaper – are comparatively cheap, and far more easily available thanks to China’s lax export policy.
The next ten years for drones, then, will differ from the past decade in two important ways. First will be the race between, principally, the US and China to develop ever more powerful drones – as well as measures by which to detect and counter them. The second sea change will be in the number of countries buying ‘good enough’ drones to project force or serve as status symbols – including many countries the West would really rather did not have drones at all.
“I think you'll have [an] arms race over the drone technology,” says Rogers. “How are these going to be used in the future, as drones proliferate? How are the smaller nations going to be using this precision technology? Just because something is technically precise doesn't mean it has to be used in a precision way.
“The United States has a legacy of having an air power strategy based upon precision and being proportionate in times of war, in order to ensure it is as cost-free as possible to civilians, but also to their own military personnel. Is that the same for other nation states as they acquire drones? Well, probably not.”
David Galbreath is professor of International Security at the University of Bath.
Caroline Kennedy is professor of War Studies at the University of Hull.
James Rogers is associate lecturer in international politics at the University of York.