Metropolitan Police Commissioner Cressida Dick has had it, officially, with people criticising the police's facial recognition tech.
The Met has been aiming its Live Facial Recognition (LFR) camera at the public in central London recently, including in Oxford Circus with just two hours' notice. Unsurprisingly, lots of people were unhappy about this, including civil liberties groups like Big Brother Watch, Liberty, and even Amnesty International. You know you're on the wrong side when you've pissed off Amnesty.
Nonetheless, speaking at the Royal United Services Institute (Rusi) on Monday, Dick had a bit of a rant about the facial recognition naysayers, claiming they don't know what they're talking about:
"I and others have been making the case for the proportionate use of tech in policing, but right now the loudest voices in the debate seem to be the critics, sometimes highly incorrect and/or highly ill-informed."
She went on to use some quite alarming justifications for the tech:
"Speaking as a member of the public, I will be frank. In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR and not being stored, feels much, much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest."
In other words, if you oppose this and get stabbed to death, you've basically brought it on yourself. Also, if you've ever put a selfie on social media, you're no longer allowed to complain when the police go all Big Brother on your face.
This is despite the fact that trials of the tech had an alarmingly high false positive rate – 96 per cent, according to one study. In fact, as Big Brother Watch pointed out, an independent review carried out on behalf of the Met itself found an accuracy rate of just 19 per cent. A one-in-five chance that the match is actually right? Well, that sounds swell, bring it on.
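For anyone wondering how a figure like that 19 per cent is arrived at, it's just the share of alerts that turn out to be genuine matches. A quick sketch, using invented counts purely for illustration (these are not the Met's actual numbers):

```python
# Hypothetical example: a deployment flags 42 people as "matches",
# but only 8 of them are actually the person on the watchlist.
flagged = 42   # total alerts raised by the system (invented number)
correct = 8    # alerts that were genuine matches (invented number)

accuracy = correct / flagged  # i.e. the precision of the alerts
print(f"{accuracy:.0%}")  # prints "19%"
```

By the same arithmetic, the other 81 per cent of alerts point at people who aren't on the list at all.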
Dick continued to clarify some "myths" around the Met's facial recognition tech, specifically:
- It's been proven not to have an ethnic bias [could they teach Silicon Valley how to accomplish that, then?]
- It doesn't store biometric data
- There are always human officers deciding whether to stop someone or not [this is not reassuring at all]
- "The force has been open and transparent in its deployment." [need we mention again, two hours' notice? Which was given by tweet? Oh sorry, there were also signs, but you'd only see those when you were in the face-scanning zone]
Dick said the eight arrests that happened as a result of face-scanning trials likely wouldn't have happened otherwise, and that the only people on the wanted list were serious criminals. She didn't mention the guy who covered his face and ended up with a £90 fine, though, or the black man who was stopped and cuffed for "walking fast" shortly before another black man was stopped for "walking slow" during the LFR session.
Hannah Couchman from Liberty commented:
This from Cressida Dick is misleading and dangerous. This kind of oppressive mass surveillance technology undermines our safety by handing extraordinary power to the state to control our movements and behaviour. https://t.co/3rglen8iu9
— Hannah Couchman (@Hannah_Couchman) February 24, 2020
Main image: Big Brother Watch