The US city of Orlando, Florida has ended its pilot trial of Rekognition, Amazon’s face recognition system, with no current plans for future tests. But the city hasn’t suddenly had a change of heart due to ethical concerns about the technology—they reportedly just couldn’t get it to work.
Back in October, the city began a nine-month trial of Rekognition with four cameras in police headquarters, three in downtown Orlando, and one outside a community recreation centre, Orlando Weekly reports. This was the city’s second pilot phase with the technology, and when it ended on Tuesday, officials said even the most basic functions were not operational.
“We haven’t even established a stream today,” Rosa Akhtarkhavari, Orlando’s chief information officer, told Orlando Weekly. “We’re talking about more than a year later. We have not, today, established a reliable stream.”
The city sought to use Amazon’s Rekognition to match faces in a database against those spotted by the cameras in real-time. Akhtarkhavari, chief administration officer Kevin Edmonds, and chief of police Orlando Rolon wrote in a memo on Thursday to city leaders that “the objective was to determine if this technology would reliably allow law enforcement to locate specific identifiable dangerous threats as they move around the city closing in on possible targets.”
In an email to Gizmodo, a city spokesperson emphasised that the pilot was previously scheduled to end after nine months. But given the program’s reported problems, it’s hardly surprising that the city isn’t planning to keep pursuing the tech.
Akhtarkhavari told Orlando Weekly that due to bandwidth issues, they weren’t able to run Rekognition on more than one camera and that there wasn’t a reliable signal. “We’ve never gotten to the point to test images,” she said. The cameras also reportedly didn’t have sufficient resolution to obtain clear images, and a person who helped install them told Orlando Weekly that due to their distance from the ground, they only captured the tops of people’s heads. This source also reportedly said that when cops were able to get a stream going, the video feeds would disconnect.
These aren’t the first reports of Amazon’s powerful facial recognition tool failing to meet the expectations of Orlando police. A trove of documents obtained by BuzzFeed in October indicated that not only did cops receive little to no training on how to use the system, but that a litany of problems was attributed to it, from late delivery to overall frustration with its flaws.
“The streams keep stopping….seems like this happens daily,” a March email from an Orlando official reads, according to the documents. “I started 4 or 5 streams the other day and as you said, now only 1 is still up. I thought you were working on a script to automatically restart them if there were issues?”
It’s hard to say how many of these issues were due to the technology itself and how many were the result of flawed deployment and inadequate funding. But while such technical problems might slow the spread of facial recognition for the time being, they do nothing to address the troubling fact that police are exploring surveilling the public with a technology that has yet to be proven as just and accurate.
It’s why other American cities like Oakland, San Francisco, and Somerville, Massachusetts have officially banned city departments, including cops, from using facial recognition at all. It’s why a digital rights group is calling for a “complete” federal ban on US government use of the technology. And it’s why civil rights activists, politicians, and Amazon workers have all voiced fierce opposition to such partnerships between the online shopping giant and police.
“Our company should not be in the surveillance business,” Amazon workers wrote in an internal letter to CEO Jeff Bezos in June of last year—the same month the Orlando Police Department’s first pilot phase with Rekognition started—to demand the company end its facial recognition contracts with law enforcement. “[W]e should not be in the policing business; we should not be in the business of supporting those who monitor and oppress marginalised populations.”