A Group of MPs Is Calling for Regulation of Facial Recognition Tech Used by the Police

By Shabana Arif

The flagrant misuse of facial recognition tech by the police is finally being criticised by MPs, with calls to put a stop to it until there are proper regulations in place, which seems like it should've been a no-brainer.

The House of Commons Science and Technology Committee cited concerns over bias and accuracy, and also highlighted the fact that a database of custody images held by the fuzz hasn't been properly audited to remove images of people who were never convicted.

The legality of facial recognition trials has also been called into question, with the report stating that "there is growing evidence from respected, independent bodies that the ‘regulatory lacuna’ surrounding the use of automatic facial recognition has called the legal basis of the trials into question. The Government, however, seems to not realise or to concede that there is a problem."

Such trials include unmarked police vehicles testing mass surveillance/facial recognition technology on an unwitting public, and other shady-as-shit practices like fining passers-by who don't want to be scanned by inferring guilt from their non-compliance.

US lawmakers introduced legislation earlier this year that was aimed at blocking companies using face recognition tech from collecting and sharing people's data without their consent.

The obvious concern is that innocent people will have their images added to watch lists used by the police, potentially leading to false arrests, although as things stand, around 96 per cent of the matches flagged in the trials carried out in the UK between 2016 and 2018 were false positives anyway.

"The public would expect the police to consider all new technologies that could make them safer," a National Police Chiefs' council spokesman told the BBC, trotting out the old 'giving up freedom for safety' card.

"Any wider roll out of this technology must be based on evidence showing it to be effective with sufficient safeguards and oversight." Apparently an almost 100 per cent failure rate isn't enough to scrap tech with the apparent sole purpose of keeping people safe, and we'll just keep on wasting time, money, and resources on it. [BBC News]