Durham Police paid international data broker Experian for access to its “Mosaic” database, a trove of credit-profiling, marketing, and finance data covering 50 million adults across the UK. Privacy experts balk at the idea of tying personal financial data, collected without the public’s consent, to criminal justice decisions.
Called HART (Harm Assessment Risk Tool), the AI analyses multiple data points on suspects, then ranks them as a low, medium, or high risk of reoffending. Authorities can then use that ranking to decide whether an offender should receive jail time or be allowed to enter a rehabilitation program.
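To make the “rank suspects into three tiers” idea concrete, here is a minimal, purely illustrative sketch. Every feature name, weight, and threshold below is invented for this example; the real HART system is reported to use a far more complex model (a random forest over dozens of predictors), and nothing here reflects its actual logic.

```python
# Hypothetical sketch of a three-tier risk ranking, loosely modelled on
# public descriptions of tools like HART. All feature names, weights, and
# thresholds are invented for illustration only.

def rank_risk(features: dict) -> str:
    """Map a dict of suspect features to a 'low'/'medium'/'high' label."""
    # Invented linear scoring — NOT HART's actual model.
    score = (
        2.0 * features.get("prior_offences", 0)
        + 1.5 * features.get("age_under_21", 0)
        # e.g. a numeric score derived from an Experian Mosaic category
        + 1.0 * features.get("mosaic_segment_risk", 0)
    )
    if score >= 5.0:
        return "high"
    if score >= 2.0:
        return "medium"
    return "low"

print(rank_risk({"prior_offences": 3}))                     # high
print(rank_risk({"prior_offences": 1, "age_under_21": 1}))  # medium
print(rank_risk({}))                                        # low
```

The sketch also makes the critics’ concern visible: once a marketing-derived field like the hypothetical `mosaic_segment_risk` enters the score, consumer profiling directly shifts the justice outcome.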
While Durham Police have used the HART “risk assessment AI” since at least last summer, Big Brother Watch’s report reveals that HART now uses consumer marketing data from Experian to assess risk.
A few of the data points Experian collects for its Mosaic profiles, now fed into HART, include (via Big Brother Watch):
- Family composition, including children,
- Family/personal names linked to ethnicity,
- Online data, including data scraped from the pregnancy advice website ‘Emma’s Diary’, and Rightmove,
- Child benefits, tax credits, and income support,
- Health data,
- GCSE [General Certificate of Secondary Education] results,
- Ratio of gardens to buildings,
- Census data,
- Gas and electricity consumption.
Experian’s Mosaic groups people together according to consumer behaviour, making it easier for marketers to target them based on their interests and finances. “Aspiring Homemakers,” for example, are young couples with professional jobs who are more likely to be interested in online services and baby/family-oriented goods. “Disconnected Youth” are under 25, live in modest housing, and have low incomes and modest credit histories. With access to these categories, HART can almost instantly make sensitive inferences about nearly every facet of a suspect’s life.
“For a credit checking company to collect millions of pieces of information about us and sell profiles to the highest bidder is chilling,” Silkie Carlo, Director of Big Brother Watch, says in the report. “But for police to feed these crude and offensive profiles through artificial intelligence to make decisions on freedom and justice in the UK is truly dystopian.”
Mosaic also sorts people into racial categories. “Asian Heritage” is defined as large South Asian families, usually with ties to Pakistan and Bangladesh, living in inexpensive, rented homes. “Crowded Kaleidoscope” are low-income, immigrant families working “jobs with high turnover,” living in “cramped” houses.
What do these financial groupings have to do with someone’s likelihood to commit crimes? If the profiles are influenced by race and poverty, is it discriminatory to use them as data points when assessing risk? In the US, a landmark 2016 ProPublica report found that COMPAS, another risk-assessment AI, routinely underestimated the likelihood of white suspects reoffending, even when the suspect’s race wasn’t included in the dataset. The opposite was true for black suspects; they were generally rated as greater risks. A 2018 study by researchers at Dartmouth College found COMPAS was about as accurate as untrained humans making predictions from far fewer data points.
“We wouldn’t accept people going through our bins to collect information about us,” Carlo says in the report. “Nor should we accept multi-billion pound companies like Experian scavenging for information about us online or offline, whether for profit or policing. Parliament should urgently consider what place this big data and artificial intelligence has in our policing.” [Techdirt via Big Brother Watch]