International News

UK Using Big Data To Write Minority Reports

April 23, 2018 posted by Steve Brownstein

Police in the United Kingdom are partnering with credit reporting agencies to predict whether criminals will reoffend, a report from UK civil liberties group Big Brother Watch has uncovered.
 
Police in Durham, in Northeastern England, paid international data broker Experian for access to its “Mosaic” database, complex credit profiling information that includes marketing and finance data on 50 million adults across the UK. Privacy experts balk at the idea of tying personal financial data, without the public’s consent, to criminal justice decisions.
 
Called HART (Harm Assessment Risk Tool), the AI analyzes multiple data points on suspects, then ranks them as a low, medium, or high risk of reoffending. Authorities can then use that ranking to decide whether an offender should receive jail time or be allowed to enter a rehabilitation program.
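
The article does not detail HART's internals, though published accounts describe it as a random forest classifier. As a rough illustration only, the Python sketch below shows how a three-way risk ranker of this general kind works; every feature name, training row, and prediction here is invented and bears no relation to HART's actual model or inputs.

```python
# Hypothetical sketch: a toy three-class "risk ranker" in the general
# spirit of tools like HART. All features and data are invented.
from sklearn.ensemble import RandomForestClassifier

# Invented training examples: each row is
# (age, prior_arrests, years_at_current_address)
X_train = [
    [19, 4, 1], [45, 0, 12], [31, 1, 5],
    [22, 6, 0], [52, 0, 20], [28, 2, 3],
]
y_train = ["high", "low", "medium", "high", "low", "medium"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Rank a new suspect as low / medium / high risk.
print(model.predict([[24, 3, 2]]))  # e.g. ['high']
```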
 
While Durham police have used the HART “risk assessment AI” since at least last summer, Big Brother Watch’s report reveals that HART now uses consumer marketing data from Experian to assess risk.
 
A few of the data points Experian collects for its Mosaic profile (now included in HART), according to Big Brother Watch:
 
Family composition, including children;
Family and personal names linked to ethnicity;
Online data, including data scraped from the pregnancy advice website ‘Emma’s Diary’ and from the property site Rightmove;
Occupation;
Child benefits, tax credits, and income support;
Health data;
GCSE [General Certificate of Secondary Education] results;
Ratio of gardens to buildings;
Census data;
Gas and electricity consumption.
 
Experian’s Mosaic groups people together according to consumer behavior, making it easier for marketers to target them based on their interests and finances. “Aspiring Homemakers,” for example, are young couples with professional jobs who are more likely to be interested in online services and baby/family-oriented goods. “Disconnected Youth” are under 25 and live in modest housing, with low incomes and limited credit histories. With access to these categories, HART can almost instantly make sensitive inferences about every facet of a suspect’s life.
 
“For a credit checking company to collect millions of pieces of information about us and sell profiles to the highest bidder is chilling,” Silkie Carlo, Director of Big Brother Watch, says in the report. “But for police to feed these crude and offensive profiles through artificial intelligence to make decisions on freedom and justice in the UK is truly dystopian.”
 
Mosaic also sorts people into racial categories. “Asian Heritage” is defined as large South Asian families, usually with ties to Pakistan and Bangladesh, living in inexpensive, rented homes. “Crowded Kaleidoscope” are low-income, immigrant families working “jobs with high turnover,” living in “cramped” houses.
 
What do these financial groupings have to do with someone’s likelihood of committing crimes? If the profiles are influenced by race and poverty, is it discriminatory to use them as data points when assessing risk? In the US, a landmark 2016 ProPublica report found that COMPAS, another risk-assessment AI, routinely underestimated the likelihood of white suspects reoffending, even when the suspect’s race wasn’t included in the dataset. The opposite was true for black suspects, who were generally rated higher risks. A 2018 study by researchers at Dartmouth College found COMPAS was no more accurate than untrained humans making predictions from far fewer data points.

