Minority Report-style tech used to predict crime is 'prejudiced'

Police algorithms designed to predict future crime are unproven and threaten to “entrench bias”, according to the new Home Office surveillance tsar in his first interview since taking up the role.

Fraser Sampson, who became biometrics and surveillance camera commissioner on March 1, expressed deep scepticism over the effectiveness of Minority Report-style algorithms and computer models used by forces across the country to anticipate a person’s likelihood of committing crime or reoffending.

“In the area of prediction, I generally come back to an observation made by [the writer] Malcolm Gladwell, who said that prediction in a field where no prediction is possible is mere prejudice,” he said.

While his predecessor, Tony Porter, issued few statements about these “predictive policing” tools, Sampson’s comments signal his tenure could bring fresh scrutiny of the technology. In February, a predictive policing tool used by the Metropolitan Police caused controversy when it was revealed that 1,000 young Black men, all under 25, had been flagged in connection with gang violence. A review, however, found they posed little to no risk of committing crime.

The tool, called the Gangs Matrix, scores people’s “risk” using a mathematical formula that estimates how likely they are to commit gang violence, based on their previous offences, patrol logs, social media activity and friendship networks. Every individual in the database is assigned a colour – red, amber or green – depending on the level of risk they face as a victim or pose as an offender. The Met says those scores enable the force to safeguard victims or to divert potential offenders away from gangs.

A spokesperson stressed that the system uses a computer model, not an algorithm, to calculate the risk scores.


Sampson compared technologies like the Gangs Matrix, which now contains 2,305 names, to last summer’s A-level grading algorithm, which was found to penalise outliers and mark down high-achieving students attending poor-performing schools.

“In terms of predicting activity, we know that is enormously difficult to do fairly and accurately,” said the former police officer. “To try and predict whether that same 17-year-old will go on at some point to commit certain types of crime under totally unknown and unforeseen circumstances, I think, involves so many variables that I wonder why we would do it or even presume to think we can do it fairly and accurately.”

Despite ongoing concern over the deployment of predictive policing tools, a 2020 report by the think tank RUSI revealed that constabularies in Durham, Avon and Somerset, and Hampshire were all using algorithms to assign previous offenders “risk” scores, designed to assess how likely they are to commit another crime.

Inspector Greg Evans, from West Midlands Police, said the force will start using its own predictive algorithm, which has already received ethical approval, once it has been fully tested. “It will be used to assist police,” he said, “but it will not replace professional judgement.”

Cressida Dick, the head of London’s Metropolitan Police, has also previously said police should use these tools for “augmented intelligence”, not rely on them. 

A spokesperson for the Home Office said: “The Home Office wants police to use technologies like advanced algorithms to tackle crime and protect communities from harm.”

But Sampson concluded: “If all we’re going to do is perpetuate biases more quickly and more efficiently because we’re using algorithms to do it, I’m not really sure that that gets us very far.”
