14 British Police Forces Are Using Algorithms To Predict Crimes, And Campaigners Think It Will Lead To Biased Decisions

    A new report by Liberty warns that predictive crime-fighting software is being rolled out across the country without adequate safeguards.


    Fourteen police forces across the UK are using algorithms to predict crimes, according to new research, and human rights campaigners are concerned that the “sinister” programs will lead to biased decisions and privacy breaches.

    Data compiled by the campaign group Liberty through a series of freedom of information (FOI) requests, published for the first time in a report on Monday, reveals that forces including Avon and Somerset, Greater Manchester, and West Midlands are using software to predict where crimes might happen and even the likelihood of an individual committing a crime.

    The campaigners warn that the widespread adoption of the technology is a serious threat to civil liberties. Instead of providing effective new insights to cut crime and make neighbourhoods safer, they worry, it will lead to more unfair targeting of already over-policed communities.

    “Life-changing decisions are being made about us that are impossible to challenge, and even the police often don’t know how the machines make their predictions,” Hannah Couchman, policy and campaigns officer at Liberty, told BuzzFeed News.

    In particular, Liberty is concerned that the algorithms will magnify a racial bias in British policing that has led to black citizens being three times more likely to be arrested than white ones. That, they say, will worsen relationships with minority groups that have historically been unfairly targeted.

    The technology also raises serious privacy concerns at a time of growing public anxiety about “big data” being used to monitor behaviour, the campaigners say.

    While the report acknowledges that the software could help policing become more efficient at a time of sharp budget cuts, and provide genuine new insights that could lead to reductions in crime, there is a danger, Liberty says, that it merely entrenches the biases and weaknesses already present in police data and practice. Budget-strapped police forces may end up relying too heavily on data of dubious predictive value rather than developing human relationships and expertise.

    The software is being rolled out without proper external scrutiny, Liberty claims. There appear to be few safeguards to ensure that decisions made by the programs are fair, or mechanisms by which they could be challenged. In some cases, forces are using software developed by commercial providers who treat their algorithms as trade secrets, so the police don’t understand how the decisions are made.

    According to the report, there are two types of program being used.

    The most widespread, adopted by 13 forces, is “predictive mapping” software, which aims to identify crime hotspots by analysing vast troves of historical crime data. In theory, this should help forces make better decisions about where to direct their stretched resources.
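
    Liberty’s report does not reproduce any force’s actual algorithm, but the hotspot idea behind predictive mapping is simple enough to sketch: divide a map into grid cells, count past recorded incidents in each, and flag the densest cells for extra patrols. The short Python sketch below illustrates only that general idea; the coordinates, cell size and rankings are invented for the example, not any vendor’s method.

        import math
        from collections import Counter

        # Invented historical incident locations (longitude, latitude).
        # Real systems ingest years of recorded-crime data.
        incidents = [
            (-2.5891, 51.4545), (-2.5880, 51.4551), (-2.5875, 51.4549),
            (-2.5012, 51.4700), (-2.5890, 51.4547), (-2.5884, 51.4552),
        ]

        CELL = 0.002  # grid cell size in degrees (roughly 200 m); arbitrary

        def cell_of(lon, lat):
            """Map a coordinate to the integer indices of its grid cell."""
            return (math.floor(lon / CELL), math.floor(lat / CELL))

        # Count incidents per cell and rank them: the densest cells are the
        # "hotspots" a mapping tool would flag for attention.
        counts = Counter(cell_of(lon, lat) for lon, lat in incidents)
        for cell, n in counts.most_common(3):
            print(f"cell {cell}: {n} incidents")

    Even this toy shows the feedback loop the campaigners describe: cells are flagged because crimes were recorded there in the past, and directing more patrols to flagged cells tends to generate more recorded crime in exactly those places.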


    The other type is “individual risk assessments”, where software is used to predict the likelihood of a single person being a victim or perpetrator of a crime. At present, three forces across England – Avon and Somerset, Durham, and West Midlands – are using this kind of program, the report says.
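
    The report describes these tools by their output, a risk score attached to a named individual, rather than by their internals. Purely as an illustration of the general shape of such a system, the Python sketch below trains a random forest (the model type Durham’s HART has been publicly reported to use) on invented criminal-history features; the features, labels and risk bands are all made up for the example.

        from sklearn.ensemble import RandomForestClassifier

        # Invented training data: each row summarises one person's history
        # (prior arrests, age at first offence, months since last offence).
        # Real tools draw on far richer records; nothing here is real data.
        X_train = [
            [0, 34, 60], [5, 17, 3], [1, 28, 24],
            [8, 15, 1], [2, 22, 18], [0, 40, 120],
        ]
        y_train = [0, 1, 0, 1, 1, 0]  # 1 = later re-offended, 0 = did not

        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)

        # Score a new individual: the probability is read off as a risk band
        # (low / moderate / high), which is what officers actually see.
        risk = model.predict_proba([[3, 19, 6]])[0][1]
        band = "high" if risk > 0.66 else "moderate" if risk > 0.33 else "low"
        print(f"predicted risk: {risk:.2f} ({band})")

    The opacity problem Liberty highlights is visible even here: a forest of a hundred trees offers no simple account of why one person scored higher than another, which is part of what makes such decisions hard to challenge.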

    Avon and Somerset Police is singled out in the report as using an “alarmingly broad” array of programs to predict individual risks, including the likelihood of a person falling victim to or perpetrating crimes such as domestic violence, sexual assault or stalking.

    “The variety of ways in which Avon and Somerset Police use predictive policing is astonishing, and the assessment of victimhood is a highly controversial development,” the report said.

    Avon and Somerset Police told BuzzFeed News: “It is important for officers and staff to have ready access to information as they direct resources to prevent crime, in response to potentially life-threatening incidents or to keep vulnerable individuals safe from harm.”

    Avon and Somerset’s computer systems, the spokesperson said, do not replace human judgements, and the force makes “every effort” to prevent biases in its data models. The force’s ethics committee has been consulted on the development of the data tools, and the force has invited academics at the University of the West of England to conduct an independent evaluation.

    Another force identified in the report is Durham, which has used a program called the Harm Assessment Risk Tool (HART) since 2016. In its freedom of information response to Liberty, Durham police described HART as an experimental tool to help officers identify offenders who would be suitable for deferred prosecution arrangements.

    When Liberty asked about the potential for bias in the system, Durham replied that “no accuracy comparisons have yet been made between different demographic groups”, adding that the area the force covers is 90% white. “This is a deeply concerning response,” the report said. “The fact that Durham is not a diverse area, with a low percentage of BAME residents, does not negate the need for careful consideration and mitigation of potential bias.”
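
    The comparison Durham says it has not yet run is not technically demanding: split a model’s past predictions by demographic group and compare error rates between groups. A minimal sketch, with invented predictions and outcomes:

        from collections import defaultdict

        # Invented evaluation records: (demographic group, predicted label,
        # actual outcome), where 1 means "flagged/offended" and 0 the opposite.
        records = [
            ("A", 1, 1), ("A", 0, 0), ("A", 1, 0), ("A", 0, 0),
            ("B", 1, 0), ("B", 1, 1), ("B", 1, 0), ("B", 0, 0),
        ]

        # Tally accuracy and false positives per group; a persistent gap
        # between groups is the kind of bias the report says goes unexamined.
        stats = defaultdict(lambda: {"n": 0, "correct": 0, "false_pos": 0})
        for group, pred, actual in records:
            s = stats[group]
            s["n"] += 1
            s["correct"] += int(pred == actual)
            s["false_pos"] += int(pred == 1 and actual == 0)

        for group, s in sorted(stats.items()):
            print(f"group {group}: accuracy {s['correct'] / s['n']:.0%}, "
                  f"false positives {s['false_pos']}/{s['n']}")

    If false positives, people wrongly flagged as risks, cluster in one group, that is exactly the disparity campaigners warn would otherwise go unnoticed.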


    A spokesperson for Durham Constabulary said: “We are proud of HART, which is part of our intervention programme to help repeat offenders turn their lives around, break away from the revolving door of prison and reduce crime.

    “All decisions are ultimately made by an experienced custody officer, but the HART advisory tool gives them a clear indication as to who might be more at risk of re-offending – not so they are stereotyped, but so we can give them more support to turn away from crime.”

    Liberty is calling for police forces to stop using the technology or at least to make public details of the algorithms they’re using “in a transparent and accessible way”. The report is based on FOI requests to around 90 police forces across the UK.

    Alex Spence is a senior political correspondent for BuzzFeed News and is based in London.

    Contact Alex Spence at alex.spence@buzzfeed.com.
