14 November 2019

The Ministry of Justice’s prisoner risk algorithm could program in racism

A new system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners.

By Crofton Black

A new algorithmic tool for categorising prisoners in UK jails risks automating and embedding racism in the system, experts and advocacy groups have warned.

The tool draws on data, including from the prison service, police and the National Crime Agency, to assess what type of prison a person should be put in and how strictly they should be controlled during their sentence.

Critics told the Bureau of Investigative Journalism that the new system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners. This would exacerbate long-standing problems of discrimination exposed in an excoriating review two years ago by the Labour MP David Lammy. Higher category prisoners have fewer opportunities to develop skills and work towards rehabilitation compared with those held in open or lower security jails.

Responding to the Bureau’s findings, Lammy said: “This new information paints a damning picture of a criminal justice system that problematises people based on the colour of their skin.

“The way in which offenders are categorised has a huge impact on the opportunity they have for rehabilitation.”

A preliminary evaluation of the new system by the Ministry of Justice (MoJ), obtained by the Bureau via a Freedom of Information request, concluded that there was no indication of discrimination. But experts have questioned whether the MoJ’s findings can support this conclusion.

The evaluation, completed in August 2018, assessed an initial trial on 269 prisoners, of whom 32 were black, Asian or from an ethnic minority (BAME). Although some numbers were redacted from the report, the Bureau’s analysis indicates that 5 of the 32 non-white prisoners – 16 per cent – had their risk category raised under the new system, compared with 16 of the 230 white prisoners – 7 per cent.

The evaluation highlights an inherent risk of the new tool – that potential racial bias already present in the prison system “may translate into more reporting on BAME prisoners through our intelligence systems.” Because the tool relies on these systems, “this may result in a greater proportion of BAME prisoners having an increase in their security category.”

Despite this warning, the MoJ concluded that the data in the trial showed that “there is not an adverse effect in terms of BAME offenders being negatively categorised” and “no evidence of discrimination”.

However, when challenged by the Bureau, the MoJ claimed the trial sample size was too small to judge the impact on prisoners of different ethnicities.
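As a rough illustration of that point, a standard statistical check on the figures reported above – 5 of 32 BAME prisoners and 16 of 230 white prisoners having their category raised – shows how one might ask whether a gap of this size could plausibly arise by chance in a sample this small. The sketch below uses a Fisher’s exact test in Python; it is not drawn from the MoJ’s evaluation, and the counts are simply those from the Bureau’s analysis of the redacted report.

```python
# Illustrative sketch only: a Fisher's exact test on the trial counts reported
# by the Bureau, asking whether the BAME/white gap in upward recategorisation
# could plausibly be due to chance in a sample of this size.
from scipy.stats import fisher_exact

bame_raised, bame_total = 5, 32      # BAME prisoners with category raised / total
white_raised, white_total = 16, 230  # white prisoners with category raised / total

# 2x2 contingency table: rows = ethnicity group, columns = raised / not raised
table = [
    [bame_raised, bame_total - bame_raised],
    [white_raised, white_total - white_raised],
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")

print(f"Raised, BAME:  {bame_raised / bame_total:.0%}")
print(f"Raised, white: {white_raised / white_total:.0%}")
print(f"Odds ratio: {odds_ratio:.2f}, two-sided p-value: {p_value:.3f}")
```

A large p-value from a test like this would be consistent with the MoJ’s later position that the sample was too small to judge; it would not, of course, demonstrate that the tool is free of bias.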

The tool’s reliance on intelligence sources, and the MoJ’s assessment of its findings, have provoked criticism.

Veneer of objectivity

Patrick Williams, a senior lecturer at Manchester Metropolitan University and a specialist in disproportionality within the criminal justice system, criticised the use of intelligence to inform categorisation.

“It’s remarkable that in the context of this report the MoJ claims there is no evidence that BAME prisoners are at risk of discrimination,” he said. “Intelligence-informed categorisation will simply take police intelligence into the prison environment and may result in presenting BAME people as more risky to prison staff. This in turn will increase surveillance and offender management. To suggest this tool will not have a disproportionate effect ignores all available evidence.”

Tanya O’Carroll, director of Amnesty Tech, likened the new tool to the Metropolitan Police’s “Gangs Matrix”, which is being overhauled following an investigation by the Information Commissioner, after it was accused of being systemically racist and unfit for purpose.

“These kinds of systems are premised on the idea that police intelligence data can be a neat, objective measure of how likely someone is to offend or re-offend,” she said. “In practice, the story is very different … At the same time, the algorithm gives these systems a veneer of being more scientific, more objective, even as they become harder to interrogate.”

The MoJ refused to specify which intelligence systems the new tool used, claiming this information “would be likely to prejudice the maintenance of security and good order”. Documents obtained by the Bureau suggested that the tool’s data sources would soon be expanded, stating that prison “staff welcomed the additional information that the tool provides, but were keen to see further integration of the process into existing prison IT systems”.

The new system was first announced by the former justice secretary David Gauke in July last year. He claimed the tool would provide a better assessment of whether offenders were involved in organised crime.

He said: “This new £1 million investment will allow us to extend the use of digital categorisation to the rest of the prison estate within 12 months, so that all offenders will be sent to appropriate prisons.” Since then, the MoJ has given little public indication of progress, although a spokesman told the Bureau the trial had been expanded to include eight prisons.

The new system is part of a broader austerity-era pattern, in which private contractors team up with government departments to streamline services by harnessing the power of data. A report by the Bureau in March revealed both the scope and scale of recent investments in data-driven systems, along with systemic issues of transparency and accountability around how they are purchased and developed.

The digital prisoner categorisation tool was first developed by Deloitte, which won a tender in December 2017 to “identify possible methods to automate the categorisation and allocation of prisoners decision-making process, and improve data sharing across the criminal justice system”. The goal included “developing a more data rich process … aligned to the MoJ objective of becoming a more data-driven department.”

The MoJ advertised a follow-on contract in November 2018, but cancelled it after deciding to continue developing the tool in-house.

In March the department told the Bureau that it was testing a second version of the tool, with the aim of rolling it out more widely later this year. But this month it declined to comment on how far this had progressed, saying merely it was still being tested.

The lack of transparency has caused alarm. Peter Dawson, director of the Prison Reform Trust, said: “The categorisation system is certainly in need of an overhaul, and there are some sensible aspects to what has been trialled. But the risks are significant, especially around race.

“Given what we know about entrenched discrimination in the criminal justice system even before someone goes to prison, it is more likely than not that a categorisation system relying on confidential intelligence will entrench or even exacerbate that unfairness. So it is crucial that the prison service is completely transparent about every stage of its evaluation.”

The MoJ emphasised that the new tool “is intended to support categorisation decisions, rather than make them” and that staff would be expected to exercise their judgement when acting on the results provided by the algorithm. The pilot report found that “in 20 cases staff have, based on other evidence and sources of information, rejected the provisional security category suggested by the tool”.

In some cases the pilot found that prisoners who could be managed in open prisons were being recategorised upwards to closed C-category prisons. On review, the MoJ determined that this was unnecessary and would be revised in the next version of the tool.

The report emphasised that “the audit capabilities enabled by the tool will improve our ability to promote equality of outcome”.

Clash of commitments

The government is grappling with two competing commitments: tackling systemic problems in the justice system and cracking down on disorder in prisons. In one of his first acts as prime minister, Boris Johnson reaffirmed David Gauke’s pledges in August, ordering a review of sentencing and restating the intention to create a further 10,000 prison places.

Simultaneously, however, the MoJ is struggling to address the results of the Lammy review, which David Lidington, Gauke’s predecessor as justice secretary, admitted in December 2017 offered a “sober analysis of discrepancies in how people of different backgrounds experience the criminal justice system”.

At that time, the government committed to adopting the mantra of “explain or change” — understanding and tackling disparities across government institutions.

Implementation of this principle has not been straightforward, however. When the justice select committee reviewed progress on Lammy’s recommendations in March, it highlighted the persistent and ingrained nature of race inequalities in the justice system and the lack of concrete outcomes to address them. Campaigners working in the criminal justice system questioned the extent to which the MoJ was addressing Lammy’s findings in its new initiatives.

The findings around the digital categorisation system “show the importance of all new and future Ministry of Justice policies being properly equality impact assessed to ensure they do not further exacerbate the alarming levels of disparity in the criminal justice system,” said Jess Mullen, head of policy and communications for Clinks, the infrastructure organisation for the voluntary sector in criminal justice.

Simon Creighton, a partner at London law firm Bhatt Murphy, described categorisation as “absolutely critical” for prisoners, but warned that cuts to legal aid had “effectively removed legal oversight” from the process. He said that while legal challenges to categorisation “used to be quite routine” they were now only carried out in a “tiny minority of cases sufficiently egregious to warrant judicial review”.

“The effect is that decisions that might be on the wrong side of the balance are not challenged,” he said. “If soft information is added to the equation, particularly undisclosed and unchallenged ‘intelligence’ material, there is a real risk that decisions will be unreliable and unfair.”

The MoJ said that prisoners could access the justification for all decisions made and could appeal their categorisation. It emphasised that one black prisoner included in the assessment had been recategorised downwards.

A spokesman added: “The new tool will actually give us data we don’t currently have that will help us better identify and rectify inequalities, as well as using evidence more effectively to keep the public safe.”

In his 2017 report, Lammy warned that as technology evolved, “the nature of scrutiny will need to evolve too”.

“New decision-making tools, such as algorithms, are likely to be used more and more in the coming years – for example, to assess the risk individuals pose to others,” he wrote. “If and when this happens, the criminal justice system will need to find new ways to deliver transparent decision-making.”

Crofton Black works for the Bureau of Investigative Journalism. In the past year, concerns around transparency and accountability in data-driven systems have steadily grown. The Bureau has launched a new project to investigate how data systems are being used in the UK and elsewhere, and the impact that has on ordinary people’s lives.
