Welfare
29 September 2020

Poverty by algorithm: The Universal Credit design flaw that leaves people penniless

After the A-level results debacle, another flawed government calculation is now creating misery for welfare claimants.

A flaw in the Universal Credit welfare system is creating misery for claimants. Rather than providing a safety net, the way the benefits programme is designed and automated is pushing people into poverty.

These are the findings of a report entitled “Automated Hardship: How the Tech-Driven Overhaul of the UK’s Benefits System Worsens Poverty” by the global NGO Human Rights Watch, which has previously revealed how the automation of welfare harms citizens’ rights in Australia, India, the Netherlands, Sweden and the US. It finds that a poorly designed algorithm at the heart of Universal Credit is causing claimants to go hungry and fall into debt, and sometimes punishing those in irregular work.

Universal Credit is a means-tested benefit based on a claimant’s earnings over the past calendar month, adjusted each month as those earnings change. This means that irregular shifts, the frequency and dates of payments, and fluctuations in a person’s earnings can be misinterpreted by the system – leaving some with very little income the following month because the algorithm has overestimated their earnings.

This was the case for Penny Walters, 54, who lives in Newcastle. A couple of years ago, the charity funding for her job managing a community centre in a local church ran out. Without work, she had to apply for Universal Credit for the first time, while looking for other jobs that would help top up her income.
She did agency work as a cleaner and shifts at Asda, but was punished by the Universal Credit algorithm when it registered those hours: she worked four weeks of Christmas shifts for the supermarket last December, was paid at the end of the month, received her Universal Credit in January, and then received nothing at all in February.

Because Walters had no work in February, she had zero income that month – already a tough time after the Christmas period. “I didn’t have the money to do anything,” she recalls. “I had to ask my 29-year-old daughter for the bus fare. I’m 54. I should not have to ask my child for money to do things. I’m the adult. I’m supposed to be providing for her, not the other way round. It just demoralises you.”

Walters now volunteers two days a week cooking meals for vulnerable people and residents of sheltered housing, and no longer does paid work in addition to receiving Universal Credit. “You’re always a month behind because of the way it’s backdated, the way they work it out,” she tells me. “It does disincentivise you from working. Would you like to go to work for peanuts?”

This failing is reminiscent of the flawed algorithm used by Ofqual to calculate A-level pupils’ grades this year in the absence of exam results. Its conclusions were abandoned when the government U-turned, amid criticism of automating outcomes at serious human cost.

Although there are other ways to calculate benefits, including assessing income over shorter periods, or averaging earnings over longer stretches to smooth out monthly fluctuations, the Department for Work and Pensions (DWP) continues to use this flawed algorithm. Indeed, it is the backbone of Universal Credit, which was designed to mimic the world of work on a monthly salary – with benefits paid monthly, and calculated in arrears.
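The mechanism behind Walters’ experience – a monthly means test that looks only at earnings landing inside a single calendar-month assessment period – can be sketched in a few lines. The standard allowance and taper rate below are hypothetical illustrative figures, not the DWP’s actual Universal Credit parameters.

```python
# Minimal sketch of a monthly means test with a taper. The allowance and
# taper rate are hypothetical, illustrative figures -- not the DWP's
# actual Universal Credit parameters.

STANDARD_ALLOWANCE = 410.0  # hypothetical monthly entitlement in GBP
TAPER_RATE = 0.63           # hypothetical: award shrinks 63p per GBP earned

def monthly_award(earnings_in_period: float) -> float:
    """Award for one assessment period, based only on the earnings that
    happened to land inside that calendar month."""
    return max(0.0, STANDARD_ALLOWANCE - earnings_in_period * TAPER_RATE)

# A claimant earning a steady 600 GBP a month whose pay date shifts, so
# that two paycheques land in a single assessment period:
for month, earned in [("Nov", 600.0), ("Dec", 1200.0), ("Jan", 0.0)]:
    print(month, round(monthly_award(earned), 2))
# The doubled December figure wipes out that period's award entirely,
# even though the claimant's actual income never changed.
```

Because the system assesses each calendar month in isolation, a pay date shifting by a few days can double the earnings counted in one period and zero the next – the pattern the report describes.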
This is despite a court ruling in June that found the algorithm irrational and unlawful – siding with four single mothers who were losing out on benefits because of when in the month their salaries were paid. In response to the ruling, the DWP said it was “carefully considering the court’s decision”, but no changes to the system have since been made.

Ultimately, as with the A-level model, which disadvantaged poorer pupils by penalising larger cohorts and schools with weaker records, the Universal Credit algorithm is a political decision. The point of reflecting the world of work was to encourage claimants to budget as if they were salaried workers. The idea was to encourage personal responsibility over finances – overlooking the fact that more than a third of Universal Credit claimants are already in work, and that many (particularly on low wages) are not even paid monthly: they are on zero-hours and gig economy contracts, or in part-time work, shift work and irregular hours.

Assessing income weekly or fortnightly would carry the risk of overestimating what a claimant is owed – but accidental overpayments could be far less damaging than a sudden reduction in income for an entire month.

Human Rights Watch is calling for a “comprehensive redesign of how the government calculates the social security benefit”.

“The government has put a flawed algorithm in charge of deciding how much money to give people to pay rent and feed their families,” said Amos Toh, senior artificial intelligence and human rights researcher at the organisation. “The government’s bid to automate the benefits system – no matter the human cost – is pushing people to the brink of poverty.”

A DWP spokesperson said: “Universal Credit was designed to mirror the world of work, where the majority of employees receive wages monthly, and to help people get back into employment as soon as possible.
“The monthly assessment and adjustment period reflects this, and ensures that if a claimant’s income falls, they won’t have to wait several months for a rise in their Universal Credit.

“Urgent advances are available for those who need them, and we have taken steps to make debt repayments easier.”

Anoosh Chakelian is the New Statesman’s Britain editor. She co-hosts the New Statesman podcast, discussing the latest in UK politics.