Another variable, “presumed partner,” is used to determine whether someone has a concealed relationship, since single people receive more benefits. This involves searching data for connections between welfare recipients and other Danish residents, such as whether they have lived at the same address or raised children together.
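The article describes this variable only at a high level. As a rough illustration of the kind of heuristic described (all names, record shapes, and thresholds here are assumptions for the sketch, not the actual Danish system), a co-residence/co-parenting check might look like:

```python
from dataclasses import dataclass, field

# Hypothetical record shape; the article does not describe the
# real data model used by the welfare authority.
@dataclass
class Resident:
    person_id: str
    address_history: set = field(default_factory=set)  # past registered addresses (assumed)
    children: set = field(default_factory=set)         # IDs of children raised (assumed)

def presumed_partner(claimant: Resident, other: Resident) -> bool:
    """Illustrative heuristic only: flag a possible concealed partner
    if two residents have ever shared a registered address or raised
    a child together, as the article describes."""
    shared_address = bool(claimant.address_history & other.address_history)
    shared_children = bool(claimant.children & other.children)
    return shared_address or shared_children
```

In a real system a flag like this would only be one input among many, surfaced for a caseworker to review rather than acted on automatically.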
“The ideology that underlies these algorithmic systems, and [the] very intrusive surveillance and monitoring of people who receive welfare, is a deep suspicion of the poor,” says Victoria Adelmant, director of the Digital Welfare and Human Rights Project.
For all the complexity of machine learning models, and all the data amassed and processed, there is still a person with a decision to make at the hard end of fraud controls. This is the fail-safe, Jacobsen argues, but it’s also the first place where these systems collide with reality.
Morten Bruun Jonassen is one of these fail-safes. A former police officer, he leads Copenhagen’s control team, a group of officials tasked with ensuring that the city’s residents are registered at the correct address and receive the correct benefits payments. He’s been working for the city’s social services department for 14 years, long enough to remember a time before algorithms assumed such importance—and long enough to have observed the change of tone in the national conversation on welfare.
While the war on welfare fraud remains politically popular in Denmark, Jonassen says only a “very small” number of the cases he encounters involve actual fraud. For all the investment in it, the data mining unit is not his best source of leads, and cases flagged by Jacobsen’s system make up just 13 percent of the cases his team investigates—half the national average. Since 2018, Jonassen and his unit have softened their approach compared to other units in Denmark, which tend to be tougher on fraud, he says. In a case documented in 2019 by DR, Denmark’s public broadcaster, a welfare recipient said that investigators had trawled her social media to see whether she was in a relationship before wrongfully accusing her of welfare fraud.
While he gives credit to Jacobsen’s data mining unit for trying to improve its algorithms, Jonassen has yet to see significant improvement for the cases he handles. “Basically, it’s not been better,” he says. In a 2022 survey of Denmark’s towns and cities conducted by the unit, officials scored their satisfaction with it, on average, between 4 and 5 out of 7.
Jonassen says people claiming benefits should get what they’re due—no more, no less. Despite the scale of Jacobsen’s automated bureaucracy, he opens more investigations based on tips from schools and social workers than on machine-flagged cases. Crucially, he says, he works hard to understand the people claiming benefits and the difficult situations they find themselves in. “If you look at statistics and just look at the screen,” he says, “you don’t see that there are people behind it.”
Additional reporting by Daniel Howden, Soizic Penicaud, Pablo Jiménez Arandia, and Htet Aung. Reporting was supported by the Pulitzer Center’s AI Accountability Fellowship and the Center for Artistic Inquiry and Reporting.