Welfare surveillance system violates human rights, Dutch court rules

Government told to halt use of AI to detect fraud in decision hailed by privacy campaigners

Jon Henley and Robert Booth

Wed 5 Feb 2020 13.18 GMT. Last modified on Wed 5 Feb 2020 18.30 GMT

A Dutch court has ordered the immediate halt of an automated surveillance system for detecting welfare fraud because it violates human rights, in a judgment likely to resonate well beyond the Netherlands.

The case was seen as an important legal challenge to the controversial but growing use by governments around the world of artificial intelligence (AI) and risk modelling in administering welfare benefits and other core services.

Campaigners say such “digital welfare states” – developed often without consultation, and operated secretively and without adequate oversight – amount to spying on the poor, breaching privacy and human rights norms and unfairly penalising the most vulnerable.

In the UK, where the government is accelerating the development of robots in the benefits system, the chairman of the House of Commons work and pensions select committee, Stephen Timms, said: “This ruling by the Dutch courts demonstrates that parliaments ought to look very closely at the ways in which governments use technology in the social security system, to protect the rights of their citizens.”

The UN special rapporteur on extreme poverty and human rights, Philip Alston, applauded the verdict and said it was “a clear victory for all those who are justifiably concerned about the serious threats digital welfare systems pose for human rights”.

The decision “sets a strong legal precedent for other courts to follow”, he added. “This is one of the first times a court anywhere has stopped the use of digital technologies and abundant digital information by welfare authorities on human rights grounds.”

The verdict will be watched closely by welfare rights campaigners in the UK, where the Department for Work and Pensions (DWP) is engaged in a digitisation drive that vulnerable claimants fear could plunge them further into hunger and debt.

A Guardian investigation in October found the DWP had increased spending to about £8m a year on a specialist “intelligent automation garage” where computer scientists were developing more than 100 welfare robots, deep learning and intelligent automation for use in the welfare system.

The Dutch government’s risk indication system (SyRI) is a risk calculation model developed over the past decade by the social affairs and employment ministry to predict the likelihood of an individual committing benefit or tax fraud or violating labour laws.

Deployed primarily in low-income neighbourhoods, it gathers government data previously held in separate silos, such as employment, personal debt and benefit records, and education and housing histories, then analyses it using a secret algorithm to identify which individuals might be at higher risk of committing benefit fraud.

A broad coalition of privacy and welfare rights groups, backed by the largest Dutch trade union, argued that poor neighbourhoods and their inhabitants were being spied on digitally without any concrete suspicion of individual wrongdoing. SyRI was disproportionately targeting poorer citizens, they said, violating human rights norms.

The court ruled that the SyRI legislation contained insufficient safeguards against privacy intrusions and criticised a “serious lack of transparency” about how it worked. It concluded that, in the absence of more information, the system’s targeting of poor neighbourhoods may amount to discrimination on the basis of socioeconomic or migrant status.

The court added that the system did not strike the “fair balance” required by the European convention on human rights between its objectives, namely preventing and combating fraud in the interest of economic wellbeing, and the violation of privacy its use entailed. It declared the legislation unlawful as a result. The Dutch government can appeal against the decision.

Christiaan van Veen, director of the digital welfare state and human rights project at New York University School of Law, said it was “important to underline that SyRI is not a unique system; many other governments are experimenting with automated decision-making in the welfare state”.

Van Veen cited Australia and the UK as countries where such concerns were particularly acute. “This strong ruling will set a strong precedent globally that will encourage activists in other countries to challenge their governments,” he said.

Alston predicted the judgment would be “a wake-up call for politicians and others, not just in the Netherlands”. The special rapporteur presented a report to the UN general assembly in October on the emergence of the “digital welfare state” in countries around the globe, warning of the need “to alter course significantly and rapidly to avoid stumbling, zombie-like, into a digital welfare dystopia”.

In the UK, as well as contracts with the outsourcing multinationals IBM, Tata Consultancy and Capgemini, the DWP is also working with UiPath, a New York-based company co-founded by Daniel Dines, the world’s first “bot billionaire”, who last month said: “I want a robot for every person.”

His software is being deployed in an effort to introduce machine learning to the checking of benefit claims, meaning welfare computers could autonomously learn and alter the way they make decisions with minimal human intervention.
