Amnesty warns against algorithms in social assistance

Human rights organization Amnesty International warns against the use of risk profiles and algorithms in assessing entitlement to social benefits. According to Amnesty, socially disadvantaged people can lose their social support because of incorrect interpretation of data.

Amnesty investigated the workings of the Social Card system in Serbia. This social assessment methodology was introduced there last year with the support of the World Bank. The research showed that the Roma community and people with disabilities in particular are seriously affected by the use of this algorithm.

According to the report, one drawback is that caseworkers spend more time checking and entering data than talking to the client.

The report raises questions about the global use of algorithms in decision-making in social services. Amnesty emphasizes the need for transparency and ethics in such systems.

The Amnesty investigation describes, among other things, the case of a Roma mother whose social benefits were withdrawn because a charity had helped pay her daughter's funeral costs. In the Netherlands there have been cases of social services stopping or reducing benefits because a generous donor or a family member had given 'a bag of groceries'.

The research shows that many people lose their way when they end up in computer-controlled procedures, as happened previously in the Netherlands with the childcare benefits scandal. The algorithm appears to make the socially disadvantaged even more vulnerable, instead of supporting them as originally intended.

The human rights organization calls on authorities to review the system and ensure that it is fair and just for all citizens, regardless of their background or disabilities.