Human rights council warns algorithms in schools can be biased
The algorithms in the digital systems which schools use to analyse children’s performance and adjust language and arithmetic lessons to “their level” can lead to unequal treatment, the Dutch human rights council says in a new report.
Although digital systems can help with children’s development, bias can also play a part in education. If the systems are not properly tested, preconceptions about children’s potential may be reinforced, leading to discrimination and inequality, the researchers said.
For example, the report said, a system may have problems assessing the potential of a child with ADHD or dyslexia, and a child labelled “weak” by the system may keep that label even after showing steady improvement.
Schools should thoroughly assess whether such systems make a positive contribution to education before using them, the council said. “Schools,” the council said, “must make demands of software suppliers when it comes to equal treatment, privacy, autonomy and transparency.”
At the same time, the council said it recognised that schools have many different tasks and that staff shortages are high. “It is sometimes difficult for teachers and school heads to be critical of the resources they use. After all, people tend to believe what a computer says.”
There have been numerous cases of biased algorithms affecting people in the Netherlands in recent years.
An algorithm used by student finance service Duo did “indirectly discriminate” against youngsters with an ethnic minority background via its fraud strategy, advisory group PwC said in a report for the education ministry in March.
The childcare benefit scandal, in which tens of thousands of Dutch parents were wrongly accused of fraud by the Dutch tax office and unjustly ordered to pay back thousands of euros in childcare benefit, was also partly down to an algorithm.
In August 2023, the police said they would “immediately” stop using an algorithm to predict whether someone will show violent behaviour, following an investigation by the website Follow the Money.
And in May last year, it emerged that the foreign affairs ministry had been using a profiling system since 2015 to analyse the risk posed by people applying for short-stay visas for the Schengen area.