Global Case Studies of Algorithmic Harm
Real-world examples of discrimination through tech.

SUMMARY - Global Case Studies of Algorithmic Harm

A Black man in Detroit is wrongfully arrested after facial recognition software misidentifies him, spending 30 hours in jail for a crime he did not commit while his family wonders where he has gone. A woman in Austria receives a lower employability score from a government algorithm because she is female and over 30, reducing her access to job training programs. Thousands of Dutch families are falsely accused of childcare benefit fraud by an algorithmic system that disproportionately targeted immigrants and dual nationals, leading to financial ruin, family separations, and suicides.


RIPPLE

This thread documents how changes to Global Case Studies of Algorithmic Harm may affect other areas of Canadian civic life. Share your knowledge: What happens downstream when this topic changes? What industries, communities, services, or systems feel the impact?

Guidelines:
- Describe indirect or non-obvious connections
- Explain the causal chain (A leads to B because...)
- Real-world examples strengthen your contribution

Comments are ranked by community votes. Well-supported causal relationships inform our simulation and planning tools.