Bias in Facial Recognition and Surveillance
Impacts on racialized groups, gender minorities, and privacy.

SUMMARY - Bias in Facial Recognition and Surveillance

A landmark study reveals that facial recognition systems from major technology companies achieve near-perfect accuracy on light-skinned male faces, while error rates for dark-skinned women reach 35 percent. A Black man in Detroit spends 30 hours in jail after facial recognition software misidentifies him as a shoplifting suspect, his face apparently interchangeable with another Black man's in the eyes of the algorithm.


RIPPLE

This thread documents how changes to Bias in Facial Recognition and Surveillance may affect other areas of Canadian civic life. Share your knowledge: What happens downstream when this topic changes? What industries, communities, services, or systems feel the impact?

Guidelines:
- Describe indirect or non-obvious connections
- Explain the causal chain (A leads to B because...)
- Real-world examples strengthen your contribution

Comments are ranked by community votes. Well-supported causal relationships inform our simulation and planning tools.