SUMMARY - How Data Can Help - and Harm - Community Safety Programs
A city launches a data-driven crime prevention initiative: it maps where crimes occur, identifies hot spots, and directs resources to high-crime areas. The data shows crime concentrated in neighbourhoods marked by concentrated poverty, racialization, and historic disinvestment - but the intervention is more policing, not more investment, and data that could have revealed root causes instead directs enforcement. A social program uses predictive analytics to identify youth at high risk of offending; the algorithm flags young people who are poor, who have family members involved in crime, who live in certain neighbourhoods - essentially finding that marginalization predicts marginalization, then targeting the marginalized for intervention. A community organization gains access to local crime data and uses it to advocate for prevention resources in high-crime areas, to hold police accountable for response patterns, and to document disparities that officials deny - the same data serving different purposes depending on who wields it. A neighbourhood resident tracks calls for service, response times, and outcomes in her community and finds that her neighbourhood waits longer and receives less than wealthier areas; her citizen data project reveals inequities that official data obscured. Data can illuminate or obscure, empower communities or target them, drive prevention or justify enforcement. How data is collected, who analyzes it, and what purposes it serves shape whether it helps or harms community safety efforts.
The Case for Data-Driven Prevention
Advocates for data-driven approaches argue that systematic analysis of crime patterns enables smarter resource allocation, that evidence should guide intervention, and that data can reveal what intuition misses.
Data reveals patterns that inform strategy. Where crime concentrates, when it occurs, what precedes it - these patterns become visible through data analysis. Understanding patterns enables targeted intervention. Resources directed by data may be more effective than resources directed by intuition or politics.
Data enables accountability. When outcomes are measured, programs can be evaluated. What works can be identified and expanded; what fails can be discontinued. Data-driven accountability improves performance over time.
Data can identify root causes. Analysis can reveal correlations between crime and unemployment, housing instability, lack of services, and other factors that suggest prevention strategies. Data that illuminates root causes can guide investment in addressing them.
From this perspective, data-driven prevention requires: quality data collection; sophisticated analysis; translation of findings into action; and measurement of outcomes to enable continuous improvement.
The Case for Caution About Data
Others argue that data in criminal justice contexts has consistently been used to target marginalized communities, that predictive tools encode existing biases, and that data-driven approaches may cause more harm than they prevent.
Data reflects existing biases. Crime data reflects policing patterns - where police look, they find crime. Using this data to direct resources creates feedback loops where over-policed areas receive more policing because data shows more crime because they are over-policed. Data-driven approaches may amplify rather than correct bias.
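The feedback loop described above can be made concrete with a toy simulation. All numbers here are hypothetical illustrations, not empirical rates: two areas have the same true incident rate, recorded crime scales with patrol presence, and each year's extra patrol unit goes wherever recorded crime looks higher.

```python
# Toy feedback-loop sketch (hypothetical numbers, purely illustrative):
# both areas have the SAME true incident rate, but "Area A" starts with
# one more patrol unit. Because recorded crime depends on where police
# look, the small initial imbalance compounds year after year.

TRUE_RATE = 100            # identical underlying incidents per year in each area
DETECTION_PER_PATROL = 0.02  # assumed fraction of incidents recorded per patrol unit

patrols = {"Area A": 11, "Area B": 10}  # slight initial imbalance

for year in range(1, 11):
    # Recorded incidents scale with patrol presence, not with true rates.
    recorded = {area: TRUE_RATE * DETECTION_PER_PATROL * units
                for area, units in patrols.items()}
    # Greedy reallocation: the next patrol unit goes to whichever area
    # shows more recorded crime, closing the loop.
    hotter = max(recorded, key=recorded.get)
    patrols[hotter] += 1

print(patrols)  # Area A absorbs every new unit despite identical true rates
```

Under these assumptions, Area A ends with roughly twice Area B's patrol allocation even though nothing about the areas' actual crime differs - the divergence is produced entirely by the measurement process.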
Predictive tools predict marginalization. Algorithms that predict crime risk often use proxies for race and poverty. Flagging youth as high-risk based on neighbourhood and family characteristics stigmatizes and targets those already marginalized. Prediction becomes self-fulfilling prophecy.
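How proxies work can be sketched in a few lines. The weights and feature names below are invented for illustration: the model never references race or income directly, yet its inputs - postcode, family justice-system contact - stand in for them, so two youths with identical behaviour receive very different scores.

```python
# Illustrative sketch of proxy-driven risk scoring (hypothetical weights
# and features, not any real deployed model). No protected attribute
# appears explicitly, but each feature correlates with marginalization.

RISK_WEIGHTS = {
    "lives_in_flagged_postcode": 2.0,   # proxy for neighbourhood/poverty
    "family_member_with_record": 1.5,   # proxy for prior system contact
    "prior_school_suspension": 1.0,
}

def risk_score(youth: dict) -> float:
    """Sum the weights of whichever features are present for this youth."""
    return sum(w for feature, w in RISK_WEIGHTS.items() if youth.get(feature))

# Two youths with identical behaviour; only circumstances differ.
youth_marginalized = {"lives_in_flagged_postcode": True,
                      "family_member_with_record": True}
youth_affluent = {}  # same conduct, different postcode and family

print(risk_score(youth_marginalized))  # 3.5: flagged "high risk"
print(risk_score(youth_affluent))      # 0.0: invisible to the model
```

The sketch shows why removing protected attributes from a model does not remove bias: the remaining features can encode the same information.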
Data serves power. Who collects data, who analyzes it, and who acts on findings determines whose interests data serves. Data in the hands of institutions has historically been used against communities rather than for them. Community-controlled data may serve different purposes than institutionally controlled data.
From this perspective, data use requires: community control over how data is used; scrutiny of what data encodes; rejection of predictive tools that target marginalized people; and recognition that data is never neutral.
The Predictive Policing Question
Algorithms that predict where crime will occur or who will commit it raise profound concerns.
From one view, predictive tools can direct resources efficiently. If data can identify high-risk times and places, preventive presence can be directed accordingly. Prediction is simply sophisticated resource allocation.
From another view, predictive policing is pre-crime enforcement that targets people based on who they are and where they live rather than what they have done. Constitutional protections exist for good reason; prediction circumvents them. The dystopian implications of predicting crime before it happens should give pause.
Whether prediction is efficient allocation or pre-crime targeting shapes its acceptability.
The Community Data Question
Who should control data about communities?
From one perspective, communities should control their own data. Data collected about a community belongs to that community. Community members should determine how it is used, who has access, and what purposes it serves. Community data sovereignty is essential.
From another perspective, data about public safety is public data. Restricting access prevents accountability, enables manipulation, and limits legitimate research. Public data should be publicly accessible with appropriate privacy protections.
Who controls data shapes whose interests it serves.
The Privacy Question
Data collection for prevention purposes raises privacy concerns.
From one view, aggregated, anonymized data can inform prevention without compromising privacy. Crime patterns at neighbourhood level, service gaps, and resource needs can be identified without individual-level tracking. Privacy and useful data are compatible.
From another view, individual-level data that drives intervention inevitably compromises privacy. When data identifies particular youth as high-risk, particular families as needing intervention, privacy is violated. Prevention that requires surveillance may cost more than it provides.
How privacy is balanced against prevention utility shapes what data approaches are acceptable.
The Question
When data shows crime concentrating where poverty concentrates, does that suggest more policing or more investment? When algorithms predict that marginalized youth are at risk, what have we learned that we did not already know - and what do we do with that knowledge? If data can reveal root causes, why is it mostly used to direct enforcement? When communities gain access to data and use it to demand equity, what changes? When the same data serves enforcement in some hands and liberation in others, what does that tell us about data itself? And when data-driven approaches reproduce the patterns they were meant to disrupt, what should we conclude about whether data can actually help?