Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
Technology can transgress all kinds of legal frameworks, from privacy protections to anti-discrimination law.
Crime data reflect only the crimes that police detect and record – not all the crimes that occur. Decisions based on crime data are therefore necessarily biased and incompletely informed.
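A toy simulation can make this bias concrete. All numbers below are assumed for illustration: two districts have the same true crime rate, but crimes are recorded more often where police patrol, and patrols follow the records. The arbitrary starting disparity then compounds into what looks like a real difference in crime.

```python
import random

random.seed(0)

# Illustrative sketch (assumed parameters): equal true crime rates,
# unequal detection, and patrols allocated by past records.
TRUE_RATE = 0.3          # identical underlying crime rate in both districts
DETECT_PATROLLED = 0.9   # chance a crime is recorded in the patrolled district
DETECT_OTHER = 0.2       # chance a crime is recorded elsewhere

records = [5, 4]         # slightly uneven historical records (districts A, B)
for day in range(1000):
    patrolled = 0 if records[0] >= records[1] else 1  # patrol the "hotter" district
    for d in (0, 1):
        if random.random() < TRUE_RATE:               # a crime actually occurs...
            detect = DETECT_PATROLLED if d == patrolled else DETECT_OTHER
            if random.random() < detect:              # ...but is only sometimes recorded
                records[d] += 1

print(records)  # district A's record far exceeds district B's, despite equal true rates
```

The feedback loop, not any difference in behavior, produces the gap in the data: the patrolled district records roughly 90% of its crimes while the other records about 20%, so the records diverge even though both districts offend at the same rate.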
For 50 years, we have worked to make U.S. police more diverse and less intrusive. Why haven't we made more progress?
Preventing crime before it happens, while saving resources, sounds like a great use of big data. But these calculated probabilities raise big questions about civil liberties.
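Simple base-rate arithmetic shows why those probabilities are troubling. The numbers below are assumed for illustration: even a model that catches 90% of true positives mostly flags innocent people when the predicted event is rare.

```python
# Base-rate arithmetic (all figures assumed for illustration):
population = 100_000
base_rate = 0.001            # 1 in 1,000 people will actually offend
sensitivity = 0.90           # model catches 90% of future offenders
false_positive_rate = 0.05   # and wrongly flags 5% of everyone else

true_pos = population * base_rate * sensitivity                  # 90 correct flags
false_pos = population * (1 - base_rate) * false_positive_rate   # 4,995 wrong flags
precision = true_pos / (true_pos + false_pos)

print(f"{false_pos:.0f} innocent people flagged; precision = {precision:.1%}")
```

Under these assumptions, fewer than 2% of the people the model flags would ever have offended – the other 98% bear the civil-liberties cost of being labeled a future criminal.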