Articles on Predictive policing

A CCTV camera sculpture in Toronto draws attention to the increasing surveillance in everyday life. Our guests discuss ways to resist this creeping culture. Lianhao Qu/Unsplash

Being Watched: How surveillance amplifies racist policing and threatens the right to protest — Don’t Call Me Resilient EP 10

Mass data collection and surveillance have become ubiquitous. For marginalized communities, the stakes of having their privacy violated are high.
A photo of artwork by Banksy in London comments on the power imbalance of surveillance technology. Guests on this episode discuss how AI and facial recognition have been flagged by civil rights leaders for their inherent racial bias. Niv Singer/Unsplash

Being Watched: How surveillance amplifies racist policing and threatens the right to protest — Don’t Call Me Resilient EP 10 transcript

Once analysts gain access to our private data, they can use that information to influence and alter our behaviour and choices. For people who are already marginalized, the consequences are worse.
Facial recognition algorithms are usually trained and tested on datasets dominated by white faces, leaving the technology less able to distinguish between racialized individuals. (Shutterstock)

AI technologies — like police facial recognition — discriminate against people of colour

Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
A vigil in Baton Rouge, Louisiana, in memory of Alton Sterling, who was shot dead by police. REUTERS/Jeffrey Dubinsky

Why is it so hard to improve American policing?

For 50 years, we have worked to make U.S. police more diverse and less intrusive. Why haven’t we made more progress?
