So much of our lives happens online.
When the global pandemic was declared, for many of us, the shift from in-person life to digital life was almost total. We attended classes online, went on virtual dates, had Zoom parties with our friends and video consultations with our doctors.
Social media took on an outsized role in how we kept in touch with loved ones and shared our reactions to the news. For many of us, our digital footprint has exploded in size — there is more information than ever about our health, what we think, where we live, how we look and who we love available online.
One could argue that artificial intelligence technology has an upside: it can, for example, help track and predict climate change. But it also has many downsides, and even its potential benefits can carry negative implications.
Although we sometimes opt in to share personal information in exchange for the convenience of apps and services, there are other times when our information is shared — and used — without our permission, and often without our knowledge. For example, in 2020, Clearview AI was effectively forced out of Canada after compiling a database of billions of facial images, including those of Canadians, which it sold to police departments and private companies.
Once analysts gain access to our private data, they can use that information to influence our behaviour and choices. And if you’re marginalized in some way, the consequences can be even worse.
Experts have been warning about the dangers of data collection for a while now, especially for Black, Indigenous and racialized people. This year, Amnesty International called for a ban on facial recognition technology, describing it as a form of mass surveillance that amplifies racist policing and threatens the right to protest.
What can we do to resist this creeping culture of surveillance?
Our guests today on this episode of Don’t Call Me Resilient have some ideas. They are experts in discrimination and technology. Yuan Stevens is the policy lead on technology, cybersecurity and democracy at the Ryerson Leadership Lab and a research fellow at the Centre for Media, Technology and Democracy at McGill University’s Max Bell School of Public Policy. Her work looks at technology’s impact on vulnerable populations. Wendy Hui Kyong Chun is the Canada 150 Research Chair in New Media at Simon Fraser University, where she also heads up the Digital Democracies Institute. She’s the author of several books; her most recent is Discriminating Data.
For a full transcript of this episode of Don’t Call Me Resilient, go here.
Additional reading
Each week, we highlight articles or books that drill down into the topics we discuss in the episode. This week:
How police surveillance technologies act as tools of white supremacy
AI technologies — like police facial recognition — discriminate against people of colour
Collecting race-based data during coronavirus pandemic may fuel dangerous prejudices
Dark Matters: On the Surveillance of Blackness
Algorithmic Policing in Canada Explained
Follow and listen
You can listen or subscribe on Apple Podcasts, Google Podcasts, Spotify or wherever you listen to your favourite podcasts. We’d love to hear from you, including any ideas for future episodes. Join The Conversation on Twitter, Facebook and Instagram and use #DontCallMeResilient.
Don’t Call Me Resilient is a production of The Conversation Canada. This podcast was produced with a grant for Journalism Innovation from the Social Sciences and Humanities Research Council of Canada. The series is produced and hosted by Vinita Srivastava. Our producer is Susana Ferreira. Our associate producer is Ibrahim Daair. Reza Dahya is our sound producer. Our consulting producer is Jennifer Moroz. Lisa Varano is our audience development editor and Scott White is the CEO of The Conversation Canada. Zaki Ibrahim wrote and performed the music we use on the pod. The track is called Something in the Water.