<h1>DOJ funding pipeline subsidizes questionable big data surveillance technologies</h1>
<figure><img src="https://images.theconversation.com/files/573845/original/file-20240206-28-bp34iu.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6720%2C4476&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Predictive policing aimed to identify crime hot spots and 'chronic' offenders but missed the mark.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/los-angeles-ca-lapd-captain-elizabeth-morales-speaks-during-news-photo/624080088">Patrick T. Fallon for The Washington Post via Getty Images</a></span></figcaption></figure>
<p>Predictive policing has been shown to be an ineffective and biased policing tool. Yet, the Department of Justice has been funding the crime surveillance and analysis technology for years and continues to do so despite criticism from researchers, privacy advocates and members of Congress.</p>
<p>Sen. Ron Wyden, D-Ore., and U.S. Rep. Yvette Clarke, D-N.Y., joined by five Democratic senators, called on Attorney General Merrick Garland to <a href="https://www.wyden.senate.gov/news/press-releases/wyden-and-clarke-press-justice-department-to-end-funding-for-flawed-predictive-policing-systems">halt funding</a> for <a href="https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained">predictive policing technologies</a> in a letter issued Jan. 29, 2024. Predictive policing involves analyzing crime data in an attempt to identify where and when crimes are likely to occur and who is likely to commit them.</p>
<p>The <a href="https://www.wyden.senate.gov/imo/media/doc/letter_to_doj_predictive_policing_and_title_vi_1242024.pdf">request</a> came months after the Department of Justice <a href="https://gizmodo.com/justice-department-kept-few-records-on-predictive-polic-1848660323">failed to answer</a> basic questions about how predictive policing funds were being used and who was being harmed by arguably <a href="https://doi.org/10.1007/s12103-020-09557-x">racially discriminatory algorithms</a> that have <a href="https://doi.org/10.3390/socsci10060234">never been proven to work as intended</a>. The Department of Justice <a href="https://gizmodo.com/justice-department-kept-few-records-on-predictive-polic-1848660323">did not have answers</a> to who was using the technology, how it was being evaluated and which communities were affected.</p>
<p>While focused on predictive policing, the senators’ demand raises what I, a law professor who <a href="https://scholar.google.com/citations?hl=en&user=5ZX7SbEAAAAJ&view_op=list_works&sortby=pubdate">studies big data surveillance</a>, see as a bigger issue: What is the Department of Justice’s role in funding new surveillance technologies? The answer is surprising and reveals an entire ecosystem of how technology companies, police departments and academics benefit from the flow of federal dollars.</p>
<h2>The money pipeline</h2>
<p>The <a href="https://nij.ojp.gov/">National Institute of Justice</a>, the DOJ’s research, development and evaluation arm, regularly provides seed money for grants and pilot projects to test out ideas like predictive policing. It was a National Institute of Justice grant that funded the first predictive policing <a href="https://www.ojp.gov/ncjrs/virtual-library/abstracts/predictive-policing-symposium-november-18-20-2009">conference in 2009</a> that launched the idea that past crime data could be run through an algorithm to <a href="https://digitalcommons.wcl.american.edu/facsch_lawrev/750/">predict future criminal risk</a>. The institute has <a href="https://nij.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&form_topic=&fiscal_year=&combine_awards=Predictive+Policing&awardee=&city=#awards-awards-list-block-jyhir1inpckhocqi">given US$10 million dollars</a> to predictive policing projects since 2009. </p>
<p>Because there was grant money available to test out new theories, academics and startup companies could afford to invest in <a href="https://www.npr.org/2013/07/26/205835674/can-software-that-predicts-crime-pass-constitutional-muster">new ideas</a>. Predictive policing was just an academic theory until there was cash to start testing it in various police departments. Suddenly, companies launched with the financial security that federal grants could pay their early bills. </p>
<p>National Institute of Justice-funded <a href="https://nij.ojp.gov/library/publications/risk-terrain-modeling-spatial-risk-assessment">research</a> often turns into for-profit companies. Police departments also benefit from getting money to buy the new technology without having to dip into their local budgets. This dynamic is one of the hidden drivers of police technology.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/WXnElg9alF8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">How predictive policing works – and the harm it can cause.</span></figcaption>
</figure>
<p>Once a new technology gets big enough, another DOJ entity, the <a href="https://bja.ojp.gov/">Bureau of Justice Assistance</a>, funds projects with direct financial grants. The bureau funded police departments to test one of the biggest place-based predictive policing technologies – <a href="https://bja.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&fiscal_year=&combine_awards=Predpol&awardee=&city=#awards-awards-list-block-gkgdpm1ooymuyukj">PredPol</a> – in its early years. The bureau has also funded the purchase of other <a href="https://bja.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&fiscal_year=&combine_awards=Predictive&awardee=&city=#awards-awards-list-block-gkgdpm1ooymuyukj">predictive technologies</a>.</p>
<p>The Bureau of Justice Assistance funded one of the most <a href="https://theintercept.com/2018/05/11/predictive-policing-surveillance-los-angeles/">infamous</a> person-based predictive policing <a href="https://bja.ojp.gov/sites/g/files/xyckuh186/files/media/document/losangelesspi.pdf">pilots in Los Angeles</a>, Operation LASER, which targeted “chronic offenders.” Both experiments – PredPol and LASER – failed to work as intended. The <a href="https://www.documentcloud.org/documents/5766472-BPC-19-0072#document/p32/a486274">Los Angeles Office of the Inspector General</a> identified the negative impact of the programs on the community – and the fact that the predictive theories did not work to reduce crime in any significant way.</p>
<p>As these DOJ entities’ practices indicate, federal money not only seeds but feeds the growth of new policing technologies. Since 2005, the Bureau of Justice Assistance has given <a href="https://bja.ojp.gov/doc/jag-program-fact-sheet.pdf">over $7.6 billion</a> of federal money to state, local and tribal law enforcement agencies for a host of projects. Some of that money has gone directly to new surveillance technologies. A quick skim through the <a href="https://bja.ojp.gov/funding/awards/list">public grants</a> shows approximately $3 million directed to <a href="https://bja.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&fiscal_year=&combine_awards=facial+recognition&awardee=&city=#awards-awards-list-block-gkgdpm1ooymuyukj">facial recognition</a>, $8 million for <a href="https://bja.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&fiscal_year=&combine_awards=shotspotter&awardee=&city=#awards-awards-list-block-gkgdpm1ooymuyukj">ShotSpotter</a> and $13 million to build and grow <a href="https://bja.ojp.gov/funding/awards/list?field_award_status_value=All&state=All&field_funding_type_value=All&field_served_nationally_value=All&fiscal_year=&combine_awards=RTCC&awardee=&city=#awards-awards-list-block-gkgdpm1ooymuyukj">real-time crime centers</a>. ShotSpotter (now rebranded as SoundThinking) is the leading brand of <a href="https://thehill.com/policy/technology/3558382-gunshot-detection-system-expanding-rapidly-in-us-despite-criticism/">gunshot detection technology</a>. Real-time crime centers combine security camera feeds and other data to <a href="https://www.wired.com/story/real-time-crime-centers-rtcc-us-police/">provide surveillance for a city</a>.</p>
<h2>The questions not asked</h2>
<p>None of this is necessarily nefarious. The Department of Justice is in the business of prosecution, so it is not surprising for it to fund prosecution tools. The <a href="https://nij.ojp.gov/topics/articles/brief-history-nij">National Institute of Justice</a> exists as a research body inside the Office of Justice Programs, so its role in helping to promote data-driven policing strategies is not inherently problematic. The <a href="https://bja.ojp.gov/about">Bureau of Justice Assistance</a> exists to assist local law enforcement through financial grants. The DOJ is feeding police surveillance power because it benefits law enforcement interests.</p>
<p>The problem, as indicated by Sen. Wyden’s letter, is that in subsidizing experimental surveillance technologies, the Department of Justice did not do basic risk assessment or racial justice evaluations before investing money in a new technological solution. As someone who has <a href="https://digitalcommons.wcl.american.edu/facsch_lawrev/749/">studied predictive policing</a> for over a decade, I can say that the questions asked by the senators were not asked in the pilot projects. </p>
<p>Basic questions of who would be affected, whether there could be a racially discriminatory impact, how it would change policing and whether it worked were not raised in any serious way. Worse, the focus was on deploying something new, not double-checking whether it worked. If you are going to seed and feed a potentially dangerous technology, you also have an obligation to weed it out once it turns out to be harming people.</p>
<p>Only now, after <a href="https://stoplapdspying.org/action/our-fights/data-driven-policing/predictive-policing/">activists have protested</a>, after scholars have <a href="https://nyupress.org/9781479892822/the-rise-of-big-data-policing/">critiqued</a> and after the original predictive policing companies have shut down or <a href="https://www.wired.com/story/soundthinking-geolitica-acquisition-predictive-policing/">been bought by bigger companies</a>, is the DOJ starting to ask the hard questions. In January 2024, the DOJ and the Department of Homeland Security asked for public comment to be included in a report on law enforcement agencies’ use of facial recognition technology, other technologies using biometric information and predictive algorithms. </p>
<p>Arising from a mandate under <a href="https://cops.usdoj.gov/Public_Trust_and_Safety_EO">Executive Order 14074</a> on advancing effective, accountable policing and criminal justice practices to enhance public trust and public safety, the DOJ Office of Legal Policy is going to evaluate how predictive policing affects civil rights and civil liberties. I believe that this is a good step – although a decade too late.</p>
<h2>Lessons not learned?</h2>
<p>The bigger problem is that the same process is happening again today with other technologies. As one example, <a href="https://www.wired.com/story/real-time-crime-centers-rtcc-us-police/">real-time crime centers</a> are being built <a href="https://crosscut.com/investigations/2023/07/federal-aid-supercharging-local-wa-police-surveillance-tech">across America</a>. Thousands of security cameras stream to a <a href="https://statescoop.com/real-time-crime-centers-police-privacy/">single command center</a> that is <a href="https://www.policemag.com/technology/article/15635270/how-technology-powers-real-time-crime-centers">linked</a> to automated license plate readers, gunshot detection sensors and 911 calls. The centers also use video analytics technology to identify and track people and objects across a city. And they tap into data about past crime.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A wall of monitors shows aerial and street views of a city" src="https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/573642/original/file-20240206-21-hcjcrr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Real-time crime centers like this one in Albuquerque, N.M., enable police surveillance of entire cities.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/NewMexicoFightingCrime/edee0f4a6fcc4a12a30dfa1f0d5a8959/photo">AP Photo/Susan Montoya Bryan</a></span>
</figcaption>
</figure>
<p>Millions of <a href="https://www.themarshallproject.org/2022/09/07/how-federal-covid-relief-flows-to-the-criminal-justice-system">federal dollars from the American Rescue Plan Act</a> are <a href="https://www.npr.org/2023/08/16/1194115202/real-time-crime-centers-which-started-in-bigger-cities-spread-across-the-u-s">going to cities</a> with the specific designation to <a href="https://epic.org/two-years-in-covid-19-relief-money-fueling-rise-of-police-surveillance/">address crime</a>, and some of those dollars have been <a href="https://epic.org/wp-content/uploads/2023/03/EPIC-ARPA-Surveillance-Funding-Table.pdf">diverted to build real-time crime centers</a>. They’re also being <a href="https://bja.ojp.gov/funding/awards/15pbja-22-gg-02156-jagx">funded by the Bureau of Justice Assistance</a>.</p>
<p>Real-time crime centers can do predictive analytics akin to predictive policing simply as a byproduct of all the data they collect in the ordinary course of a day. The centers can also scan entire cities with powerful computer vision-enabled cameras and react in real time. The capabilities of these advanced technologies make the civil liberties and racial justice fears around predictive policing pale in comparison. </p>
<p>So while the American public waits for answers about a technology, predictive policing, that had its heyday 10 years ago, the DOJ is seeding and feeding a far more invasive surveillance system with few questions asked. Perhaps things will go differently this time. Maybe the DOJ/DHS report on predictive algorithms will look inward at the department’s own culpability in seeding the surveillance problems of tomorrow.</p>
<p class="fine-print"><em><span>I have worked as an unpaid consultant on two NIJ grants. I did not receive any compensation. One grant was an early NIJ grant to the Risk Terrain Modeling folks at Rutgers (which became Simsi). I have not had any relationship with them in years and took no money. I was also on an NIJ grant around the ethics of predictive policing. Again, I did not receive any financial compensation for the role. </span></em></p>Predictive policing has been a bust. The Department of Justice nurtured the technology from researchers’ minds to corporate production lines and into the hands of police departments.Andrew Guthrie Ferguson, Professor of Law, American UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1912662022-10-24T13:17:41Z2022-10-24T13:17:41ZArtificial intelligence is used for predictive policing in the US and UK – South Africa should embrace it, too<figure><img src="https://images.theconversation.com/files/486478/original/file-20220926-14-5pa015.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Predictive policing may be a useful addition to traditional policing in contexts like South Africa.</span> <span class="attribution"><span class="source">Fani Mahuntsi/Gallo Images via Getty Images</span></span></figcaption></figure><p>In the 2002 movie Minority Report (based on a <a href="https://www.goodreads.com/book/show/581125.The_Minority_Report">short story</a> by Philip K Dick), director Steven Spielberg imagined a future in which three psychics can “see” murders before they happen. Their clairvoyance allows Tom Cruise and his “Precrime” police force to avert nearly all potential homicides.</p>
<p>Twenty years on, in the real world, scientists and law enforcement agencies are using data mining and machine learning to mimic those psychics. Such “<a href="https://www.science.org/content/article/can-predictive-policing-prevent-crime-it-happens">predictive policing</a>”, as it is called, is based on the fact that many crimes – and criminals – have <a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9780203118214-13/crime-pattern-theory-paul-brantingham-patricia-brantingham">detectable patterns</a>.</p>
<p>Predictive policing has enjoyed some successes. In a <a href="https://www.tandfonline.com/doi/full/10.1080/01900692.2019.1575664">case study</a> in the US, one police department was able to reduce gun incidents by 47% over the typically gun-happy New Year’s Eve. <a href="https://www.ironsidegroup.com/case-study/predictive-policing-success-manchester-police-department/">Manchester police</a> in the UK were similarly able to predict and reduce robberies, burglaries and thefts from motor vehicles by double digits in the first 10 weeks of rolling out predictive measures.</p>
<p>Predictive policing has improved in leaps and bounds. <a href="https://link.springer.com/content/pdf/10.1007/978-3-642-40994-3_33.pdf">In the past</a>, humans had to manually pore over crime reports or filter through national crime databases. Now, in the age of <a href="https://www.igi-global.com/book/data-mining-trends-applications-criminal/146986">big data, data mining</a> and powerful computers, that process can be automated. </p>
<p>But merely finding information is not enough to deter crime. The data needs to be analysed to detect underlying patterns and relationships. Scientists deploy algorithms and mathematical models such as machine learning, which imitates the way humans learn, to extract useful information and insights from existing data. </p>
<p>Recently, we <a href="https://dl.acm.org/doi/fullHtml/10.1145/3488933.3488973">turned to</a> a mathematical method conceived in the 18th century to refine our approach. By tweaking an existing algorithm based on this method, we significantly improved its crime prediction rates.</p>
<p>This finding holds promise for applying predictive policing in under-resourced contexts like South Africa. This could help reduce crime levels – some of the highest in the world and <a href="https://www.dailymaverick.co.za/article/2022-06-03-crime-crisis-continues-in-first-quarter-of-2022-with-women-and-children-worst-affected">rising</a>. It’s a situation the country’s police force seems <a href="https://africacheck.org/infofinder/explore-facts/how-many-people-does-one-police-officer-serve-south-africa">ill-equipped</a> to curb.</p>
<h2>Marrying two different approaches</h2>
<p>Thomas Bayes was a British mathematician. His famed <a href="https://www.mathsisfun.com/data/bayes-theorem.html">Bayes’ theorem</a> essentially describes the probability of an event occurring based on some prior knowledge of conditions that may be related to that event. Today, Bayesian analysis is commonplace in fields as diverse as artificial intelligence, astrophysics, finance, gambling and weather forecasting. We fine-tuned the Naïve Bayes algorithm and <a href="https://dl.acm.org/doi/fullHtml/10.1145/3488933.3488973">put it to the test</a> as a crime predictor. </p>
<p>Bayesian analysis can use probability statements to answer research questions about unknown parameters of statistical models. For example, what is the probability that a suspect accused of a crime is guilty? But going deeper – like calculating how poker cards may unfold, or how humans (especially humans with criminal intent) will act – requires increasingly sophisticated technologies and algorithms. </p>
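<p>For readers who want to see the arithmetic, here is a minimal sketch of Bayes’ theorem applied to the suspect example above. The prior, hit rate and false-positive rate are made-up illustrative numbers, not figures from our study.</p>
<pre><code># Bayes' theorem: P(guilty | evidence) =
#   P(evidence | guilty) * P(guilty) / P(evidence)
# All numbers are hypothetical and for illustration only.

p_guilty = 0.01             # prior: 1% of the suspect pool is guilty
p_ev_given_guilty = 0.90    # the evidence matches 90% of guilty suspects
p_ev_given_innocent = 0.05  # it also matches 5% of innocent suspects

# Total probability of observing the evidence at all
p_evidence = (p_ev_given_guilty * p_guilty
              + p_ev_given_innocent * (1 - p_guilty))

# Posterior probability of guilt, given the evidence
p_guilty_given_evidence = p_ev_given_guilty * p_guilty / p_evidence
print(round(p_guilty_given_evidence, 3))  # prints 0.154
</code></pre>
<p>Even with strong evidence, the low prior keeps the posterior modest – the kind of counterintuitive result that makes Bayesian reasoning useful.</p>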
<p><a href="https://dl.acm.org/doi/fullHtml/10.1145/3488933.3488973">Our research</a> built on the Naïve Bayes algorithm or classifier, a popular supervised machine learning algorithm, for <a href="https://dl.acm.org/doi/fullHtml/10.1145/3488933.3488973">crime prediction</a>. </p>
<p>Naïve Bayes starts from the premise that features – the variables that serve as input – are conditionally independent given the outcome, meaning that once the class is known, the presence of one feature tells you nothing further about the others.</p>
<p>We fine-tuned the Naïve Bayes algorithm by marrying it with another algorithm known as <a href="https://machinelearningmastery.com/rfe-feature-selection-in-python/">Recursive Feature Elimination</a>. This tool assists in selecting the more significant features in a dataset and removing the weaker ones, with the objective of improving the results.</p>
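<p>A rough sketch of how these two pieces can be combined is below. It uses the scikit-learn library and synthetic data rather than our actual crime dataset, and because scikit-learn’s Recursive Feature Elimination needs an estimator that reports feature weights, the sketch uses a logistic regression for the elimination step before fitting Naïve Bayes on the surviving features – an illustrative arrangement, not a reproduction of our published pipeline.</p>
<pre><code># Illustrative only: Recursive Feature Elimination followed by Gaussian
# Naive Bayes. X and y are synthetic stand-ins for a crime dataset
# (rows = incidents, columns = features such as location or time of day).
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RFE ranks features with the logistic regression's weights and
# recursively drops the weakest until eight remain.
selector = RFE(LogisticRegression(max_iter=1000), n_features_to_select=8)
selector.fit(X_train, y_train)

# Naive Bayes on all features versus only the RFE-selected features.
nb_full = GaussianNB().fit(X_train, y_train)
nb_reduced = GaussianNB().fit(selector.transform(X_train), y_train)

print("all features:", accuracy_score(y_test, nb_full.predict(X_test)))
print("RFE-selected:",
      accuracy_score(y_test, nb_reduced.predict(selector.transform(X_test))))
</code></pre>
<p>The idea being tested is simply whether dropping weak features lets the classifier make cleaner use of the conditional-independence assumption described above.</p>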
<p>We then applied our finessed algorithm to a popular experimental dataset extracted from the Chicago Police Department’s <a href="https://home.chicagopolice.org/services/clearmap-application/">CLEAR</a> (Citizen Law Enforcement Analysis and Reporting) system, which has been used to predict and reduce crime in that American city. That dataset has been applied globally because of the rich data it contains: it provides incident-level crime data, registered offenders, community concerns, and locations of police stations in the city.</p>
<p>We compared the results of our enhanced Naïve Bayes against those of the original Naïve Bayes, as well as against other predictive algorithms such as Random Forests and Extremely Randomized Trees (algorithms we have <a href="https://dl.acm.org/doi/fullHtml/10.1145/3488933.3488972">also worked</a> on for crime prediction). We found that we could improve on the predictions of the Naïve Bayes by about 30%, and could either match or improve on the predictions of the other algorithms.</p>
<h2>Data and bias</h2>
<p>While our model holds promise, there’s one element that’s sorely lacking in applying it to South African contexts: data. As the Chicago CLEAR system illustrates, predictive models work best when you have lots of relevant data to work with. But South Africa’s police force has historically been very tight-fisted with its data, perhaps due to confidentiality issues. I ran into this problem in my <a href="https://open.uct.ac.za/handle/11427/25319">doctoral research</a> on detecting and mapping crime series.</p>
<p>This is slowly shifting. We are currently running a small case study in Bellville, a suburb about 20km from Cape Town’s central business district and the area in which our university is located, using the <a href="https://www.kaggle.com/datasets/slwessels/crime-statistics-for-south-africa">South African Police Service data</a> for predictive policing.</p>
<p>None of this is to suggest that predictive policing alone will solve South Africa’s crime problem. Predictive algorithms and policing are not without their flaws. Even the psychics in Minority Report, it turned out, were not error-free. Fears that these algorithms may simply reinforce racial biases, for instance, have been raised both in <a href="https://www.dailymaverick.co.za/article/2022-08-02-sa-police-may-be-jumping-the-gun-by-implementing-new-crimefighting-technologies/">South Africa</a> and <a href="https://www.science.org/content/article/can-predictive-policing-prevent-crime-it-happens">elsewhere</a>.</p>
<p>But we believe that, with continuous technological improvement, predictive policing could play an important role in bolstering the police’s responsiveness and may be a small step towards improving public confidence in the police.</p>
<p><em>Dr Olasupo Ajayi of the Department of Computer Science at the University of the Western Cape and Mr Sphamandla May, a master’s student in the department, co-authored this article and the research it’s based on.</em></p>
<p class="fine-print"><em><span>Omowunmi Isafiade does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Predictive policing has improved in leaps and bounds and become increasingly automated thanks to big data, data mining and powerful computers.Omowunmi Isafiade, Senior Lecturer in Computer Science, University of the Western CapeLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1675222021-10-27T13:14:26Z2021-10-27T13:14:26ZBeing Watched: How surveillance amplifies racist policing and threatens the right to protest — Don’t Call Me Resilient EP 10<figure><img src="https://images.theconversation.com/files/428566/original/file-20211026-21-1lrwvl1.jpg?ixlib=rb-1.1.0&rect=86%2C43%2C2800%2C1898&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A CCTV camera sculpture in Toronto draws attention to the increasing surveillance in everyday life. Our guests discuss ways to resist this creeping culture.</span> <span class="attribution"><span class="source">Lianhao Qu /Unsplash</span></span></figcaption></figure><iframe height="200px" width="100%" frameborder="no" scrolling="no" seamless="" src="https://player.simplecast.com/8e5484a0-56c5-49c4-b0c7-cf7458c63316?dark=true"></iframe>
<p><iframe id="tc-infographic-572" class="tc-infographic" height="100" src="https://cdn.theconversation.com/infographics/572/661898416fdc21fc4fdef6a5379efd7cac19d9d5/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>So much of our lives happens online. </p>
<p>When the global pandemic was declared, for many of us, the shift from in-person life to digital life was almost total. We attended classes online, went on virtual dates, had Zoom parties with our friends and video consultations with our doctors. </p>
<p>Social media took on an outsized role in how we kept in touch with loved ones and shared our reactions to the news. For many of us, our digital footprint has exploded in size — there is more information available online than ever before about our health, what we think, where we live, how we look and who we love.</p>
<p>One could argue that artificial intelligence technology has an upside, like when it <a href="https://www.forbes.com/sites/bernardmarr/2021/01/04/how-artificial-intelligence-can-power-climate-change-strategy/?sh=5560f3d83482">tracks and predicts climate change</a>. But there are also a lot of downsides. And even the potential benefits can have negative implications. </p>
<p>Although we sometimes opt in to share personal information in exchange for the convenience of apps and services, there are other times when our information is shared — and used — without our permission, and often without our knowledge. For example, in 2020, <a href="https://www.priv.gc.ca/en/opc-news/news-and-announcements/2020/nr-c_200706/">Clearview AI was essentially kicked out of Canada</a> for compiling a database of billions of facial images, including those of Canadians, which it sold to police departments and private companies.</p>
<p>Once analysts gain access to our private data, they can use that information to influence and alter our behaviour and choices. And if you’re marginalized in some way, the consequences are worse. </p>
<figure class="align-center ">
<img alt="A man lies face down on the ground while police kneel next to him." src="https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/428581/original/file-20211026-23-1b47d2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Surveillance plays a role in the attitude of RCMP officers towards Indigenous land defenders. Here, RCMP arrest a man during an anti-logging protest in Caycuse, B.C. in May.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/Jen Osborne</span></span>
</figcaption>
</figure>
<p>Experts have been <a href="https://www.dukeupress.edu/dark-matters">warning about the dangers of data collection</a> for a while now, especially for Black, Indigenous and racialized people. This year, Amnesty International called for the banning of facial recognition technology, calling it a <a href="https://www.amnesty.org/en/latest/press-release/2021/01/ban-dangerous-facial-recognition-technology-that-amplifies-racist-policing/#:%7E:text=%E2%80%9CFacial%20recognition%20risks%20being%20weaponized,Rights%20Researcher%20at%20Amnesty%20International.">form of mass surveillance that amplifies racist policing and threatens the right to protest</a>. </p>
<p>What can we do to resist this creeping culture of surveillance? </p>
<p>Our guests today <a href="https://dont-call-me-resilient.simplecast.com/episodes/ep-10-being-watched-how-surveillance-amplifies-racist-policing-and-threatens-the-right-to-protest">on this episode of <em>Don’t Call Me Resilient</em></a> have some ideas. They are experts in discrimination and technology. Yuan Stevens is the policy lead on technology, cybersecurity and democracy at the Ryerson Leadership Lab and a research fellow at the Centre for Media, Technology and Democracy at McGill School of Public Policy. Her work looks at technology’s impact on vulnerable populations. Wendy Hui Kyong Chun is the Canada 150 Research Chair in New Media at Simon Fraser University where she also heads up the Digital Democracies Institute. She’s the author of several books — her most recent is <a href="https://mitpress.mit.edu/books/discriminating-data"><em>Discriminating Data</em></a>.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&rect=95%2C71%2C3870%2C2544&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/428273/original/file-20211025-23-10nke5m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Police forces across Canada have begun using technology to predict who may become involved in illegal activity or where crimes might take place. Here an Oakland police officer surveils protesters with binoculars during a 2020 protest in California.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/AP/Christian Monterrosa</span></span>
</figcaption>
</figure>
<p>For a full transcript of this episode of Don’t Call Me Resilient, go <a href="https://theconversation.com/being-watched-mass-surveillance-amplifies-racist-policing-and-threatens-the-right-to-protest-dont-call-me-resilient-ep-10-transcript-167523">here</a>.</p>
<h2>Additional reading</h2>
<p>Each week, we highlight articles or books that drill down into the topics we discuss in the episode. This week:</p>
<p><a href="https://theconversation.com/intense-police-surveillance-for-indigenous-land-defenders-contrasts-with-a-laissez-faire-stance-for-anti-vax-protesters-169589">Intense police surveillance for Indigenous land defenders contrasts with a laissez-faire stance for anti-vax protesters</a></p>
<p><a href="https://theconversation.com/how-police-surveillance-technologies-act-as-tools-of-white-supremacy-127435">How police surveillance technologies act as tools of white supremacy
</a></p>
<p><a href="https://theconversation.com/to-protect-our-privacy-and-free-speech-canada-needs-to-overhaul-its-approach-to-regulating-online-harms-170256">To protect our privacy and free speech, Canada needs to overhaul its approach to regulating online harms</a></p>
<p><a href="https://theconversation.com/inside-new-refugee-camp-like-a-prison-greece-and-other-countries-prioritize-surveillance-over-human-rights-168354">Inside new refugee camp like a ‘prison’: Greece and other countries prioritize surveillance over human rights</a></p>
<p><a href="https://theconversation.com/ai-technologies-like-police-facial-recognition-discriminate-against-people-of-colour-143227">AI technologies — like police facial recognition — discriminate against people of colour</a></p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://theconversation.com/ca/podcasts">Click here to listen to Don’t Call Me Resilient</a></span>
</figcaption>
</figure>
<p><a href="https://theconversation.com/police-and-governments-may-increasingly-adopt-surveillance-technologies-in-response-to-coronavirus-fears-133737">Police and governments may increasingly adopt surveillance technologies in response to coronavirus fears </a></p>
<p><a href="https://theconversation.com/collecting-race-based-data-during-coronavirus-pandemic-may-fuel-dangerous-prejudices-137284">Collecting race-based data during coronavirus pandemic may fuel dangerous prejudices </a></p>
<p><a href="https://www.dukeupress.edu/dark-matters">Dark Matters: On the Surveillance of Blackness</a></p>
<p><a href="https://www.canadianlawyermag.com/practice-areas/privacy-and-data/the-new-surveillance-state/274550">The new surveillance state</a></p>
<p><a href="https://citizenlab.ca/2020/09/algorithmic-policing-in-canada-explained/">Algorithmic Policing in Canada Explained</a></p>
<h2>Follow and listen</h2>
<p>You can listen or subscribe on <a href="https://podcasts.apple.com/ca/podcast/dont-call-me-resilient/id1549798876">Apple Podcasts</a>, <a href="https://podcasts.google.com/feed/aHR0cHM6Ly9mZWVkcy5zaW1wbGVjYXN0LmNvbS9qZFg0Ql9DOA">Google Podcasts</a>, <a href="https://open.spotify.com/show/37tK4zmjWvq2Sh6jLIpzp7">Spotify</a> or <a href="https://dont-call-me-resilient.simplecast.com/">wherever you listen to your favourite podcasts</a>. <a href="mailto:theculturedesk@theconversation.com">We’d love to hear from you</a>, including any ideas for future episodes. Join The Conversation on <a href="https://twitter.com/ConversationCA">Twitter</a>, <a href="https://www.facebook.com/TheConversationCanada">Facebook</a> and <a href="https://www.instagram.com/theconversationdotcom/">Instagram</a> and use #DontCallMeResilient.</p>
<p><em>Don’t Call Me Resilient is a production of The Conversation Canada. This podcast was produced with a grant for Journalism Innovation from the Social Sciences and Humanities Research Council of Canada. The series is produced and hosted by Vinita Srivastava. Our producer is Susana Ferreira. Our associate producer is Ibrahim Daair. Reza Dahya is our sound producer. Our consulting producer is Jennifer Moroz. Lisa Varano is our audience development editor and Scott White is the CEO of The Conversation Canada. Zaki Ibrahim wrote and performed the music we use on the pod. The track is called Something in the Water.</em></p>
<p class="fine-print"><em>Vinita Srivastava, Host + Producer, Don’t Call Me Resilient. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Being Watched: How surveillance amplifies racist policing and threatens the right to protest — Don’t Call Me Resilient EP 10 transcript</h1>
<figure><img src="https://images.theconversation.com/files/428563/original/file-20211026-23-j8t5oi.jpg?ixlib=rb-1.1.0&rect=98%2C119%2C3361%2C2147&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A photo of artwork by Banksy in London comments on the power imbalance of surveillance technology. Guests on this episode discuss how AI and facial recognition have been flagged by civil rights leaders because of their inherent racial bias.</span> <span class="attribution"><span class="source">Niv Singer/Unsplash</span></span></figcaption></figure>
<p><iframe id="tc-infographic-572" class="tc-infographic" height="100" src="https://cdn.theconversation.com/infographics/572/661898416fdc21fc4fdef6a5379efd7cac19d9d5/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<iframe height="200px" width="100%" frameborder="no" scrolling="no" seamless="" src="https://player.simplecast.com/8e5484a0-56c5-49c4-b0c7-cf7458c63316?dark=true"></iframe>
<p><a href="https://theconversation.com/being-watched-mass-surveillance-amplifies-racist-policing-and-threatens-the-right-to-protest-dont-call-me-resilient-ep-10-167522"><strong>Episode 10: Being Watched: How surveillance amplifies racist policing and threatens the right to protest</strong></a>.</p>
<p><em>NOTE: Transcripts may contain errors. Please check the corresponding audio before quoting in print.</em></p>
<p><strong>Vinita Srivastava (VS):</strong> From The Conversation, this is Don’t Call Me Resilient, I’m Vinita Srivastava.</p>
<p><strong>Wendy Hui Kyong Chun (WKC):</strong> We don’t have to accept the technology that we’re given. We can reinvent it, we could rethink it. We need to challenge the defaults.</p>
<p><strong>VS:</strong> It feels like technology, like facial recognition and artificial intelligence, are an inevitable part of our lives. We ask Google Nest or Alexa to find and play a song. We use our faces to unlock our phones and we share news articles on social media. I’ll be honest, I feel like this technology has its upsides, like when it can track and predict climate change or identify the rioters who stormed the U.S. Capitol. But there are also a lot of downsides. Once analysts gain access to our private data, they can use that information to influence and alter our behaviour and choices. And like most else, if you’re marginalized in some way, the consequences are worse. Experts have been warning about the dangers of data collection for a while now, especially for Black, Indigenous and racialized people. And this year, Amnesty International called for the banning of facial recognition technology, calling it a form of mass surveillance that amplifies racist policing and threatens the right to protest. So what can we do to resist this creeping culture of surveillance? Our guests today are experts in discrimination and technology. Yuan Stevens is the Policy Lead on Technology, Cybersecurity and Democracy at the Ryerson Leadership Lab and a research fellow at the Centre for Media, Technology and Democracy at McGill School of Public Policy. Her work examines the impacts of technology on vulnerable populations in Canada, the U.S. and Germany. Wendy Hui Kyong Chun is the Canada 150 Research Chair in New Media at Simon Fraser University and she leads the Digital Democracies Institute there. She’s the author of several books, including <em>Discriminating Data</em>, which is out this fall. I’ve been thinking non-stop about surveillance and facial recognition for the last little while, as you can imagine. I’m not living under a rock. I know that there are significant dangers around personal data collection. And yet I’m one of those complacent people. I’ve got two kids. I’ve got a full-time job. I’m really busy. And I actually love social media. I put pics of my kids on there last night. So what are some of the risks of sharing my life online? Yuan, what do you think?</p>
<p><strong>Yuan Stevens (YS):</strong> I think there is a lot at stake when it comes to the amount of data we’re giving companies and how they can treat us and what they can do with that data once they have it. So what I do in my work is I basically look at the development of technology and I think about the ways it can be abused. One of the worst possible outcomes is that we end up in a place where companies work with governments to have this data and to access this data, but to also categorize us and control us. So one of my own personal interests is how people were treated by the Stasi and by their peers in the German Democratic Republic. And I think about how different they think about their data than we do in North America because they have a history behind them of the state snooping into their lives. There’s this ethnographic study by a researcher named <a href="https://www.researchgate.net/publication/322753483_Surveillance_and_control_an_ethnographic_study_of_the_legacy_of_the_Stasi_and_its_impact_on_wellbeing">Ulrike Neuendorf</a>, and she was able to discover that the impacts of this surveillance included things like significant impacts on their well-being, mistrust and significant trauma. If you think about what it feels like for someone to know something about you that you didn’t want them to know, that is huge. </p>
<p><strong>VS:</strong> What I’m hearing you saying is that this has implications for our health, our lives, our well-being and society. I sort of understand it on a large scale, that it can result in all of these things that are troubling. But on a personal level, what are the dangers there for somebody who is like, well, I’m a law-abiding citizen, so what’s the problem?</p>
<p><strong>WKC:</strong> So I think that one thing we can do is maybe switch it a little and not say I’m a law-abiding citizen. What’s the problem? But ask, what are the conditions under which you are a law-abiding citizen? So what’s really fascinating now is the example you started with. You took pictures of your kid. You put them online. What’s wrong with this? What’s interesting now is that publicity and surveillance are so intertwined now that it’s hard to understand their difference. So in other words, when you take that picture and you put it up and you create a public persona, you engage with people. It’s not simply you putting it up, but the ways in which by you doing this, what else is happening?</p>
<p><strong>VS:</strong> I’m just going to use an example as old, old school in the ‘90s when I was an activist on campus, we knew that they were Canadian agents somewhere in our midst. We just knew that they were collecting files on us. But in my head, I imagine those files to be like manilla folder files with black and white photos. So information was just more localized. And I don’t know what it’s like to be an activist today. And I’m wondering about the — especially for racialize people, queer people, immigrants, refugees — how they might be extra targeted by this kind of information and surveillance. What’s at stake for these communities?</p>
<p><strong>YS:</strong> Yeah, I think it’s a really good question of who stands to be, I think, the most targeted and harmed by the use of surveillance technology. So whether you’re queer or a religious minority or person of colour or if you are protected under discrimination law, what that means is that you deserve treatment that ensures that your rights are protected in the same way that you would be if you were a dominant group. It’s absolutely true that certain groups are going to be more targeted than others. So if you look at predictive policing technologies, there are certain logics inscribed in the use of and design of those technologies that can further perpetuate realities or statistical findings that existed before. So, for example, if you decide to deploy police to a certain neighbourhood because there are more instances of crime there, in fact, what you could be doing is finding crime more often there, primarily because you’re actually sending police there more than if you were to send them to another neighbourhood, for example.</p>
<p><strong>VS:</strong> You’re just looking more basically.</p>
<p><strong>YS:</strong> Exactly. That’s one of the instances in ways in which people of colour, for example, and racialized people can be further subject to surveillance and further found guilty of crimes because what you have is a feedback loop. So feedback loops are a really important concept when you’re looking at surveillance studies in the context of technologies.</p>
<p><strong>VS:</strong> Every time I think of predictive policing, I’m thinking about this dystopian movie, <em>Minority Report</em>.</p>
<p><strong>WKC:</strong> So a classic example of this is the Chicago heat list, which is now no longer being used. And there, they came up with — allegedly — they said what we’re doing is just coming up with a list of people most likely to be murdered or to murder somebody and then we’re going to go visit them and say, “look, you better change your ways or else something bad is going to happen.”</p>
<p><strong>VS:</strong> Oh, my God, that dystopian movie is actually real.</p>
<p><strong>WKC:</strong> It is real.</p>
<p><strong>YS:</strong> Absolutely.</p>
<p><strong>WKC:</strong> And the way that they determined the people most likely to be murdered was by going to past arrest history. So if you had a co-arrest with somebody who became a homicide victim, that would be a strong indicator that you then would be involved in a homicide. Now, what’s really strange about this is that, first of all, they didn’t take time into consideration. So you have these people who had co-arrests from being a kid and when marijuana was illegal, smoking weed together, who had clean records being visited by police and saying, look, you have to change your ways. And since some of these people had clean records when the police came and visited, the neighbours were like, this guy’s a snitch. The crazy thing as well is that the data that went into these predictive policing models and the whole setup of the model itself came from studying mainly African-American neighbourhoods in the west side of Chicago. So race and background are already there. So race didn’t need to be an overt factor because it was an implicit factor. So if you think of how these programs work, they’re trained using certain data, and the way that they’re validated as correct — to say, OK, yes, it’s made a proper prediction — is by hiding some of that past data and then saying, “OK, let’s use this model.” Does it predict the past correctly? So these don’t actually predict the future. They’re not tested on the ability to break with the past. So if the past is racist, these programs will only be considered to be correctly validated as accurate if they make racist predictions. So you’re caught in a system in which learning means repeating the past, which means you lose the future. So the reason why we don’t want these automated systems is because all they do is automate past mistakes. So some artists did this great mock-up of a machine-learning programme to find the white-collar criminal on the fancy side of New York, blah, blah, blah. So, I mean, I think that the question is, how are we understanding exactly what Yuan was talking about, which are the communities that are most policed so we have the most data about? And so, if the police really want to say, look, we want to be effective and we want to use our resources, then go for this empty swath of people in the suburban homes, doing all sorts of stuff that are never pulled over or looked into. Not that I’m advocating for that.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/381818/original/file-20210201-13-1g0n3ld.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://theconversation.com/ca/podcasts">Click here to listen to Don’t Call Me Resilient</a></span>
</figcaption>
</figure>
<p><strong>VS:</strong> I want to talk a little bit about Clearview, because some of this became known when the story of Clearview broke in the mainstream media that all of our data is scraped and then put into this database that is now being used for facial recognition and this database being sold to police or to law enforcement or to companies. Can you explain a little bit about that case and why it’s so important in Canada, the Clearview case?</p>
<p><strong>YS:</strong> Yeah, absolutely. What happened was this company, a start-up that’s still getting funding, tried to provide and is trying to provide its services to the general public and to the police and to governments and all kinds of entities. Clearview AI is a facial recognition technology company, but it’s also a data scraping company. So what it does is it scrapes data from all kinds of sources, social media websites in general, collects those, has used deep learning and machine learning technologies to analyse whose face is whose and categorise those. And then what it does is it sells the service of matching faces. Why this matters is not only is the company selling essentially face matching capabilities but it’s scraped significant amounts of data contrary to law that would otherwise prevent the scraping of data. Now, scraping in itself is not to be seen as criminal. I think it can be used for legitimate reasons, for example, by academic researchers. But none of this is done with our consent. We had no notice. We had no knowledge of this.</p>
<p><strong>VS:</strong> You mean like Canadians. When you say we, we’re talking about residents of Canada?</p>
<p><strong>YS:</strong> Yeah, I think when it comes to both racism and surveillance, we do have Canadian exceptionalism and Clearview AI and its use by the RCMP is another example to show that surveilling and the surveillance of us in Canada absolutely exists and is occurring. The reason why it matters too is because what happened was the RCMP was using Clearview AI services and conducted hundreds of searches, though it only admitted to some of those to the Office of the Privacy Commissioner. And it’s always about the child predators. It always starts with that. And that’s something that <a href="https://www.schneier.com/">Bruce Schneier</a> has referred to as the four horsemen of the info apocalypse, which is this idea that there is certain aberrant behaviour that you want to address. And then you say, you know, I’m going to use this technology only in those situations, which could be true. All of us can get behind the idea that children should be protected. And that’s, of course, I believe that, too. But then what you see happening is surveillance creep and the ability to use that same technology in other situations. And that’s actually, in a way, what’s happening potentially or what could happen with Apple scanning our images before they’re stored in our iCloud for, again, child sexual abuse materials. People who are concerned about how technology can be used and abused are always thinking in a sort of Minority Report sense. And the good reason we’re trying to see what is the absolute worst that can happen with us is because we’re trying to protect all people, because you know that in order to protect all people, you can’t allow certain people to be treated a certain way necessarily unless they’re … depending on how much trust there is in an institution.</p>
<p><strong>VS:</strong> Are you basically saying my photos in my phone are also something to be worried about?</p>
<p><strong>YS:</strong> Absolutely, absolutely.</p>
<p><strong>VS:</strong> Just gets worse and worse. We have to talk for a minute now, or more than a minute, about facial recognition. I know that you both look at this in your work. Can we talk a little bit about what the technology is and also how it’s being used right now, Wendy?</p>
<p><strong>WKC:</strong> Sure. So facial recognition technology is a form of pattern recognition. And so it’s the idea that somehow (and these are done through machine learning programs, mainly) they don’t focus on features that make sense to us. It’s not like a computer saying, “oh, I remember these people’s eyes, I’m going to match this eye to that eye,” but rather through various algorithms. Basically, you see one face and you try to match it to another face: that is the basic technology. It’s very problematic. It doesn’t really work well. It’s also very bad because the early programs were trained on publicly available faces. So you’re thinking Hollywood. Now think of what a hotbed of diversity Hollywood is. Other ones are like undergraduates who will do anything for five dollars or some school credit. So the libraries were mainly white. And so these technologies work very well with light-skinned faces and really poorly with dark-skinned faces. It’s getting better. But that’s not the point. The point isn’t that this needs to be perfect for all skin tones. But the reason why this matters so much is this: think of how self-driving cars operate. If they can’t recognize dark people as people, then there’s clear danger that’s involved in this. But also because it’s not refined on dark-skinned faces. And this is something that people at Georgetown (University) have been working on a lot: it will misrecognize dark faces as criminal because it doesn’t have that distinction. So there was this famous example given by the ACLU where they looked at the U.S. Congress and said who amongst these are criminals? And it was disproportionately people of colour that were marked as “criminals.”</p>
<p><strong>VS:</strong> So basically, these technologies are built on historical information, which includes historical discrimination, historical racism. And so, this idea that science is neutral and technology is neutral is completely wrong. Basically, the discrimination is built into the technology.</p>
<p><strong>YS:</strong> Yeah, to that point, work by Kate Robertson and Cynthia Khoo at Citizen Lab has shown that we absolutely do have a bias toward believing that mathematical processes are neutral. And so we’ll trust technology and we’ll want to listen to it, so to speak, when it has a certain output, because we think: this is statistics, this is maths, I don’t understand how it works, it must be fine. And that’s really problematic when you consider the fact that not only police but also judges could rely on what are essentially recommendation systems. It’s probably OK that Netflix can recommend us some TV shows, but when those kinds of recommendations are made regarding the most fundamental of our rights, that is a totally different story.</p>
<p><strong>VS:</strong> So should we just completely be not using this technology at all?</p>
<p><strong>YS:</strong> I absolutely think there should be certain no-go zones when it comes to the collection, and particularly the processing, of our data for certain outcomes. So, for example, the General Data Protection Regulation, which is one of the most advanced and progressive data protection regulations in Europe, restricts the processing of information for automated decision-making, including for the purposes of profiling. On its face, what that suggests to me is that you shouldn’t be allowed to, for example, collect information about faces in a public setting, except perhaps in very specific circumstances. The presumption should be that you don’t collect faces and biometric information in public settings and use it to render someone potentially criminal. Biometric information is also a really sensitive data type that I think absolutely deserves special protection. Right now in Canada, there are no special affordances given to the protection of that kind of data. What we have is a kind of free-for-all where all data is treated the same. But in fact, different kinds of data have different levels of sensitivity, and there should be enforceable regulation in Canada spelling out the kinds of data that should not be treated in certain ways. Right now that doesn’t exist.</p>
<p><strong>VS:</strong> And Wendy, what were you going to say?</p>
<p><strong>WKC:</strong> I completely agree with everything that Yuan has said. I want to just talk about the predictive part of this, because what I would argue is that the problem is using these programs for prediction. The famous example is Amazon’s hiring algorithm, which was trained on all of the hires it made. And what ended up happening is if you had “woman” anywhere on your CV, you lost.</p>
<p><strong>VS:</strong> How is that even possible? The technology actually docks you a point for being a woman?</p>
<p><strong>WKC:</strong> Yeah. They went by who they hired and who they didn’t hire. They didn’t hire women, so clearly being a woman is bad, you’re not going to be a good employee. And so they ditched the program. But rather than ditching it, what if we said: thank you so much for meticulously documenting your discrimination? We could use these systems not for prediction, but as evidence of historical trends. The example I always give is global climate change models. Global climate change models give you the most probable future given past and present behaviour. But then we don’t say, “oh, this is great, it’s going to go up two degrees, let’s make it go faster.” We’re offered the most probable future precisely so we won’t make that future happen. So what if we took a lot of these things which are allegedly predictive and said, OK, the heat list shows Chicago police are discriminatory, so let’s make sure that the kinds of things that would be automated under this don’t happen. So I think that’s one thing: take these and look at them as historical probes rather than as predictive tools. To offer one example of people who are doing this, at USC, in the Geena Davis Institute, they’re using these kinds of pattern recognition technologies to go through the past archive of Hollywood films, to see what kind of gender representation is there and to think through what kinds of representation there have been within mainstream media.</p>
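<p>One way to picture the shift Wendy proposes, from prediction to evidence, is a simple audit of the historical decisions themselves. The sketch below uses entirely synthetic hiring records and a made-up “woman signal” feature; it compares selection rates with and without that signal, the kind of disparate-impact check that surfaces the very pattern a predictive model would otherwise learn and repeat.</p>
<pre><code>import random

random.seed(1)

# Synthetic historical hiring records: (cv_mentions_woman_signal, was_hired).
# The "woman signal" stands in for any CV feature -- a women's college, a women's
# chess team -- that a model trained on these decisions would learn to penalise.
def past_decision(has_woman_signal):
    hire_rate = 0.10 if has_woman_signal else 0.25   # the historical bias we want to surface
    return hire_rate > random.random()

records = [(signal, past_decision(signal)) for signal in [True, False] * 5000]

def selection_rate(records, signal_value):
    group = [hired for signal, hired in records if signal == signal_value]
    return sum(group) / len(group)

rate_with = selection_rate(records, True)
rate_without = selection_rate(records, False)

print(f"selection rate, CVs with the signal:    {rate_with:.3f}")
print(f"selection rate, CVs without the signal: {rate_without:.3f}")
print(f"impact ratio: {rate_with / rate_without:.2f} "
      "(ratios under 0.8 are a standard disparate-impact red flag)")
</code></pre>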
<p><strong>YS:</strong> Yeah, and on a more hopeful note, I’m aware of efforts by the Algorithmic Justice League, which looks at how people can flag issues with algorithms and the ways in which they’re biased. The hope is that you can improve systems when people are able to say: this is something that should be fixed. There are risks inherent in opening up your systems to criticism by the public, but I think it’s one really important step toward allowing people who are affected by these technologies to shape their design. That could actually give rise to what <a href="https://mitpress.mit.edu/contributors/sasha-costanza-chock">Sasha Costanza-Chock</a> calls design justice. I won’t go into that in depth here, but it’s really the idea that there’s meaningful participation of community groups in the design of technology.</p>
<p><strong>VS:</strong> So just talking about the participation of groups in the creation of technology: I don’t know what it’s like for a protester right now on the street. But I do know that in the summer of 2020 we had uprisings in the United States, but also in Hong Kong and Beirut. And I know that facial recognition is not just used in North America; the surveillance we’re talking about is a global issue. And I know that both of you have talked about some of the ways protesters have resisted that surveillance. What caught your attention, Wendy, with some of these protests?</p>
<p><strong>WKC:</strong> Well, what’s important is that they’re very aware of how the technology works, because, again, what we started with is the ways in which publicity and surveillance are now intertwined. So it’s hard to think through publicity without thinking through surveillance at the same time. And what I would argue is that the protesters show us that we need to start thinking about our public rights, because I really think the work that is being done around privacy is important, but it’s completely inadequate. There’s a thought that once you’re in public, you lose all rights, you’re simply exposed. You’re a public figure. But increasingly we’re all public figures. And what we need to be able to do is to be in public, vulnerable, and yet not attacked. What I find really important is the ways in which people offer each other shade, either by making sure pictures are taken in a certain way or by registering that they’re at a certain place in order to provide a larger or different sense of location for these technologies. These are inadequate as long-term solutions. But what they bring out is that, if you think again of how all these recommendation engines work, or how everything works, we’re fundamentally intertwined with each other. Everything you get is based on what somebody else has done, which means we’re fundamentally connected. So what if we took this position of being connected as a place from which to act, and to act collectively, and to say we need to be able to loiter in public, because everybody should have the right to loiter, everyone should have the right to be in public. If we switch it this way, I think this opens up an entirely different conversation. And more importantly, it moves privacy away from a model in which corporations get to know everything about you but just don’t share it with other users, and toward thinking about it in a far more expansive way.</p>
<p><strong>VS:</strong> You said provide shade, is that what you said? Provide shade for others, for each other? </p>
<p><strong>WKC:</strong> Yes, literally and metaphorically. And this comes from a lot of the work that Kara Keeling has done. She’s in African-American studies and in film studies, and we have been trying to think together through this question of exposure, shade and protection. It comes from work that she’s done in analysing slavery and how enslaved women took care of each other and their bodies, not because they owned them, but because they were outside of certain notions of privacy. So privacy, especially within the U.S., is very white. The first case in New York State around privacy was about Abigail Roberson, I believe, a white woman whose photograph was used to sell flour against her will. But while this case was going on, Nancy Green, who was Aunt Jemima, had no rights to her image. She was treated as completely public. And so I think if we move away from certain notions of privacy, which have never been adequate, and instead think through publicity as an enabling position that isn’t based on certain really problematic notions of property, this can open things up in really productive ways.</p>
<p><strong>VS:</strong> I like that the right to loiter, the right to be public, the right to be in the public.</p>
<p><strong>WKC:</strong> And that comes from work done by wonderful Indian feminists who wrote a book, <em>Why Loiter</em>, which is all about how women, and Muslim men, need the right to loiter in public.</p>
<p><strong>VS:</strong> I never really thought about it in that way, the idea of loitering being a right to take up space. But you’re saying we all should have it.</p>
<p><strong>YS:</strong> I absolutely agree with Wendy. We have a system in Canada that is actually very similar to the U.S., where we prioritize privacy. In fact, it isn’t just privacy that is at stake, but the right to control our information. The German Constitutional Court calls this informational self-determination. And that phrase to me really encompasses and cuts across a lot of the issues we’re talking about today, because we’re talking about privacy, we’re talking about algorithmic decision-making and recommendation systems. But privacy alone isn’t enough to protect our rights. Right now, we have proposed changes to Canada’s privacy laws, and they don’t go far enough. What we need is a comprehensive approach that protects our right to informational self-determination and views us not as consumers but as humans, whose human rights are the most important thing to protect. And that matters because if you’re out in public and the police are using what’s called an IMSI catcher, a device that mimics a cell tower so that nearby phones reveal where they are, then yes, it’s your privacy at stake, but it’s also your freedom to protest. At the root of it is the right to have your information treated in the way that you want it to be treated.</p>
<p><strong>VS:</strong> Before we wrap up, do you have a couple of top things that you want to leave listeners with, either things that you think individuals should be paying attention to or things that we should look at from a policy level or just observations that you think we should be making?</p>
<p><strong>WKC:</strong> We don’t have to accept the technology that we’re given. We can reinvent it, we can rethink it, we need to challenge the defaults. And secondly, technology isn’t the solution to our social problems. It’s often framed this way because there’s this belief that somehow we humans are inadequate and we can build this thing that will take care of these problems for us. It will never be the solution, but it can be part of the solution. But only if we look at the technology closely and realize that these assumptions are built into the technology itself. It’s also built on studying certain populations. So maybe one way to change these technologies is to revisit the populations that were so key to building certain presumptions: to go back to that residential study of segregation in the United States and realize there was so much more happening. And so start with everybody we touch whenever we use these technologies, as a way to open up different worlds.</p>
<p><strong>YS:</strong> And to add to that, I really want to encourage any listeners who care about these topics to take up space too. This extends what you were saying, Wendy, about the right to loiter and, in some ways, the right to take up that space. I would also encourage the people who deploy technology, whether they’re policymakers or police, to really consider what is in the public interest. And part of considering what’s in the public interest is considering how your technology will impact equality-seeking groups. So it’s twofold: really take up more space if you are going to be a person who’s impacted by this, and keep the public interest and those equality-seeking groups in mind when you are using this technology, so that it isn’t used to the detriment of those people.</p>
<p><strong>VS:</strong> So lovely to speak with you both. Thank you very much for taking the time today to be with me.</p>
<p><strong>WKC:</strong> Thank you for inviting us. And it’s a wonderful conversation.</p>
<p><strong>YS:</strong> Thank you so much. I’m really honoured to be part of this.</p>
<p><strong>VS:</strong> That’s it for this episode of Don’t Call Me Resilient. Thanks for listening. I’d love to know, are you as freaked out as I am after that conversation? Talk to me. I’m on Twitter @WriteVinita. And don’t forget to tag our producers @conversationca. Just use the hashtag #DontCallMeResilient. If you’d like to read more about the creeping dangers of surveillance, go to theconversation.com/ca. It’s also where you’ll find our show notes with links to stories and research connected to our conversation with Yuan Stevens and Wendy Hui Kyong Chun. Finally, if you like what you heard today, please help spread the love. Tell a friend about us or leave us a review on whatever podcast app you’re listening to us on. Don’t Call me Resilient is a production of The Conversation Canada. It was made possible by a grant for journalism innovation from the Social Sciences and Humanities Research Council of Canada. The series is produced and hosted by me, Vinita Srivastava. Our producer is Susana Ferreira. Our associate producer is Ibrahim Daair. Reza Dahya is our incredibly patient sound producer and our fabulous consulting producer is Jennifer Moroz. Lisa Varano leads audience development for The Conversation Canada and Scott White is our CEO. And if you’re wondering who wrote and performed the music we use on the pod, that’s the amazing Zaki Ibrahim. The track is called Something in the Water. Thanks for listening, everyone, and hope you join us again. Until then, I’m Vinita. And please, don’t call me resilient.</p><img src="https://counter.theconversation.com/content/167523/count.gif" alt="The Conversation" width="1" height="1" />
Once analysts gain access to our private data, they can use that information to influence and alter our behaviour and choices. If you’re marginalized in some way, the consequences are worse.Vinita Srivastava, Host + Producer, Don't Call Me ResilientIbrahim Daair, Culture + Society EditorLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1679762021-09-16T20:10:30Z2021-09-16T20:10:30ZQLD police will use AI to ‘predict’ domestic violence before it happens. Beware the unintended consequences<figure><img src="https://images.theconversation.com/files/421508/original/file-20210916-27-1eso001.jpeg?ixlib=rb-1.1.0&rect=68%2C137%2C5682%2C3690&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The Queensland Police Service (QPS) is expected to begin a trial using <a href="https://www.theguardian.com/australia-news/2021/sep/14/queensland-police-to-trial-ai-tool-designed-to-predict-and-prevent-domestic-violence-incidents">artificial intelligence (AI)</a> to determine the <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3494076">future risk</a> posed by known domestic violence perpetrators. </p>
<p>Perpetrators identified as “high risk” — based on previous calls to an address, past criminal activity and other police-held data — will be visited at home by police before domestic violence escalates, and before any crime has been committed. </p>
<p>It is necessary to find better ways to improve safety for women subjected to domestic violence. However, using AI technology in this context may have unintended consequences — and the proposed plan raises serious questions about the role of police in preventing domestic violence incidents.</p>
<p>The approach relies on an algorithm that has been developed from existing QPS administrative data (QPRIME). All statistical algorithms must assess risk based <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3494076">on available data</a>, which in turn means they are only as good as the data underpinning them.</p>
<p>Experts who criticise the use of <a href="https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained">data-driven risk assessment tools</a> in policing point to the lack of transparency in the specific kinds of data analysed, as well as how predictions based on these data are acted upon.</p>
<p>Because of how police operate, the key data most consistently captured are records of past situations police have been called to, and criminal histories. </p>
<p>Using this information to train an AI algorithm could reinforce existing biases in the criminal justice system. It could create an endless feedback loop between police and those members of the public who have the most contact with police.</p>
<p>In Australia, the people with the most police contact are disproportionately <a href="https://www.proquest.com/openview/0d003b28161c54287fcb8b6f2d888267/1?pq-origsite=gscholar&cbl=4425140">Aboriginal and Torres Strait Islander people</a>. It is not difficult to imagine that under this new regime Aboriginal and Torres Strait Islander people will be <a href="https://www.sentencingcouncil.qld.gov.au/__data/assets/pdf_file/0009/657648/not-a-one-way-street-report.pdf">visited more by police</a>.</p>
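<p>A rough sketch of that feedback dynamic, using invented numbers rather than QPS data or the actual QPRIME-derived model: two groups with identical underlying behaviour, but different chances of an incident being recorded by police, end up looking very different to a risk score built from police-held records alone.</p>
<pre><code>import random

random.seed(42)

N = 10_000
TRUE_RISK = 0.05                  # identical underlying rate of incidents in both groups
RECORDING_RATE = {                # chance an incident actually enters police-held data
    "heavily policed group": 0.60,
    "lightly policed group": 0.15,
}

population = []
for group, recording_rate in RECORDING_RATE.items():
    for _ in range(N):
        incidents = sum(TRUE_RISK > random.random() for _ in range(10))              # ten years of behaviour
        recorded = sum(recording_rate > random.random() for _ in range(incidents))   # what the database sees
        population.append((group, recorded))

# A naive risk score built only from what is in the database
flagged = [group for group, recorded in population if recorded >= 2]

for group in RECORDING_RATE:
    share = flagged.count(group) / len(flagged)
    print(f"{group}: {share:.0%} of everyone flagged 'high risk'")
</code></pre>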
<p>QPS representative <a href="https://www.theguardian.com/australia-news/2021/sep/14/queensland-police-to-trial-ai-tool-designed-to-predict-and-prevent-domestic-violence-incidents">Ben Martain</a> has said police won’t be able to charge someone they door-knock for a future suspected offence. </p>
<p>He also said for the pilot, attributes of ethnicity and geographic location were removed before training the AI model. But despite this, it seems likely Aboriginal and Torres Strait Islander people will continue to be disproportionately targeted, since they are over-represented across all kinds of police contact.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/nsw-police-want-access-to-tinders-sexual-assault-data-cybersafety-experts-explain-why-its-a-date-with-disaster-159811">NSW Police want access to Tinder's sexual assault data. Cybersafety experts explain why it's a date with disaster</a>
</strong>
</em>
</p>
<hr>
<h2>Introducing risk</h2>
<p>The aim of such AI-based strategies in policing is to prevent or reduce crime, through an assessment of the risk of future offending. In theory, this means police would intervene early to stop a crime from occurring in the first place.</p>
<p>However, with this approach there are risks police may <em>create</em> crime. An unprompted police door-knock would be unwelcome in most households — let alone one where police have previously attended to carry out searches or make arrests.</p>
<p>In this “preventative” program, perpetrators and the victims they live with may be nervous, agitated or even angry at the police intrusion at their home for no apparent reason.</p>
<p>A visited person might use <a href="https://www.alrc.gov.au/publication/pathways-to-justice-inquiry-into-the-incarceration-rate-of-aboriginal-and-torres-strait-islander-peoples-alrc-report-133/12-fines-and-driver-licences/infringement-notices-for-offensive-language-3/">offensive language</a> or refuse to provide their name. It would not be surprising if this led to charges. </p>
<p>Such charges might lead the visited person to become even more nervous, agitated or angry, and then they may find they are charged with assault and resisting police. This is popularly known as the <a href="https://humanrights.gov.au/our-work/indigenous-deaths-custody-chapter-6-police-practices">“trifecta”</a>, wherein a person who has otherwise not offended is ultimately charged with offensive language, resisting arrest and assaulting police. </p>
<p>The standard powers in the police toolbox are to arrest and charge. With QPS’s proposed plan, there is an obvious risk of widening the net of criminalisation for both perpetrators, as well as <a href="https://www.anrows.org.au/project/accurately-identifying-the-person-most-in-need-of-protection-in-domestic-and-family-violence-law/">victims who may be misidentified</a> as perpetrators. For instance, sometimes victims who have used violence in self-defence have been arrested instead of the perpetrator. </p>
<h2>Bringing further harm to victims</h2>
<p>The role of the victim in such a program is also of concern. Any program that deepens surveillance of perpetrators also <a href="https://link.springer.com/article/10.1007%2Fs10896-019-00090-y">deepens surveillance of victims</a>. </p>
<p>Victims do not always want police to intervene in their lives. In some cases, this form of proactive policing might feel like an extension of control, rather than help. What happens when police visit and discover a high-risk perpetrator and victim are living together again? </p>
<p>Victims may fear <a href="http://rcfv.archive.royalcommission.vic.gov.au/CustomData/Exhibits/HAD/WIT.0075.001.0214_R.pdf">child protection</a> authorities will get involved and feel obliged to cover up the fact they are still with the perpetrator. And once a victim has been pressured to lie, they may be reluctant to call the police the next time they do need police intervention. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Child hides behind stuffed toy" src="https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/421509/original/file-20210916-21-14k5sy2.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Victims of domestic violence may feel obliged to lie or withhold information from police to avoid child protection authorities getting involved.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>In some cases, the perpetrator or victim may decide not to take the safety advice of police officers who visit. It is not clear what police might do in a situation where they ask a perpetrator to leave, or try to take a victim to safety, but they refuse.</p>
<p>The mission of any domestic violence intervention should be to restore power to victims. But we know interventions do not assist all women (or men) equally. <a href="https://www.unswlawjournal.unsw.edu.au/wp-content/uploads/2021/04/10-Douglas-Tarrant-Tolmie.pdf">Structural inequalities</a>, including race and class, mean interventions are experienced differently by different people.</p>
<p>Will a victim have a say in whether police engage in proactive policing of their perpetrator? Should they have a say?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/police-access-to-covid-check-in-data-is-an-affront-to-our-privacy-we-need-stronger-and-more-consistent-rules-in-place-167360">Police access to COVID check-in data is an affront to our privacy. We need stronger and more consistent rules in place</a>
</strong>
</em>
</p>
<hr>
<h2>Are there safer options?</h2>
<p>In the context of <a href="https://dfvbenchbook.aija.org.au/dynamics-of-domestic-and-family-violence/factors-affecting-risk/">risk assessment</a>, many experts argue women often (although not always) have a strong sense of when they are at heightened risk.</p>
<p>Family court-ordered contact visits can be one of those moments of high risk. Yet in these situations women often report <a href="https://www.crimejusticejournal.com/article/view/1122">police refusing</a> to help keep them and their children safe. How is the voice of the victim factored into risk assessment with this tool? </p>
<p>One particular concern is whether police are really equipped to intervene in circumstances where there is no crime. QPS representative <a href="https://www.theguardian.com/australia-news/2021/sep/14/queensland-police-to-trial-ai-tool-designed-to-predict-and-prevent-domestic-violence-incidents">Ben Martain</a> said when perpetrators are “not at a point of crisis, in a heightened emotional state, or affected by drugs or alcohol” — they are “generally more amenable to recognising this as a turning-point opportunity in their lives”. </p>
<p>But police themselves have questioned their role in domestic violence <a href="https://www.theguardian.com/australia-news/2021/may/22/queensland-police-consider-bringing-social-workers-to-domestic-violence-incidents">circumstances</a> — instead highlighting the potential role social workers may have, in their place.</p>
<p>It is not clear whether police are the best-positioned service to intervene when there is no identified disturbance. Queensland already has <a href="https://dfvbenchbook.aija.org.au/fair-hearing-and-safety/fair-hearing-and-safety-information-sharing/or">information-sharing protocols</a> involving teams tasked specifically with responding to people involved in high-risk domestic violence relationships. These teams include community-based support workers. </p>
<p>This may be a better path for intervention during those critical periods of calm.</p><img src="https://counter.theconversation.com/content/167976/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Heather Douglas receives funding from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Robin Fitzgerald receives funding from Australian Research Council. </span></em></p>Plans have been made for the AI-based program to begin trials before the year ends. But it raises serious questions about the role of police in preventing domestic violence.Heather Douglas, Professor of Law, The University of MelbourneRobin Fitzgerald, Lecturer in Criminology, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1639582021-08-03T15:43:46Z2021-08-03T15:43:46ZRelationship between big tech and policing is shielded behind commercial confidentiality – it’s a problem<figure><img src="https://images.theconversation.com/files/410781/original/file-20210712-15-1751p26.jpeg?ixlib=rb-1.1.0&rect=310%2C620%2C5199%2C3199&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/miniature-police-officer-on-top-computer-419515312">Shutterstock/kirill_makarov</a></span></figcaption></figure><p>For over ten years, public inquiries, press reports, police whistleblowers – and even chief constables – have been raising the issue of police IT systems not being fit for purpose and ultimately failing victims of crime. This has prompted significant media attention and public scrutiny, as well as the resignation of the <a href="https://news.sky.com/story/greater-manchester-police-chief-constable-resigns-after-force-placed-into-special-measures-12166616">head of one force</a>.</p>
<p>But the role of the private technology companies behind these reportedly failing systems has not been closely examined. Just three big developers are being paid tens of millions of pounds to supply the majority of these UK systems – and it’s time a light was shone on them too.</p>
<p>The market is dominated by the Japanese-owned tech giant <a href="https://www.necsws.com/solutions/police-software/police-record-management-system/">NEC</a>, the US-based specialist provider <a href="https://nicherms.com/">Niche RMS</a>, and the UK-based <a href="https://www.capita.com/expertise/industry-specific-services/public-safety/records-management/police-records-management">Capita</a>. According to my ongoing research, at least 41 of the UK’s 43 forces are now the proud owners of a commercially produced integrated data system. According to their own websites, Niche <a href="https://nicherms.com/who-we-serve/">serves 26</a> forces and NEC <a href="https://www.jcnnewswire.com/pressrelease/67819/3/">serves 16</a>. My own research suggests that Capita is offering its services to about four forces. </p>
<h2>Government directive</h2>
<p>Large scale investment in new police data systems in the UK follows a government directive requiring police to improve the way they collect and share intelligence. This directive came following a 2004 <a href="https://dera.ioe.ac.uk/6394/1/report.pdf">public inquiry</a> into the murders of schoolgirls Holly Wells and Jessica Chapman by a known sex offender in Soham, Cambridgeshire. The inquiry found that better police data practices could have prevented the crimes. </p>
<p>Since then, police forces across the UK have embarked on a massive procurement spree, buying complicated data systems, chiefly from these three large multinational technology developers. </p>
<p>As well as being paid huge sums to design and implement the systems, these companies are often granted long-term contracts to provide IT services and maintenance on an ongoing basis, fixing errors and glitches, developing new functionalities and redesigning parts that are not working well. As these public-private partnerships become an ever-more embedded feature of UK policing, it is time to start asking whether they work in the public interest – because there have been examples which appear to call this into question. </p>
<p>In June, the new chief constable of Greater Manchester Police (GMP) admitted that his force’s £27 million crime reporting system “<a href="https://www.theguardian.com/uk-news/2021/jun/29/greater-manchesters-27m-recording-system-doesnt-work-says-police-chief">doesn’t work</a>” and may need to be scrapped.</p>
<p><strong>iOPS</strong></p>
<p>His admission follows a troubled year, in which the force was placed in “special measures” by inspectors, leading to the resignation of his predecessor following a <a href="https://www.justiceinspectorates.gov.uk/hmicfrs/wp-content/uploads/greater-manchester-police-integrated-operational-policing-system.pdf">snap inspection</a> of the “iOPS” IT system developed by Capita which found that 800,000 crimes and 74% of child protection incidents had mistakenly gone unrecorded. </p>
<p><a href="https://manchestermill.co.uk/p/exclusive-major-data-breach-at-greater">Further reports</a> then emerged that “a dataset of personal information” – including the names and details of sexual assault victims – was made available online, leading to massive breaches of privacy, although blame for this has not as yet been attributed to a failing of the IT system. </p>
<p><strong>Athena and Connect</strong></p>
<p>But GMP is not the only force having serious problems with IT systems. In 2019, it was <a href="https://www.bbc.co.uk/news/uk-46964659">reported</a> that the “Athena” platform, which cost £35 million and which enables data to be shared instantly across nine UK police forces, was “unfit for purpose”. Frequent crashes of the system and overly complicated processes meant police were failing to charge criminals in time for cases to make it to court. </p>
<p>Similar issues were <a href="https://www.bbc.co.uk/news/uk-england-essex-32217671">raised in 2015</a> when Essex police was reportedly forced to turn to pen and paper after the Athena system crashed for days. Developers Northgate Public Services – which have since been taken over by <a href="https://www.necsws.com/solutions/police-software/police-record-management-system/">NEC</a> – apologised at the time for problems “in small areas” which it said it was fixing. Northgate’s Connect platform, which forms the basis for Athena, is in use by 16 forces. </p>
<p><strong>Niche</strong></p>
<p>And <a href="https://www.whatdotheyknow.com/request/niche_computer_system#incoming-160205">in 2009</a>, West Yorkshire police acknowledged in response to a freedom of information request that their introduction of the new “Niche” platform had “led to wrongful arrests”. However, they also said the “critical perspective of Niche” was a result of their efforts to “identify areas for improvement”.</p>
<p>The problems appear to be widespread. A <a href="https://www.forensicanalytics.co.uk/annual-police-ict-survey-results-revealed/">2018 survey</a> found that only 2% of police officers nationally were satisfied with their IT systems, and only 30% thought their force invested wisely in them.</p>
<p>Police data systems have received much less media attention than more captivating, futuristic technologies like <a href="https://www.bbc.co.uk/news/technology-57504717">facial recognition</a> and <a href="https://www.bbc.co.uk/news/technology-47118229">predictive policing</a>. But they have far greater implications for the way people are policed today.</p>
<p>They determine what data police can record, how they record it and how easily that data can be accessed, shared, analysed and corrected. When the systems glitch, crimes are left un-investigated, victims are failed, innocent people are arrested and criminals escape justice – as the cases of Manchester, Essex and West Yorkshire have already shown. </p>
<p>And when the data they supply turns out to be incomplete, unreliable, or erroneous, facial recognition techniques will pick out the wrong faces and predictive policing will pick out the wrong people. </p>
<p>The relationship between big police tech and policing is shielded behind commercial confidentiality – it’s time the government opened it up to proper public scrutiny.</p>
<hr>
<p><em>A spokesperson for Niche said:</em></p>
<blockquote>
<p>We deliver information management solutions to Police Services around the world in support of their public safety/safeguarding mission, and we welcome feedback that allows us to improve our products in support of our customers’ mission.</p>
</blockquote>
<p><em>Capita declined to comment. NEC were also approached for comment on the issues raised in this article.</em></p><img src="https://counter.theconversation.com/content/163958/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katerina Hadjimatheou receives funding from the Economic and Social Research Council under grant ES/M010236/1.</span></em></p>Just three big developers are being paid tens of millions of pounds to supply the majority of these UK systems.Katerina Hadjimatheou, Lecturer in Criminology and Ethics, University of EssexLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1432272020-08-24T20:56:15Z2020-08-24T20:56:15ZAI technologies — like police facial recognition — discriminate against people of colour<figure><img src="https://images.theconversation.com/files/352575/original/file-20200812-16-1a3ty5h.jpg?ixlib=rb-1.1.0&rect=0%2C17%2C3000%2C1482&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Facial recognition algorithms are usually tested using white faces, which results in the technology being unable to differentiate between racialized individuals.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>Detroit police wrongfully arrested Robert Julian-Borchak Williams in January 2020 <a href="https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html">for a shoplifting incident that had taken place two years earlier</a>. Even though Williams had nothing to do with the incident, facial recognition technology used by Michigan State Police “matched” his face with a grainy image obtained from an in-store surveillance video showing another African American man taking US$3,800 worth of watches. </p>
<p>Two weeks later, the case was dismissed at the prosecution’s request. However, relying on the faulty match, police had already handcuffed and arrested Williams in front of his family, forced him to provide a mug shot, fingerprints and a sample of his DNA, interrogated him and imprisoned him overnight. </p>
<p>Experts suggest that Williams is not alone, and that others have been subjected to similar injustices. The ongoing controversy about police use of Clearview AI certainly underscores the privacy risks posed by facial recognition technology. But it’s important to realize that <a href="https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html">not all of us bear those risks equally</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-coronavirus-pandemic-highlights-the-need-for-a-surveillance-debate-beyond-privacy-137060">The coronavirus pandemic highlights the need for a surveillance debate beyond 'privacy'</a>
</strong>
</em>
</p>
<hr>
<h2>Training racist algorithms</h2>
<p>Facial recognition technology that is <a href="https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/">trained on and tuned to Caucasian faces</a> systematically misidentifies and mislabels racialized individuals: numerous studies report that facial recognition technology is “<a href="https://www.aclu.org/news/privacy-technology/wrongfully-arrested-because-face-recognition-cant-tell-black-people-apart/">flawed and biased, with significantly higher error rates when used against people of colour</a>.” </p>
<p>This <a href="https://www.mirror.co.uk/tech/apple-accused-racism-after-face-11735152">undermines the individuality and humanity of racialized persons</a> who are more likely to be misidentified as criminal. The technology — and the identification errors it makes — reflects and further entrenches long-standing social divisions that are deeply entangled with racism, sexism, homophobia, settler-colonialism and other intersecting oppressions. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vSuDE6wvQlU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A France24 investigation into racial bias in facial recognition technology.</span></figcaption>
</figure>
<h2>How technology categorizes users</h2>
<p>In his game-changing 1993 book, <a href="https://eric.ed.gov/?id=ED377817"><em>The Panoptic Sort</em></a>, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like <a href="https://thehill.com/blogs/congress-blog/technology/459455-making-equitable-access-to-credit-a-reality-in-the-age-of">banking</a> and <a href="https://hbr.org/2019/05/all-the-ways-hiring-algorithms-can-introduce-bias">employment</a>. </p>
<p>Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.</p>
<p>Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams. </p>
<h2>Pre-existing bias</h2>
<p>This sorting using algorithms infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.</p>
<p>The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the <a href="https://www.thoughtco.com/structural-violence-4174956">structural violence</a> perpetrated through facial recognition technology and <a href="https://github.com/MimiOnuoha/On-Algorithmic-Violence">other digital technologies</a> that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns. </p>
<p>The harms of structural violence are less obvious and less direct: they injure equality-seeking groups through the systematic denial of access to power, resources and opportunity. At the same time, structural violence increases the direct risk of harm to individual members of those groups. </p>
<p>Predictive policing uses <a href="http://affinitymagazine.us/2020/05/07/predictive-policing-threatens-civil-liberties/">algorithmic processing of historical data to predict when and where new crimes are likely to occur</a>, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment. </p>
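<p>That place-based feedback loop can be sketched in a few lines. The simulation below is purely illustrative and not based on any deployed system: offending is identical in every area, but the areas that happen to start with a few extra records are designated hotspots each year, attract heavier patrol coverage, and therefore generate still more records, locking in the initial noise.</p>
<pre><code>import numpy as np

rng = np.random.default_rng(7)

n_areas = 10
true_rate = np.full(n_areas, 20.0)          # identical underlying offending everywhere
recorded = rng.poisson(true_rate)            # a slightly uneven recorded history to start from

for year in range(8):
    # "Predictive" step: the three areas with the most recorded crime become this year's hotspots.
    hotspots = np.argsort(recorded)[-3:]

    # Detection step: heavier patrol presence means a much larger share of offending gets recorded.
    detection = np.full(n_areas, 0.2)
    detection[hotspots] = 0.6

    new_records = rng.binomial(rng.poisson(true_rate), detection)
    recorded = recorded + new_records

print("records per area after 8 years:", np.sort(recorded)[::-1])
print("(underlying offending was identical in every area)")
</code></pre>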
<p>And the evidence of inequities in other sectors continues to mount. <a href="https://www.wired.co.uk/article/gcse-results-alevels-algorithm-explained">Hundreds of students in the United Kingdom</a> protested on Aug. 16 against the disastrous results of a flawed algorithm that <a href="https://www.gov.uk/government/organisations/ofqual">Ofqual</a>, the U.K. exams regulator, used to determine which students would qualify for university. In 2019, Facebook’s microtargeting ad service <a href="https://www.cbc.ca/news/politics/facebook-employment-job-ads-discrimination-1.5086491">helped dozens of public and private sector employers</a> exclude people from receiving job ads on the basis of age and gender. Research conducted by ProPublica has documented <a href="https://www.propublica.org/article/breaking-the-black-box-when-algorithms-decide-what-you-pay">race-based price discrimination for online products</a>. And search engines regularly produce racist and sexist results.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516">Google's algorithms discriminate against women and people of colour</a>
</strong>
</em>
</p>
<hr>
<h2>Perpetuating oppression</h2>
<p>These outcomes matter because they perpetuate and deepen pre-existing inequalities based on characteristics like race, gender and age. They also matter because they deeply affect how we come to know ourselves and the world around us, sometimes by <a href="https://fs.blog/2017/07/filter-bubbles/">pre-selecting the information</a> we receive in ways that reinforce stereotypical perceptions. Even technology companies themselves acknowledge the <a href="https://www.digitaltrends.com/news/google-execs-bias-algorithms/">urgency of stopping algorithms from perpetuating discrimination</a>.</p>
<p>To date the success of ad hoc investigations, conducted by the tech companies themselves, has been inconsistent. Occasionally, corporations involved in producing discriminatory systems withdraw them from the market, such as when <a href="https://priv.gc.ca/en/opc-news/news-and-announcements/2020/nr-c_200706/">Clearview AI announced it would no longer offer facial recognition technology in Canada</a>. But often such decisions result from regulatory scrutiny or public outcry only <em>after</em> members of equality-seeking communities have already been harmed.</p>
<p>It’s time to give our regulatory institutions the tools they need to address the problem. Simple privacy protections that hinge on obtaining individual consent to enable data to be captured and repurposed by companies cannot be separated from the discriminatory outcomes of that use. This is especially true in an era when most of us (<a href="https://www.technologyreview.com/2017/04/11/5113/the-dark-secret-at-the-heart-of-ai/">including technology companies themselves</a>) cannot fully understand what algorithms do or why they produce specific results.</p>
<h2>Privacy is a human right</h2>
<p>Part of the solution entails breaking down the current regulatory silos that treat privacy and human rights as separate issues. Relying on a consent-based data protection model flies in the face of the basic principle that privacy and equality are both human rights that cannot be contracted away. </p>
<p>Even <a href="https://www.ic.gc.ca/eic/site/062.nsf/eng/h_00108.html">Canada’s Digital Charter</a> — the federal government’s latest attempt to respond to the shortcomings of the current state of the digital environment — maintains these conceptual distinctions. It treats hate and extremism, control and consent, and strong democracy as separate categories.</p>
<p>To address algorithmic discrimination, we must recognize and frame both privacy and equality as human rights. And we must create an infrastructure that is equally attentive to and expert in both. Without such efforts, the glossy sheen of math and science will continue to camouflage AI’s discriminatory biases, and travesties such as that inflicted on Williams can be expected to multiply.</p><img src="https://counter.theconversation.com/content/143227/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jane Bailey receives funding from the Social Sciences and Humanities Research Council of Canada. </span></em></p><p class="fine-print"><em><span>Jacquelyn Burkell receives funding from the Social Sciences and Humanities Research Council of Canada. </span></em></p><p class="fine-print"><em><span>Valerie Steeves receives funding from the Social Sciences and Humanities Research Council of Canada, the Office of the Privacy Commissioner of Canada, and the Canadian Institutes of Health Research. </span></em></p>Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.Jane Bailey, Professor of Law and Co-Leader of The eQuality Project, L’Université d’Ottawa/University of OttawaJacquelyn Burkell, Associate Vice-President, Research, Western UniversityValerie Steeves, Full Professor, L’Université d’Ottawa/University of OttawaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1025302018-09-26T12:33:58Z2018-09-26T12:33:58ZTechnology dominates our lives – that’s why we should teach human rights law to software engineers<figure><img src="https://images.theconversation.com/files/237498/original/file-20180921-129856-1ql3q0w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Alexa, what are my human rights?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/muenster-january-27-2018-white-amazon-1012682005?src=OLqt6jev6mKZaZkc3Ffj9w-1-4">Shutterstock</a></span></figcaption></figure><p>Artificial Intelligence (AI) is finding its way into more and more aspects of our daily lives. It powers the smart assistants on our mobile phones and virtual “home assistants”. It is in the algorithms designed to improve our health diagnostics. And it is used in the <a href="https://theconversation.com/ai-profiling-the-social-and-moral-hazards-of-predictive-policing-92960">predictive policing tools</a> used by the police to fight crime. </p>
<p>Each of these examples throws up potential problems when it comes to the protection of our human rights. Predictive policing, if not correctly designed, can lead to discrimination based on race, gender or ethnicity. </p>
<p>Privacy and data protection rules apply to information related to our health. Similarly, systematic recording and use of our smartphones’ geographical location <a href="https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf">may breach privacy and data protection rules</a> and it could lead to concerns over digital surveillance by public authorities. </p>
<p>Software engineers are responsible for the design of the algorithms behind all of these systems. It is the software engineers who enable smart assistants to answer our questions more accurately, help doctors to improve the detection of health risks, and allow police officers to better identify pockets of rising crime risks. </p>
<p>Software engineers do not usually receive training in human rights law. Yet with each line of code, they may well be interpreting, applying and even breaching key human rights law concepts – without even knowing it.</p>
<p>This is why it is crucial that we teach human rights law to software engineers. Earlier this year, new EU regulation forced businesses to become more open with consumers about the information they hold. Known as <a href="https://theconversation.com/what-does-gdpr-mean-for-me-an-explainer-96630">GDPR</a>, you may remember it as the subject of numerous desperate emails begging you to opt in to remain on various databases. </p>
<p>GDPR increased restrictions on what organisations can do with your data, and extends the rights of individuals to access and control data about them. These moves towards <a href="https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-by-design-and-default/">privacy-by-design and data protection-by-design</a> are great opportunities to integrate legal frameworks into technology. On their own, however, they are not enough. </p>
<p>For example, a better knowledge of human rights law can help software developers understand what indirect discrimination is and why it is prohibited by law. (Any discrimination based on race, colour, sex, language, religion, political or other opinion, national or social origin, property, association with a national minority, birth or other status is prohibited under article 14 of the <a href="https://www.echr.coe.int/Documents/Convention_ENG.pdf">European Convention on Human Rights</a>.)</p>
<p>Direct discrimination occurs when an individual is treated less favourably based on one or more of these protected grounds. Indirect discrimination occurs when a rule that is neutral in appearance leads to less favourable treatment of an individual (or a group of individuals).</p>
<p>Similarly, understanding the intricacies of the right to a fair trial and its corollary, presumption of innocence, may lead to better informed choices in algorithm design. That could help avoid the possibility that algorithms would presume that the number of police arrests in a multi-ethnic neighbourhood correlates with the number of effective criminal convictions. </p>
<p>Even more importantly, it would assist them in developing <a href="https://www.nature.com/articles/d41586-018-05469-3">unbiased choices</a> of datasets that are not proxies for discrimination based on ethnicity or race. For example, wealth and income data combined with geographic location data may be used as a proxy for the identification of populations from a certain ethnic background if they <a href="http://ec.europa.eu/newsroom/just/document.cfm?action=display&doc_id=45791">tend to concentrate in a particular neighbourhood</a>.</p>
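<p>A minimal sketch of the kind of check a developer with this training might run before using a seemingly neutral feature. The postcodes, group labels and population shares below are invented; the point is simply to measure how much better the protected attribute can be guessed from the “neutral” feature than from nothing at all.</p>
<pre><code>import random
from collections import Counter, defaultdict

random.seed(3)

# Synthetic residents: residential segregation makes postcode informative about ethnicity.
# Postcodes and shares here are made up purely for illustration.
share_of_group_x = {"AB1": 0.85, "AB2": 0.70, "AB3": 0.15, "AB4": 0.05}

records = []
for postcode, share_x in share_of_group_x.items():
    for _ in range(2500):
        group = "X" if share_x > random.random() else "Y"
        records.append((postcode, group))

# How often would we guess someone's group correctly knowing nothing at all?
overall = Counter(group for _, group in records)
base_rate = overall.most_common(1)[0][1] / len(records)

# How often would we guess correctly knowing only the "neutral" postcode feature?
by_postcode = defaultdict(Counter)
for postcode, group in records:
    by_postcode[postcode][group] += 1
proxy_rate = sum(c.most_common(1)[0][1] for c in by_postcode.values()) / len(records)

print(f"guess the protected attribute blindly:     {base_rate:.2f}")
print(f"guess it from the postcode feature alone:  {proxy_rate:.2f}")
# A large gap means the supposedly neutral feature is acting as a proxy and
# needs the same scrutiny as the protected attribute itself.
</code></pre>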
<h2>Legal code</h2>
<p>Likewise, a better understanding of how legal frameworks on human rights operate may stimulate the creation of solutions for enhancing compliance with legal rules. For instance, there is a great need for <a href="https://openscholarship.wustl.edu/cgi/viewcontent.cgi?article=1166&context=law_lawreview">technological due process</a> solutions, by which individuals could easily challenge AI-based decisions made by public authorities that directly affect them. This could be the case of parents who would be wrongly identified as potential child abusers by opaque algorithms <a href="https://www.theguardian.com/society/2018/sep/16/child-abuse-algorithms-from-science-fiction-to-cost-cutting-reality">used by local authorities</a>.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/237504/original/file-20180921-129871-oejnzs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The key to fair technology.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/closeup-view-on-conceptual-keyboard-human-410663563?src=AmrMLeUIInsmJpVPwZvXKA-1-23">Shutterstock</a></span>
</figcaption>
</figure>
<p>Such solutions could also be relevant to the private sector. For example, decisions on insurance premiums and loans are often determined by <a href="https://heinonline.org/HOL/LandingPage?handle=hein.journals/washlr89&div=4&id=&page=">profiling and scoring algorithms</a> hidden behind <a href="https://digitalcommons.law.umaryland.edu/books/96/">black boxes</a>. Full transparency and disclosure of these algorithms may not be possible or desirable due to the nature of these business models. </p>
<p>Thus, a due process-by-design solution could allow individuals to easily challenge such decisions before accepting an offer. As our contemporary societies inexorably evolve towards intensive AI applications, we need to bear in mind that the <a href="https://hbr.org/2017/01/the-humans-working-behind-the-ai-curtain">humans behind the AI curtain</a> have the power to make (mis)informed decisions that affect us all. </p>
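<p>What might a due process-by-design approach look like in code? The sketch below is hypothetical and not a description of any existing system: every automated decision carries a structured record of the inputs it relied on, the model version and the main reasons, along with a channel for the affected person to lodge a challenge that routes the case to human review.</p>
<pre><code>from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    """What a person would need in order to understand and contest an automated decision."""
    subject_id: str
    outcome: str            # e.g. "premium increased"
    model_version: str      # which model or ruleset produced the decision
    inputs_used: dict       # the data the decision actually relied on
    main_reasons: list      # plain-language factors, ranked by influence
    issued_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    challenges: list = field(default_factory=list)

    def challenge(self, reason: str) -> None:
        # Logging a challenge routes the case to human review before it takes effect.
        self.challenges.append({"reason": reason,
                                "received_at": datetime.now(timezone.utc).isoformat()})

# Hypothetical usage: the names, values and model version are invented for illustration.
record = DecisionRecord(
    subject_id="applicant-1042",
    outcome="premium increased",
    model_version="risk-model-2018.3",
    inputs_used={"postcode": "AB4", "claims_last_5y": 0},
    main_reasons=["postcode risk band", "vehicle type"],
)
record.challenge("Postcode should not raise my premium: I have no claims history.")
print(record.outcome, "- challenges pending:", len(record.challenges))
</code></pre>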
<p>It is high time that resources and energy are directed towards educating them not only in cutting-edge technology, but also in the relevant human rights rules.</p><img src="https://counter.theconversation.com/content/102530/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ana Beduschi does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Technology can transgress all kinds of legal frameworks.Ana Beduschi, Senior Lecturer in Law, University of ExeterLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/726402017-05-10T01:34:40Z2017-05-10T01:34:40ZWhy big-data analysis of police activity is inherently biased<figure><img src="https://images.theconversation.com/files/167997/original/file-20170504-21608-zqv05t.jpg?ixlib=rb-1.1.0&rect=0%2C1530%2C4090%2C3214&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How does bad data affect predictive policing algorithms?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/silhouette-police-officer-beside-laptop-fingerprint-24544438">Photosani/shutterstock.com</a></span></figcaption></figure><p>In early 2017, Chicago Mayor Rahm Emanuel announced a new initiative in the city’s <a href="http://www.nbcnews.com/news/us-news/chicago-police-department-goes-high-tech-fight-rise-killings-n713206">ongoing battle with violent crime</a>. The most common solutions to this sort of problem involve <a href="https://www.ncjrs.gov/works/chapter8.htm">hiring more police officers or working more closely with community members</a>. But Emanuel declared that the Chicago Police Department would expand its use of software, enabling what is called “predictive policing,” particularly in neighborhoods on the city’s south side.</p>
<p>The Chicago police will use data and computer analysis to identify neighborhoods that are more likely to experience violent crime, assigning additional police patrols in those areas. In addition, the software will identify individual people who are expected to become – but have yet to be – <a href="http://www.chicagotribune.com/news/opinion/commentary/ct-gun-violence-list-chicago-police-murder-perspec-0801-jm-20160729-story.html">victims or perpetrators of violent crimes</a>. Officers may even be assigned to visit those people to <a href="http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist">warn them against committing a violent crime</a>.</p>
<p>Any attempt to curb the alarming rate of <a href="https://www.nytimes.com/2016/12/28/us/chicago-murder-rate-gun-deaths.html">homicides in Chicago</a> is laudable. But the city’s new effort seems to ignore evidence, including recent research from members of our policing study team at the <a href="https://hrdag.org/policing/">Human Rights Data Analysis Group</a>, that predictive policing tools reinforce, rather than reimagine, existing police practices. Their expanded use could lead to further targeting of communities or people of color.</p>
<h2>Working with available data</h2>
<p>At its core, any predictive model or algorithm is a combination of data and a statistical process that seeks to identify patterns in the numbers. This can include looking at police data in hopes of learning about crime trends or recidivism. But a useful outcome depends not only on good mathematical analysis: It also <a href="https://theconversation.com/big-datas-streetlight-effect-where-and-how-we-look-affects-what-we-see-58122">needs good data</a>. That’s where predictive policing often falls short.</p>
<p>Machine-learning algorithms learn to make predictions by analyzing patterns in an initial training data set and then look for similar patterns in new data as they come in. If they <a href="https://theconversation.com/did-artificial-intelligence-deny-you-credit-73259">learn the wrong signals from the data</a>, the subsequent analysis will be lacking.</p>
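<p>As a hypothetical illustration of how the wrong signals get learned, the short simulation below (Python, using numpy and scikit-learn, with invented rates) trains a model on incidents that are recorded only where patrols are concentrated; the model then scores the heavily patrolled area as riskier even though underlying offending is identical everywhere.</p>
<pre><code># Hypothetical simulation (numpy + scikit-learn; all rates invented):
# offences are only recorded where patrols are looking, so a model trained
# on recorded incidents learns the patrol pattern, not underlying crime.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
heavily_patrolled = rng.integers(0, 2, size=n)        # 1 = more patrols
true_offending = rng.binomial(1, 0.05, size=n)        # identical everywhere
# Chance an offence is recorded: 20% in lightly, 80% in heavily patrolled areas.
recorded = true_offending * rng.binomial(1, 0.2 + 0.6 * heavily_patrolled)

model = LogisticRegression().fit(heavily_patrolled.reshape(-1, 1), recorded)
probs = model.predict_proba([[0], [1]])[:, 1]
print(f"predicted risk: light patrol {probs[0]:.3f}, heavy patrol {probs[1]:.3f}")
# Underlying offending is the same in both areas, yet the model scores the
# heavily patrolled area roughly four times higher -- the bias is reproduced.
</code></pre>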
<p>This happened with a Google initiative called “Flu Trends,” which was launched in 2008 in hopes of using information about people’s online searches to spot disease outbreaks. Google’s systems would monitor users’ searches and identify locations where many people were researching various flu symptoms. In those places, <a href="http://dx.doi.org/10.1126/science.1248506">the program would alert public health authorities</a> that more people were about to come down with the flu.</p>
<p>But the project failed to account for the potential for periodic changes in Google’s own search algorithm. In an early 2012 update, Google modified its search tool to suggest a diagnosis when users searched for terms like “cough” or “fever.” On its own, this change <a href="http://dx.doi.org/10.1126/science.1248506">increased the number of searches for flu-related terms</a>. But Google Flu Trends interpreted the data as predicting a flu outbreak twice as big as federal public health officials expected and far <a href="https://www.wired.com/2015/10/can-learn-epic-failure-google-flu-trends/">larger than what actually happened</a>.</p>
<h2>Criminal justice data are biased</h2>
<p>The failure of the Google Flu Trends system was a result of one kind of flawed data – information biased by factors other than what was being measured. It’s much harder to identify bias in criminal justice prediction models. In part, this is because <a href="https://www.ucrdatatool.gov/twomeasures.cfm">police data</a> <a href="https://sunlightfoundation.com/2015/05/01/the-benefits-of-criminal-justice-data-beyond-policing/">aren’t collected uniformly</a>, and in part it’s because what data police track reflect longstanding institutional biases along <a href="http://dx.doi.org/10.1186/1477-7517-3-22">income</a>, <a href="http://dx.doi.org/10.1198/016214506000001040">race</a> and gender lines. </p>
<p>While police data often are described as representing “crime,” that’s not quite accurate. Crime itself is a largely hidden social phenomenon that happens anywhere a person violates a law. What are called “crime data” usually tabulate specific events that aren’t necessarily lawbreaking – like a 911 call – or that are influenced by existing police priorities, like arrests of people suspected of particular types of crime, or reports of incidents seen when <a href="https://us.sagepub.com/en-us/nam/the-mismeasure-of-crime/book234529#contents">patrolling a particular neighborhood</a>. </p>
<p>Neighborhoods with lots of police calls aren’t necessarily the same places the most crime is happening. They are, rather, where the most police attention is – though where that attention focuses can often be <a href="http://www.cengage.com/c/the-invisible-woman-gender-crime-and-justice-4e-belknap">biased by gender</a> and <a href="http://dx.doi.org/10.1198/016214506000001040">racial factors</a>. </p>
<h2>It’s not possible to remove the bias</h2>
<p>Some researchers have argued that machine learning algorithms can address systemic biases by designing “neutral” models that don’t take into account sensitive variables like <a href="http://dx.doi.org/10.1080/01621459.2015.1077710">race or gender</a>. But while it may seem possible in hypothetical situations, it doesn’t appear to be the case in real life.</p>
<p>Our recent study, by Human Rights Data Analysis Group’s Kristian Lum and William Isaac, found that <a href="http://dx.doi.org/10.1111/j.1740-9713.2016.00960.x">predictive policing vendor PredPol’s purportedly race-neutral algorithm</a> targeted black neighborhoods at roughly twice the rate of white neighborhoods when trained on historical drug crime data from Oakland, California. We found similar results when analyzing the data by income group: low-income communities were targeted at disproportionately higher rates than high-income neighborhoods.</p>
<p>But estimates – created from <a href="http://www.icpsr.umich.edu/icpsrweb/NAHDAP/studies/34481/version/4">public health surveys</a> and <a href="http://www.epimodels.org/drupal/?q=node/81">population models</a> – suggest illicit drug use in Oakland is roughly equal across racial and income groups. If the algorithm were truly race-neutral, it would spread drug-fighting police attention evenly across the city. </p>
<p>Similar evidence of racial bias was found by ProPublica’s investigative reporters <a href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing">when they looked at COMPAS, an algorithm predicting a person’s risk of committing a crime</a>, used in bail and sentencing decisions in Broward County, Florida, and elsewhere around the country. These systems learn only what they are presented with; if those data are biased, their learning can’t help but be biased too.</p>
<p><a href="https://theconversation.com/removing-gender-bias-from-algorithms-64721">Fixing this problem</a> is not a matter of just doing more advanced mathematical or statistical calculations. Rather, it will require rethinking how police agencies collect and analyze data, and how they train their staff to use data on the job.</p>
<h2>Understanding the biases to improve the data</h2>
<p>Using predictive analytics in the real world is challenging, particularly when <a href="https://theconversation.com/we-need-to-know-the-algorithms-the-government-uses-to-make-important-decisions-about-us-57869">trying to craft government policies</a> to minimize harm to vulnerable populations. We do not believe that police departments should stop using analytics or data-driven approaches to reducing crime. Rather, police should work to understand the biases and limitations inherent in their data.</p>
<p>In our view, police departments – and all agencies that use <a href="https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/2016_0504_data_discrimination.pdf">predictive algorithms</a> – should make their systems transparent to public scrutiny. This should start with community members and police departments discussing policing priorities and measures of police performance. That way any software the police use can be programmed to reflect the community’s values and concerns.</p>
<h2>Ensuring transparency</h2>
<p>It is not enough to claim or assume an algorithm is unbiased just because it is computerized and uses data: A lack of bias must be proven by evaluating the algorithm’s performance itself. Police agencies should get independent experts or human rights groups to perform <a href="https://theconversation.com/we-need-to-know-the-algorithms-the-government-uses-to-make-important-decisions-about-us-57869">regular audits of the algorithms and the data they process</a>. Much like the annual financial reviews large companies do, these examinations can ensure the input data are valid and are analyzed properly to avoid discrimination. If a company wants to claim its algorithm is proprietary and should be kept secret, it should still be required to offer robust testing environments so outside experts can examine its performance. </p>
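<p>What such an audit might check is illustrated by the following sketch, with hypothetical column names and figures: compare the share of algorithmic flags each community receives against its share of areas and an independent prevalence estimate, and investigate any large gap.</p>
<pre><code># Sketch of a basic disparity check an independent audit might run
# (pandas; column names and figures are hypothetical).
import pandas as pd

audit = pd.DataFrame({
    "area_group":      ["majority", "minority"],
    "share_of_areas":  [0.55, 0.45],
    "share_of_flags":  [0.35, 0.65],  # where the algorithm directed patrols
    "survey_drug_use": [0.12, 0.12],  # independent prevalence estimate
})
audit["flags_relative_to_area_share"] = (
    audit["share_of_flags"] / audit["share_of_areas"]
)
print(audit)
# Equal measured prevalence but a flag rate skewed toward minority areas is
# exactly the kind of pattern auditors should surface for scrutiny.
</code></pre>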
<p>Further, police departments that use algorithms to make predictions about individuals, like <a href="http://directives.chicagopolice.org/directives/data/a7a57b85-155e9f4b-50c15-5e9f-7742e3ac8b0ab2d3.html">Chicago’s Strategic Subject List does</a>, should have policies similar to a <a href="https://arxiv.org/pdf/1606.08813v3.pdf">new European Union regulation</a> requiring <a href="https://theconversation.com/did-artificial-intelligence-deny-you-credit-73259">human-understandable explanations</a> of computer algorithms’ decisions. And no agency or company should be allowed to discriminate against people who have been identified by predictive policing.</p>
<p>Used correctly, predictive policing can address the complex factors underlying crime trends. For example, rather than stepping up patrols, <a href="https://cops.usdoj.gov/html/dispatch/01-2015/saskatchewans_crime_reduction_model.asp">Toronto and other cities in Canada</a> are using predictive modeling to connect residents to local social services. By improving the quality of the data cities collect, and analyzing the information with more transparent and inclusive processes, cities can build safer communities, rather than cracking down harder on areas that are already struggling.</p>
<p class="fine-print"><em><span>William Isaac is a statistical consultant for the Human Rights Data Analysis Group (HRDAG). The Human Rights Data Analysis Group is a non-profit, non-partisan organization that produces rigorous, scientific analyses of human rights violations around the world. As a non-profit project, HRDAG is primarily funded by private donors (please see our Funding page for more information: <a href="https://hrdag.org/funding/">https://hrdag.org/funding/</a>).</span></em></p><p class="fine-print"><em><span>Andi Dixon is a policy analyst for the Human Rights Data Analysis Group (HRDAG). The Human Rights Data Analysis Group is a non-profit, non-partisan organization that produces rigorous, scientific analyses of human rights violations around the world. As a non-profit project, HRDAG is primarily funded by private donors (please see our Funding page for more information: <a href="https://hrdag.org/funding/">https://hrdag.org/funding/</a>).</span></em></p>Crime data reflect only what crimes are identified by the police – not all the crimes that occur. So decisions based on crime data are necessarily biased and incompletely informed.William Isaac, Ph.D. Candidate in Political Science, Michigan State UniversityAndi Dixon, Ph.D. Student in Communications, Columbia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/622592016-07-08T22:46:03Z2016-07-08T22:46:03ZWhy is it so hard to improve American policing?<p>The use of lethal force by police officers in Minnesota and Baton Rouge has once again sparked protests over the violent dynamic between citizens and the police.</p>
<p>The ideal today is “democratic policing,” a concept developed by scholars like Gary T. Marx at MIT. Broadly, this <a href="http://web.mit.edu/gtmarx/www/dempol.html">refers to</a> a police force that is publicly accountable, subject to the rule of law, respectful of human dignity and that intrudes into citizens’ lives only under certain limited circumstances. </p>
<p>Partly in response to this ideal, policing in America has evolved considerably over the past 50 years. There have been changes in hiring, how relations with civilians are managed and what technologies are used. </p>
<p>The 20th century has seen a slow but <a href="http://www.newsweek.com/racial-makeup-police-departments-331130">steady integration</a> of minorities and women within police forces. Different managerial models aimed at improving relations with citizens have also influenced policing over the last 40 years. The most prominent among these are <a href="http://www.ojjdp.gov/mpg/litreviews/Community_and_Problem_Oriented_Policing.pdf">community-oriented policing</a>, <a href="http://www.ojjdp.gov/mpg/litreviews/Community_and_Problem_Oriented_Policing.pdf">problem-oriented policing</a> and <a href="https://www.ncjrs.gov/pdffiles1/bja/210681.pdf">intelligence-led policing</a>. </p>
<p>Policing has also been deeply transformed by the rapid integration of new technologies leading to computerization of police forces such as the profiling of crime hotspots, access to a broader range of weapons like tasers and the deployment of surveillance technologies like drones and closed circuit TV. </p>
<p>Some of these changes have been positive, but as recent events show, many problems remain. Why hasn’t more progress been made?</p>
<h2>Not all police forces are equal</h2>
<p>One problem is the inequality inherent in the system. For example, Washington, D.C. has <a href="http://www.governing.com/gov-data/safety-justice/police-officers-per-capita-rates-employment-for-city-departments.html">61.2 police officers</a> per 10,000 residents, while Baton Rouge has just 28.7.</p>
<p>Policing in America is not a standardized profession guided by an established set of procedures and policies. There are at least <a href="http://www.bjs.gov/index.cfm?ty=tp&tid=71">12,000 local</a> police agencies in the United States, making it one of the <a href="https://www.britannica.com/topic/police/Decentralized-police-organizations">most decentralized</a> police organizations in the world. </p>
<p>There are more than 600 state and local police academies across the country delivering training programs that vary <a href="http://www.bjs.gov/content/pub/pdf/slleta06.pdf">tremendously</a> in content, quality and intensity. This, inevitably, has an impact on the <a href="http://www.merlot.org/merlot/viewMaterial.htm?id=828673">skills</a> of their graduates. </p>
<p>Differences in policing also reflect the quality of leadership and the availability of resources. </p>
<p>Police chiefs and commanders represent a critical source of influence. They provide the doctrine by deciding whether to focus on prevention or repression of crime. They design strategies like police visibility or zero tolerance. And they identify the practice to be adopted – rounding up the usual suspects or systematic stop-and-frisk.</p>
<p>Often, however, these police practices are not aligned with public expectations. Citizen review boards – such as those in <a href="http://www.nyc.gov/html/ccrb/html/home/home.shtml">New York City</a> or <a href="https://www.sandiego.gov/city-clerk/boards-commissions/crb">San Diego</a> – are the exception rather than the norm. </p>
<p>And then there is the money issue. Police departments that are financially crippled are simply not able to provide regular training and therefore don’t have the expertise to pursue certain kinds of crime. The policing of fraud, for example, requires financial expertise and specialized units. </p>
<h2>From public relations policing to intensive policing</h2>
<p>Policing styles in America vary according to the targeted audience.</p>
<p>Police work in affluent neighborhoods is often characterized by “soft” policing strategies. In other words, policing in those areas is more a question of making people feel secure than of actual crime fighting. </p>
<p>However, in disadvantaged, multi-ethnic neighborhoods, police presence and activity are often <a href="http://amstat.tandfonline.com/doi/abs/10.1198/016214506000001040#/doi/abs/10.1198/016214506000001040">more intense</a>. They are there to target crimes that have been identified as priorities by police leadership and elected officials. </p>
<p>In fact, one policing model, <a href="http://www.nij.gov/topics/law-enforcement/strategies/predictive-policing/Pages/welcome.aspx">predictive policing</a>, can <a href="http://papers.ssrn.com/sol3/Papers.cfm?abstract_id=2050001">exacerbate racial tension</a> between law enforcement and African-American communities. </p>
<p>Predictive policing is based on crime analysis and computerization. This model helps law enforcement mobilize their resources in places where crime tends to concentrate. These crime clusters tend to be located in poor and disadvantaged communities. However, trying to prevent crime by focusing police forces on certain addresses, street corners and blocks increases police-citizen encounters. Some of these encounters – even between police and law-abiding citizens caught up in the dragnet – can turn violent.</p>
<p>Another noticeable trend that is front and center in the media today is the “militarization” of police. </p>
<p>This blurring of the distinction between the police and military institutions, between law enforcement and war, <a href="http://cjmasters.eku.edu/sites/cjmasters.eku.edu/files/21stmilitarization.pdf">began in the 1980s</a> and has only intensified since. It was reinforced by public policy rhetoric calling for a “war on crime,” “war on drugs” and “war on terror.” Police forces began to acquire military equipment and implement militarized training with little or no accountability. For instance, in the wake of 9/11, several local police departments received funding from the Department of <a href="http://www.thedailybeast.com/articles/2011/12/20/local-cops-ready-for-war-with-homeland-security-funded-military-weapons.html">Homeland Security</a> and the Department of Defense with little or no guidance on how to spend the money. This led to the unnecessary purchase of military equipment including armored cars, bulletproof vests for dogs and advanced bomb-disarming robots.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=335&fit=crop&dpr=1 600w, https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=335&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=335&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=421&fit=crop&dpr=1 754w, https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=421&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/129899/original/image-20160708-24087-1ss62cv.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=421&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>As a result, SWAT (Special Weapons and Tactics) teams have boomed: 80 percent of cities with 25,000 to 50,000 inhabitants now have one. Since the late 1990s, through the <a href="http://fas.org/sgp/crs/natsec/R43701.pdf">1033 Program</a>, the Department of Defense has authorized the transfer of military equipment to police departments across the country. <a href="http://www.nytimes.com/2014/06/09/us/war-gear-flows-to-police-departments.html?_r=0">Since 2006</a> the police have bought 93,763 machine guns and 435 armored cars from the Pentagon. All this has only heightened the real and perceived potential for deadly force by police officers. </p>
<h2>Now I see you</h2>
<p>Another significant change in modern policing is the increasing capacity to monitor criminal activity and the population in general.</p>
<p>Police agencies now have access to a vast network of closed-circuit television (CCTV) monitors, allowing the surveillance of public and private spaces. Just to give a few numbers, the Chicago Police Department has access to 17,000 cameras, including <a href="http://vintechnology.com/2011/05/04/top-5-cities-with-the-largest-surveillance-camera-networks/">4,000 in public schools and 1,000 at O’Hare Airport</a>.</p>
<p>Drones, too, are increasingly in use. The U.S. Border Patrol deploys them to monitor smuggling activities. They have been purchased by <a href="https://www.eff.org/deeplinks/2012/10/eff-and-muckrock-have-filed-over-200-public-records-requests-surveillance-drones">a number</a> of local police departments, including those in Los Angeles; Mesa County, Colorado; Montgomery County, Texas; Miami-Dade; and Seattle. </p>
<h2>A mirror of society</h2>
<p>In many regards, police agencies are a mirror of our beliefs and values as a society. </p>
<p>When applying this assumption to the phenomenon of intensive policing, it is not surprising, I would argue, that a country that has the highest rate of gun ownership among Western countries, the highest <a href="http://www.theguardian.com/news/datablog/2012/jul/22/gun-homicides-ownership-world-list">murder rate</a> by guns among advanced democracies and the largest military apparatus in the world would see a militarization of its police. </p>
<p>The same reflection can be made about the use of police surveillance technologies in a society where information technology increasingly defines our interactions. </p>
<p>Ultimately, policing is inseparable from politics. Police organizations are constantly influenced by political pressure, such as the nomination of a new chief of police or new laws that police must enforce. The state of our police system, in other words, for good or for ill, is an accurate proxy measure of the state of our democracy.</p>
<p><em>Editor’s Note: This story updates <a href="https://theconversation.com/democratic-policing-what-it-says-about-america-today-35066">Democratic policing: what it says about America today</a>.</em></p>
<p class="fine-print"><em><span>Frederic Lemieux does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>For 50 years, we have worked to make U.S. police more diverse and less intrusive. Why haven’t we made more progress?Frederic Lemieux, Professor and Program Director of Bachelor in Police and Security Studies; Master’s in Security and Safety Leadership; Master’s in Strategic Cyber Operations and Information Management, George Washington UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/483662015-11-16T11:10:05Z2015-11-16T11:10:05ZThe promise and perils of predictive policing based on big data<figure><img src="https://images.theconversation.com/files/101884/original/image-20151113-10435-fd2539.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Off to nab a would-be criminal?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/nifmus/2527982098">Steve Koukoulas</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Police departments, like everyone else, would like to be more effective while spending less. Given the tremendous attention to <a href="http://www.ibm.com/big-data/us/en/">big data</a> in recent years, and the <a href="https://hbr.org/2012/10/big-data-the-management-revolution/ar">value it has provided</a> in fields ranging from astronomy to medicine, it should be no surprise that police departments are using data analysis to inform deployment of scarce resources. Enter the era of what is called “<a href="http://www.huffingtonpost.com/news/predictive-policing/">predictive policing</a>.”</p>
<p>Some form of predictive policing is likely now in force in a city near you. <a href="http://www.memphisflyer.com/memphis/blue-crush-continues-to-help-mpd-combat-crime/Content?oid=3685313">Memphis</a> was an <a href="https://www.youtube.com/watch?v=JFrYg5wYMMg">early adopter</a>. Cities from <a href="https://www.ncjrs.gov/App/Publications/abstract.aspx?ID=255905">Minneapolis</a> to <a href="http://www.miamiherald.com/news/local/community/miami-dade/article19256145.html">Miami</a> have embraced predictive policing. Time magazine named predictive policing (with particular reference to the city of Santa Cruz) one of <a href="https://leb.fbi.gov/2013/april/predictive-policing-using-technology-to-reduce-crime">the 50 best inventions of 2011</a>. New York City Police Commissioner William Bratton recently said that predictive policing is “<a href="https://www.revealnews.org/article/predictive-policing-is-wave-of-the-future-ny-commissioner-says/">the wave of the future</a>.”</p>
<p>The term “predictive policing” suggests that the police can anticipate a crime and be there to stop it before it happens and/or apprehend the culprits right away. As the <a href="http://articles.latimes.com/2010/aug/21/local/la-me-predictcrime-20100427-1">Los Angeles Times points out</a>, it depends on “sophisticated computer analysis of information about previous crimes, to predict where and when crimes will occur.”</p>
<p>At a very basic level, it’s easy for anyone to read a crime map and identify neighborhoods with higher crime rates. It’s also easy to recognize that burglars tend to target businesses at night, when they are unoccupied, and to target homes during the day, when residents are away at work. The challenge is to take a combination of dozens of such factors to determine where crimes are more likely to happen and who is more likely to commit them. Predictive policing algorithms are getting increasingly good at such analysis. Indeed, such was the premise of the movie Minority Report, in which the police can arrest and convict murderers before they commit their crime.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=851&fit=crop&dpr=1 600w, https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=851&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=851&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1069&fit=crop&dpr=1 754w, https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1069&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/101885/original/image-20151113-10401-6g0yf5.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1069&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Tom Cruise, out to bust pre-criminals.</span>
<span class="attribution"><span class="source">Twentieth Century Fox</span></span>
</figcaption>
</figure>
<p>Predicting a crime with certainty is something that science fiction can have a field day with. But as a data scientist, I can assure you that in reality we can come nowhere close to certainty, even with advanced technology. To begin with, <a href="http://theconversation.com/big-data-analyses-depend-on-starting-with-clean-data-points-43687">predictions can be only as good as the input data</a>, and quite often these input data have errors.</p>
<p>But even with perfect, error-free input data and <a href="https://theconversation.com/big-data-algorithms-can-discriminate-and-its-not-clear-what-to-do-about-it-45849">unbiased processing</a>, ultimately what the algorithms are determining are correlations. Even if we have perfect knowledge of your troubled childhood, your socializing with gang members, your lack of steady employment, your wacko posts on social media and your recent gun purchases, all that the best algorithm can do is to say it is likely, but not certain, that you will commit a violent crime. After all, to believe such predictions as guaranteed is to deny free will. </p>
<h2>Feed in data, get out probabilities</h2>
<p>What data can do is give us probabilities, rather than certainty. Good data coupled with good analysis can give us very good estimates of probability. If you sum probabilities over many instances, you can usually get a robust estimate of the total.</p>
<p>For example, data analysis can provide a probability that a particular house will be broken into on a particular day based on historical records for similar houses in that neighborhood on similar days. An insurance company may add this up over all days in a year to decide how much to charge for insuring that house.</p>
<p>A police department may add up these probabilities across all houses in a neighborhood to estimate how likely it is that there will be a burglary in that neighborhood. They can then place more officers in neighborhoods with higher probabilities for crime with the idea that police presence may deter crime. This seems like a win all around: less crime and targeted use of police resources. Indeed the statistics, in terms of <a href="http://www.universityofcalifornia.edu/news/5493/predictive-policing-test-substantially-reduces-crime">reduced crime rates</a>, support our intuitive expectations.</p>
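<p>A toy worked example of that aggregation, with entirely made-up probabilities and neighborhood names, shows how tiny per-house risks add up to an actionable expected count:</p>
<pre><code># Toy example of summing per-house probabilities into an expected count per
# neighborhood (all probabilities and names invented for illustration).
per_house_prob = {
    "Northside": [0.002] * 400,   # 400 houses, 0.2% risk of break-in tonight
    "Riverview": [0.0005] * 600,  # 600 houses, 0.05% risk each
}
for neighborhood, probs in per_house_prob.items():
    print(f"{neighborhood}: expected {sum(probs):.2f} burglaries tonight")
# Northside: expected 0.80 burglaries tonight
# Riverview: expected 0.30 burglaries tonight
# A department might weight patrols toward Northside on this basis, even
# though no individual house there is likely to be hit.
</code></pre>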
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=334&fit=crop&dpr=1 600w, https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=334&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=334&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=420&fit=crop&dpr=1 754w, https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=420&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/101886/original/image-20151113-10438-ucyqr5.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=420&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Predictive policing goes beyond looking only at where crime has already occurred.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/brettlider/323381483">Brett Lider</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<h2>Likely doesn’t mean definitely</h2>
<p>Similar arguments can be used in multiple arenas where we’re faced with limited resources. Realistically, customs agents cannot thoroughly search every passenger and every bag. Tax authorities cannot audit every tax return. So they target the “most likely” culprits. But likelihood is very far from certainty: all the authorities know is that the odds are higher. Undoubtedly many innocent individuals are labeled “likely.” If you’re innocent but get targeted, it can be a big hassle, or worse.</p>
<p>Incorrectly targeted individuals may be inconvenienced by a customs search, but predictive policing can do real harm. Consider the case of Tyrone Brown, recently <a href="http://www.nytimes.com/2015/09/25/us/police-program-aims-to-pinpoint-those-most-likely-to-commit-crimes.html?_r=0">reported in The New York Times</a>. He was specifically targeted for attention by the Kansas City police because he was friends with known gang members. In other words, the algorithm picked him out as having a higher likelihood of committing a crime based on the company he kept. They told him he was being watched and would be dealt with severely if he slipped up. </p>
<p>The algorithm didn’t “make a mistake” in picking out someone like Tyrone Brown. It may have correctly determined that Tyrone was more likely to commit a murder than you or I. But that is very different from saying that he did (or will) kill someone.</p>
<p>Suppose there’s a one-in-a-million chance that a typical citizen will commit a murder, but there is a one-in-a-thousand chance that Tyrone will. That makes him a thousand times as likely to commit a murder as a typical citizen. So it makes sense statistically for the police to focus their attention on him. But don’t forget that there is only a one-in-a-thousand chance that he commits a murder. For a thousand such “suspect” Tyrones, there is only one who is a murderer and 999 who are innocent. How much are we willing to inconvenience or harm the 999 to stop the one?</p>
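<p>Spelling that arithmetic out (the one-in-a-thousand and one-in-a-million figures are this article’s illustrative assumptions, not real statistics):</p>
<pre><code># The base-rate arithmetic above, spelled out (the one-in-a-thousand and
# one-in-a-million figures are the article's illustrative assumptions).
flagged_people = 1000            # people the algorithm singles out
p_flagged = 1 / 1000             # chance a flagged person actually offends
p_typical = 1 / 1_000_000        # chance for an average citizen

print(f"relative risk: {p_flagged / p_typical:.0f}x")                              # 1000x
print(f"expected offenders among the flagged: {flagged_people * p_flagged:.0f}")   # 1
print(f"flagged people who never offend: {flagged_people * (1 - p_flagged):.0f}")  # 999
</code></pre>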
<p>Kansas City is far from being alone in this sort of preemptive contact with citizens identified as “likely to commit crimes.”
Last year, there was <a href="http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist">considerable controversy</a> over a similar program in Chicago.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=402&fit=crop&dpr=1 600w, https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=402&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=402&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=505&fit=crop&dpr=1 754w, https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=505&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/101887/original/image-20151113-10427-6wyr4f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=505&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Preventing crime is the goal, but are personal freedoms a casualty?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/ibm_media/7362224408">ibmphoto24</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<h2>Balancing crime reduction with civil rights</h2>
<p>Such tactics, even if effective in reducing crime, raise <a href="http://www.innocenceproject.org/news-events-exonerations/police-departments-employ-controversial-201cpredictive-policing201d-tactics">civil liberty concerns</a>. Suppose you fit the profile of a bad driver and have accumulated points on your driving record. Consider how you would feel if you had a patrol car follow you every time you got behind the wheel. Even worse, it’s likely, even if you’re doing your best, that you will make an occasional mistake. For most of us, rolling through a stop sign or driving five miles above the speed limit is usually of little consequence. But since you have a cop following you, you get a ticket for every small offense. In consequence, you end up with an even worse driving record. </p>
<p>Yes, data can help make predictions, and these predictions can help police deploy their resources more effectively. But we must remember that a probabilistic prediction is not certainty, and we must explicitly consider the harm to innocent people when we take actions based on probabilities. More broadly speaking, data science can bring us many benefits, but care is required to make sure that it does so <a href="http://www.bigdatadialog.com/fairness/">in a fair manner</a>.</p>
<p class="fine-print"><em><span>H V Jagadish's research on Big Data is funded in part by the National Science Foundation and the National Institutes of Health.</span></em></p>Preventing crime before it happens, while saving resources, sounds like a great use of big data. But these calculated probabilities raise big questions about civil liberties.H.V. Jagadish, Bernard A Galler Collegiate Professor of Electrical Engineering and Computer Science, University of MichiganLicensed as Creative Commons – attribution, no derivatives.