<h1>Face recognition technology follows a long analog history of surveillance and control based on identifying physical features</h1>
<figure><img src="https://images.theconversation.com/files/569962/original/file-20240117-29-ri412u.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5272%2C3598&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Today's technology advances what passport control has been doing for more than a century.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/controll-of-passports-at-the-frontiers-between-beuthen-and-news-photo/548866047">ullstein bild via Getty Images</a></span></figcaption></figure>
<p>American Amara Majeed was <a href="https://www.bbc.com/news/world-asia-48061811">accused of terrorism</a> by the Sri Lankan police in 2019. Robert Williams was <a href="https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html">arrested outside his house</a> in Detroit and detained in jail for 18 hours for allegedly stealing watches in 2020. Randal Reid <a href="https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html">spent six days in jail</a> in 2022 for supposedly using stolen credit cards in a state he’d never even visited.</p>
<p>In all three cases, the authorities had the wrong people. In all three, it was face recognition technology that told them they were right. Law enforcement officers in many U.S. states are <a href="https://www.wired.com/story/hidden-role-facial-recognition-tech-arrests/">not required to reveal</a> that they used face recognition technology to identify suspects.</p>
<p>Face recognition technology is the latest and most sophisticated version of <a href="https://www.dhs.gov/biometrics">biometric surveillance</a>: using unique physical characteristics to identify individual people. It stands in a <a href="https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/inspired/history-of-biometric-authentication">long line of technologies</a> – from the fingerprint to the passport photo to iris scans – designed to monitor people and determine who has the right to move freely within and across borders and boundaries.</p>
<p>In my book, “<a href="https://www.press.jhu.edu/books/title/12700/do-i-know-you">Do I Know You? From Face Blindness to Super Recognition</a>,” I explore how the story of face surveillance lies not just in the history of computing but in the history of medicine, of race, of psychology and neuroscience, and in the health humanities and politics.</p>
<p>Viewed as part of the long history of people-tracking, face recognition technology’s incursions into privacy and limitations on free movement carry out exactly what biometric surveillance was always meant to do.</p>
<p>The system works by converting captured faces – either static from photographs or moving from video – into a series of unique data points, which it then compares against the data points drawn from images of faces already in the system. As face recognition technology improves in accuracy and speed, its effectiveness as a means of surveillance becomes ever more pronounced.</p>
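<p>In code, that comparison stage reduces to measuring distances between numeric vectors. The sketch below illustrates the idea, assuming a separate model has already converted each face into an embedding vector; the cosine measure, the gallery structure and the 0.6 threshold are illustrative assumptions, not any particular vendor's design.</p>
<pre><code>import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, gallery: dict,
               threshold: float = 0.6):
    """Return the enrolled identity most similar to the probe face,
    or None if no stored face clears the decision threshold."""
    scores = {name: cosine_similarity(probe, emb)
              for name, emb in gallery.items()}
    name = max(scores, key=scores.get)
    return name if scores[name] >= threshold else None
</code></pre>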
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="faces in a crowd highlighted and annotated with dates and times" src="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Paired with AI, face recognition technology scans the crowd at a conference.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/live-demonstration-uses-artificial-intelligence-and-facial-news-photo/1080200068">David McNew/AFP via Getty Images</a></span>
</figcaption>
</figure>
<h2>Accuracy improves, but biases persist</h2>
<p>Surveillance is predicated on the idea that <a href="https://theconversation.com/surveillance-is-pervasive-yes-you-are-being-watched-even-if-no-one-is-looking-for-you-187139">people need to be tracked</a> and their movements limited and controlled in a trade-off between privacy and security. The assumption that less privacy leads to more security is built in.</p>
<p>That may be the case for some, but not for the people disproportionately targeted by face recognition technology. <a href="https://www.routledge.com/Histories-of-Surveillance-from-Antiquity-to-the-Digital-Era-The-Eyes-and/Marklund-Skouvig/p/book/9781032021539">Surveillance has always been designed</a> to identify the people whom those in power wish to most closely track.</p>
<p>On a global scale, <a href="https://doi.org/10.1080/21670811.2018.1493938">there are</a> <a href="https://longreads.tni.org/stateofpower/settled-habits-new-tricks-casteist-policing-meets-big-tech-in-india">caste cameras in India</a>, <a href="https://www.theguardian.com/world/2021/sep/30/uyghur-tribunal-testimony-surveillance-china">face surveillance of Uyghurs in China</a> and even <a href="https://mynbc15.com/news/spotlight-on-america/facial-recognition-technology-in-school-hallways-states-face-a-divisive-debate">attendance surveillance</a> <a href="https://dx.doi.org/10.7302/21934">in U.S. schools</a>, often with low-income and majority-Black populations. <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist">Some people are tracked more closely</a> than others.</p>
<p>In addition, the cases of Amara Majeed, Robert Williams and Randal Reid <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist">aren’t anomalies</a>. As of 2019, face recognition technology <a href="https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf">misidentified Black and Asian people</a> at up to <a href="https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software">100 times the rate of white people</a>. In 2018, Amazon’s Rekognition tool falsely matched <a href="https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28">28 members of the U.S. Congress</a> – disproportionately people of color – with mug shots on file.</p>
<p>When the database against which captured images were compared had only a limited number of mostly white faces upon which to draw, face recognition technology would offer matches based on the closest alignment available, leading to a pattern of highly racialized – and racist – false positives.</p>
<p>With the expansion of images in the database and increased sophistication of the software, <a href="https://www.csis.org/blogs/strategic-technologies-blog/how-accurate-are-facial-recognition-systems-and-why-does-it">the number of false positives</a> – incorrect matches between specific individuals and images of wanted people on file – has <a href="https://bipartisanpolicy.org/blog/frt-accuracy-performance/">declined dramatically</a>. Improvements in pixelation and mapping static images into moving ones, along with increased social media tagging and <a href="https://www.penguinrandomhouse.com/books/691288/your-face-belongs-to-us-by-kashmir-hill/">ever more sophisticated scraping tools</a> like those developed by Clearview AI, have helped decrease the error rates.</p>
<p><a href="https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/">The biases</a>, however, remain deeply embedded into the systems and their purpose, explicitly or implicitly targeting already targeted communities. The technology is not neutral, nor is the surveillance it is used to carry out.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Pen and ink illustration of suited hands using calipers to measure a man's forehead to back of his head" src="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=458&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=458&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=458&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=576&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=576&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=576&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Physiognomy went beyond recognition of an individual and tried to connect physical features with other characteristics.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/head-royalty-free-illustration/1399373778">clu/DigitalVision Vectors via Getty Images</a></span>
</figcaption>
</figure>
<h2>Latest technique in a long history</h2>
<p>Face recognition software is only the most recent manifestation of global systems of tracking and sorting. Precursors are rooted in the now-debunked belief that bodily features offer a unique index to character and identity. This pseudoscience was formalized in the late 18th century under the rubric of the <a href="https://www.hup.harvard.edu/books/9780674036048">ancient practice of physiognomy</a>.</p>
<p>Early systematic applications included anthropometry (body measurement), fingerprinting and iris or retinal scans. Each offered a unique identifier, but none could be collected without the participation – willing or otherwise – of the person being tracked.</p>
<p>The framework of bodily identification was adopted in the 19th century by criminal justice systems for detection, prosecution and record-keeping, allowing governments to keep track of their populaces. The intimate relationship between face recognition and border patrol was galvanized by the <a href="http://www.atlasobscura.com/articles/passport-photos-history-development-regulation-mugshots">introduction of photos into passports</a> in countries including Great Britain and the United States in 1914, <a href="https://doi.org/10.1017/9781108664271">a practice that became widespread by 1920</a>.</p>
<p>Face recognition technology offered a way to make biometric surveillance covert: unlike its predecessors, it can be carried out without the subject’s knowledge or participation. Much early research into face recognition software was <a href="https://www.wired.com/story/secret-history-facial-recognition/">funded by the CIA</a> for the purposes of border surveillance.</p>
<p>That research sought to develop a standardized framework for face segmentation: mapping the distances between a person’s facial features, including eyes, nose, mouth and hairline. Inputting that data into computers let a user search stored photographs for a match. These early scans and maps were limited, and attempts to match them were not successful.</p>
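<p>A toy reconstruction of that early approach might look like the following; the landmark set and the nearest-record search are hypothetical simplifications, for illustration only.</p>
<pre><code>import math
from itertools import combinations

LANDMARKS = ["left_eye", "right_eye", "nose_tip", "mouth_center", "hairline"]

def distance_signature(points: dict) -> list:
    """Reduce a face to the distances between every pair of landmarks."""
    return [math.dist(points[a], points[b])
            for a, b in combinations(LANDMARKS, 2)]

def closest_record(probe: dict, records: dict) -> str:
    """Search stored signatures for the nearest match (Euclidean)."""
    sig = distance_signature(probe)
    return min(records, key=lambda name: math.dist(records[name], sig))
</code></pre>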
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Woman looks at screen with her image on a vending machine" src="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A customer pays via facial recognition at a smart store in China.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/nov-6-2018-a-visitor-tries-facial-recognition-payment-in-a-news-photo/1058496364">Huang Zongzhi/Xinhua News Agency via Getty Images</a></span>
</figcaption>
</figure>
<p>More recently, private companies have <a href="https://fortune.com/longform/facial-recognition/">adopted data harvesting techniques</a>, including face recognition, as part of a long practice of <a href="https://theconversation.com/data-brokers-know-everything-about-you-what-ftc-case-against-ad-tech-giant-kochava-reveals-218232">leveraging personal data for profit</a>.</p>
<p>Face recognition technology works not only to unlock your phone or help you board your plane more quickly, but also in promotional store kiosks and, essentially, in any photo taken and shared by anyone, with anyone, anywhere around the world. These photos are stored in a database, creating ever more comprehensive systems of surveillance and tracking.</p>
<p>And while that means that today it is unlikely that Amara Majeed, Robert Williams, Randal Reid and Black members of Congress would be ensnared by a false positive, face recognition technology has invaded everyone’s privacy. It – and the governmental and private systems that design, run, use and capitalize upon it – is watching, and paying particular attention to those whom society and its structural biases deem to be the greatest risk.</p>
<p class="fine-print"><em><span>Sharrona Pearl receives funding from Interfaith America.</span></em></p>
<p class="fine-print"><em>Sharrona Pearl, Associate Professor of Bioethics and History, Drexel University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>How to hide from a drone – the subtle art of ‘ghosting’ in the age of surveillance</h1>
<figure><img src="https://images.theconversation.com/files/349270/original/file-20200723-19-1selkhc.jpg?ixlib=rb-1.1.0&rect=0%2C204%2C4252%2C2346&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The federal government has used military-grade border patrol drones like this one to monitor protests in US cities.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/joncutrer/43252568250/">Jonathan Cutrer/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure>
<p>Drones of all sizes are being used by environmental advocates to monitor deforestation, by conservationists to track poachers, and by journalists and activists to document large protests. As a <a href="https://scholar.google.com/citations?user=MEUtCZYAAAAJ&hl=en">political sociologist</a> who studies social movements and drones, I document a wide range of nonviolent and pro-social drone uses in my new book, “<a href="https://mitpress.mit.edu/books/good-drone">The Good Drone</a>.” I show that these efforts have the potential to democratize surveillance.</p>
<p>But when the Department of Homeland Security redirects large, fixed-wing drones from the U.S.-Mexico border to <a href="https://www.nytimes.com/2020/06/19/us/politics/george-floyd-protests-surveillance.html">monitor protests</a>, and when towns experiment with using drones to <a href="https://www.nbcnews.com/news/us-news/connecticut-town-tests-pandemic-drone-detect-fevers-experts-question-if-n1189546">test people for fevers</a>, it’s time to think about how many eyes are in the sky and how to avoid unwanted aerial surveillance. One way that’s within reach of nearly everyone is learning how to simply disappear from view.</p>
<h2>Crowded skies</h2>
<p>Over the past decade there’s been an explosion in the public’s use of drones – everyday people with everyday tech doing <a href="https://digital.sandiego.edu/gdl2016report/1/">interesting things</a>. As drones enter already-crowded airspace, the Federal Aviation Administration is <a href="https://doi.org/10.15394/ijaaa.2020.1453">struggling to respond</a>. The near future is likely to see even more of these devices in the sky, flown by an ever-growing cast of social, political and economic actors. </p>
<figure class="align-center ">
<img alt="small drone over a city street" src="https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A law enforcement drone flew over demonstrators, Friday, June 5, 2020, in Atlanta.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/America-Protest-Atlanta/db14ae07df09454398c3fb94439453a4/16/0">AP Photo/Mike Stewart</a></span>
</figcaption>
</figure>
<p>Public opinion about the use and spread of drones is still <a href="https://theconversation.com/dont-shoot-that-drone-overhead-probably-isnt-invading-your-privacy-114701">up in the air</a>, but burgeoning drone use has sparked numerous efforts to curtail drones. These responses range from public policies exerting community control over local airspace, to the development of sophisticated jamming equipment and tactics for knocking drones out of the sky. </p>
<p>From startups to major defense contractors, there is a scramble to deny airspace to drones, to hijack drones digitally, to control drones physically and to shoot drones down. Anti-drone measures range from simple blunt force, <a href="https://www.popularmechanics.com/flight/drones/how-to/a16756/how-to-shoot-down-a-drone/">10-gauge shotguns</a>, to the poetic: <a href="https://www.washingtonpost.com/news/worldviews/wp/2016/02/01/trained-eagle-destroys-drone-in-dutch-police-video/">well-trained hawks</a>. </p>
<p>Many of these anti-drone measures are expensive and complicated. Some are illegal. The most affordable – and legal – way to avoid drone technology is <a href="http://www.dronesurvivalguide.org/">hiding</a>.</p>
<h2>How to disappear</h2>
<p>The first thing you can do to hide from a drone is to take advantage of the natural and built environment. It’s possible to wait for bad weather, since smaller devices like those used by local police have a hard time flying in high winds, dense fogs and heavy rains. </p>
<p>Trees, walls, alcoves and tunnels are more reliable than the weather, and they offer shelter from the high-flying drones used by the Department of Homeland Security.</p>
<figure class="align-center ">
<img alt="Silhouettes of drones" src="https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=418&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=418&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=418&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=525&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=525&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=525&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">In some parts of the world, hiding from drones is a matter of life and death.</span>
<span class="attribution"><a class="source" href="http://www.dronesurvivalguide.org/">Drone Survival Guide</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>The second thing you can do is minimize your digital footprints. It’s smart to avoid using wireless devices like mobile phones or GPS systems, since they have digital signatures that can reveal your location. This is useful for evading drones, but is also important for avoiding other privacy-invading technologies.</p>
<p>The third thing you can do is confuse a drone. Placing mirrors on the ground, standing over broken glass, and wearing elaborate headgear, <a href="https://www.theguardian.com/technology/2017/jan/04/anti-surveillance-clothing-facial-recognition-hyperface">machine-readable blankets</a> or <a href="https://projectkovr.com/">sensor-jamming jackets</a> can break up and distort the image a drone sees. </p>
<p>Mannequins and other forms of mimicry can confuse both on-board sensors and the analysts charged with monitoring the drone’s video and sensor feeds. </p>
<p>Drones equipped with infrared sensors will see right through the mannequin trick, but are confused by tactics that mask the body’s temperature. For example, a space blanket will mask significant amounts of the body’s heat, as will simply hiding in an area that matches the body’s temperature, like a building or sidewalk exhaust vent.</p>
<p>The fourth, and most practical, thing you can do to protect yourself from drone surveillance is to get a disguise. The growth of mass surveillance has led to an explosion in creative experiments meant to mask one’s identity. But some of the smartest ideas are decidedly old-school and low-tech. Clothing is the first choice, because hats, glasses, masks and scarves go a long way toward scrambling drone-based facial-recognition software. </p>
<figure class="align-center ">
<img alt="Facial makeup chart" src="https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=518&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=518&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=518&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=651&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=651&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=651&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Clever use of makeup can thwart facial recognition systems.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/johnbullas/4591293468/">John C Bullas BSc MSc PhD MCIHT MIAT/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<p>Your gait is as unique as your fingerprint. As gait-recognition software evolves, it will be important to also mask the key pivot points used in identifying the walker. It may be that the best response is affecting a limp, using a minor leg brace or wearing extremely loose clothing.</p>
<p>Artists and scientists have taken these approaches a step further, developing a <a href="https://weburbanist.com/2013/04/01/stealth-wear-counter-surveillance-fashion-protects-privacy/">hoodie wrap</a> that’s intended to shield the owner’s heat signature and to scramble facial recognition software, and <a href="https://www.chicagotribune.com/business/ct-biz-facial-recognition-blocking-glasses-privacy-20200417-isy77jwrsncoholhndmyifadr4-story.html">glasses</a> intended to foil facial recognition systems. </p>
<h2>Keep an umbrella handy</h2>
<p>These innovations are alluring, but umbrellas may prove to be the most ubiquitous and robust tactic in this list. They’re affordable, easy to carry, hard to see around and can be disposed of in a hurry. Plus you can build a <a href="http://survival.sentientcity.net/umbrella.html">high-tech one</a>, if you want.</p>
<p>It would be nice to live in a world with fewer impositions on privacy, one in which law enforcement did not use small quadcopters and the Department of Homeland Security did not redeploy large Predator drones to surveil protesters. And, for people in some parts of the world, it would be nice not to associate the sound of a drone with impending missile fire. But given that those eyes are in the sky, it’s good to know how to hide. </p>
<p class="fine-print"><em><span>Austin Choi-Fitzpatrick has previously won an industry award from drone manufacturer DJI, and his work has been supported through the National Science Foundation. MIT Press provides funding as a member of The Conversation US.</span></em></p>Avoiding drones’ prying eyes can be as complicated as donning a high-tech hoodie and as simple as ducking under a tree.Austin Choi-Fitzpatrick, Associate Professor of Political Sociology, University of San DiegoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1426082020-07-20T16:28:22Z2020-07-20T16:28:22ZWhy facial recognition algorithms can’t be perfectly fair<figure><img src="https://images.theconversation.com/files/348442/original/file-20200720-102864-l4tc05.jpg?ixlib=rb-1.1.0&rect=0%2C5%2C3456%2C2374&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Who's who?</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/obZx1LjKKjc">Timon Studler/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>In June 2020, a facial recognition algorithm led to the <a href="https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html">wrongful arrest of Robert Williams</a>, an African-American, for a crime he did not commit. After a shoplifting incident in in a pricey area of Detroit, Michigan, his driver’s license photo was wrongly matched with a blurry video of the perpetrator. Police released him after several hours and apologised, but the episode raises serious questions about the accuracy of visual recognition algorithms.</p>
<p>The troubling aspect of the story is that facial recognition algorithms have been shown to be <a href="https://www.nytimes.com/2019/01/24/technology/amazon-facial-technology-study.html">less accurate for black faces than for white ones</a>. But why do facial recognition algorithms make more mistakes for Blacks than whites, and what can be done about it? </p>
<h2>To err is human… and algorithmic</h2>
<p>Like any prediction algorithm, facial recognition algorithms make <a href="https://en.wikipedia.org/wiki/Probabilistic_forecasting">probabilistic predictions</a> based on incomplete data – a blurry photo, for example. Such predictions are never completely error-free, nor can they be. Since errors always exist, the question is what is an acceptable level of errors, what kind of errors should be prioritised, and whether you need a strictly identical error rate for every population group.</p>
<p>Facial recognition algorithms produce two kinds of errors: false positives and false negatives. The first occur when the algorithm thinks there’s a positive match between two facial images, but in fact there is no match (this was the case for Robert Williams). The second take place when the algorithm says there’s no match, but in fact there should be one.</p>
<p>The consequences of these two errors are different depending on the situation. For example, if the police use a facial recognition algorithm in their efforts to locate a fugitive, a false positive can lead to the wrongful arrest of an innocent person. Alternatively, when border-control authorities use facial recognition to determine if a person matches the passport he or she carries, a false positive will lead to the impostor crossing the border with a stolen passport. Each case requires a determination of the cost of different kinds of errors, and a decision on which kind of errors to prioritise. For example, if police are tracking potentially violent suspects, they may want to reduce the number of false negatives so the suspects are less likely to slip through, but this would <a href="https://fra.europa.eu/en/publication/2019/facial-recognition-technology-fundamental-rights-considerations-context-law">drive up the number of false positives</a> – in other words, people falsely arrested.</p>
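<p>The trade-off lives in a single number: the similarity score above which the system declares a match. A small synthetic example (the scores and ground-truth labels below are invented for illustration) shows how moving that threshold exchanges one error type for the other.</p>
<pre><code>def error_rates(scores, is_same_person, threshold):
    """Count both error types at a given decision threshold."""
    fp = sum(s >= threshold and not same
             for s, same in zip(scores, is_same_person))
    fn = sum(s < threshold and same
             for s, same in zip(scores, is_same_person))
    return fp, fn

scores = [0.91, 0.88, 0.75, 0.62, 0.58, 0.41]    # similarity scores
truth  = [True, True, False, True, False, False]  # truly the same person?
for t in (0.5, 0.7, 0.9):
    fp, fn = error_rates(scores, truth, t)
    print(f"threshold={t}: false positives={fp}, false negatives={fn}")
# threshold=0.5: false positives=2, false negatives=0
# threshold=0.7: false positives=1, false negatives=1
# threshold=0.9: false positives=0, false negatives=2
</code></pre>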
<h2>Race and technology</h2>
<p>Racism can arise when there is a higher rate of error, either false negative or false positive, for a subset of a population – for example, Blacks in the United States. These differential error rates are not programmed into the algorithm – if they were, it would be manifestly illegal. Instead, they slip in during the design and “training” process. Most developers send their algorithms to the U.S. <a href="https://www.nist.gov/speech-testimony/facial-recognition-technology-frt-0">National Institute of Standards and Technology</a> (NIST) to be tested for differential error rates over different parts of the population. NIST uses a large US government database of passport and visa photos, and tests each algorithm on different nationalities. NIST <a href="https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf">publishes the results</a>, which show huge error-rate variations for certain nationalities.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&rect=0%2C17%2C1500%2C940&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=384&fit=crop&dpr=1 600w, https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=384&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=384&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=483&fit=crop&dpr=1 754w, https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=483&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/348416/original/file-20200720-37-mvxp8d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=483&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Jonathan McIntosh/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>This kind of differential performance can be due to inadequate training data or an intrinsic limitation of the learning algorithm itself. If the training data contains a million examples of white males, but only two examples of black females, the learning algorithm will have difficulty distinguishing the faces of black females. The way to correct this is either to have training data that is representative of the entire population (which is nearly impossible), or to give different weights to the data in the training set to simulate the proportions that would exist in a data set covering the whole population.</p>
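<p>One common way to implement that reweighting, sketched here with invented group labels, is to weight each training example inversely to its group's frequency, so that every group contributes equally in total:</p>
<pre><code>from collections import Counter

def balanced_weights(group_labels: list) -> list:
    """Weight each example so every group carries equal total weight."""
    counts = Counter(group_labels)
    total, n_groups = len(group_labels), len(counts)
    return [total / (n_groups * counts[g]) for g in group_labels]

labels = ["group_a"] * 6 + ["group_b"] * 2
print(balanced_weights(labels))
# group_a examples each weigh ~0.67, group_b examples each weigh 2.0,
# so both groups contribute a total weight of 4.0
</code></pre>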
<p>Inadequate training data is not the sole cause of differential performance. Some algorithms have intrinsic difficulties extracting unique features from certain kinds of faces. For example, infants’ faces tend to look alike and thus are notoriously hard to distinguish from each other. Some algorithms do better than others when shown only a few training examples, but if these fixes don’t work, it may be possible to impose a “fairness constraint”: a rule that forces the algorithm to equalize performance among different population groups.</p>
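<p>One simple way to realise such a constraint – an illustrative choice, by no means the only one – is to calibrate a separate decision threshold for each group so that all groups end up with the same false-positive rate. The acceptance rule and target rate below are assumptions for the sketch.</p>
<pre><code>def threshold_for_fpr(impostor_scores: list, target_fpr: float) -> float:
    """Pick a threshold t so that accepting scores above t wrongly
    accepts at most target_fpr of impostor (non-matching) pairs.
    Assumes a non-empty score list and 0 <= target_fpr < 1."""
    ordered = sorted(impostor_scores, reverse=True)
    allowed = int(target_fpr * len(ordered))  # impostors we may accept
    return ordered[allowed]

def per_group_thresholds(scores_by_group: dict,
                         target_fpr: float = 0.01) -> dict:
    """Equalise false-positive rates by calibrating each group separately."""
    return {group: threshold_for_fpr(scores, target_fpr)
            for group, scores in scores_by_group.items()}
</code></pre>
<p>Calibrating per group, of course, requires knowing each person’s group, which leads directly to the data-collection dilemma discussed below.</p>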
<p>Unfortunately, this can have the effect of bringing down the level of performance for other groups, potentially to an unacceptable level. If we impose a fairness constraint, we also need to identify which population groups should be covered. Should a facial recognition algorithm treat every possible skin colour and ethnic origin alike, including relatively small population groups? You can break down the population into an almost unlimited number of subgroups.</p>
<p>And what level of difference in performance can be tolerated between groups – do they have to be identical, or can we tolerate a certain percentage differential? And what is the effect of fairness constraints on algorithmic performance? Indeed, a perfectly nondiscriminatory facial recognition algorithm may be perfectly useless. </p>
<p>As a society, we make trade-offs like this every day. For algorithms, these trade-offs must be explicit: “less than perfect” fairness becomes an explicit design choice.</p>
<h2>Ethical concerns</h2>
<p>Another uncomfortable tradeoff is whether we allow data on ethnicity or skin colour to be collected and used to help make algorithms less discriminatory. Europe generally prohibits the collection of data on ethnicity, and for good reason. Databases on ethnicity helped Nazis and cooperating governments locate and murder 6 million Jews in the 1940s. Yet data on ethnicity or skin colour can help make algorithms less racist. The data can help test algorithms for differential treatment, by permitting a test on only black- or brown-skinned individuals. Also, a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3329669">“racially aware” algorithm</a> can learn to compensate for discrimination by creating separate models for different population groups, for example a “black” model and a “non-black” model. But this runs against an important principle espoused in France and in other countries that rules should be colour blind.</p>
<p>If perfect fairness is impossible, should facial recognition technology be prohibited? Certain cities in the United States have imposed a moratorium on police use of facial recognition until issues of reliability and discrimination can be sorted out. The state of Washington has enacted a law to require <a href="https://www.reuters.com/article/us-washington-tech/washington-state-signs-facial-recognition-curbs-into-law-critics-want-ban-idUSKBN21I3AS">testing and strict regulatory oversight for police use of facial recognition</a>.</p>
<p>One aspect of the law is to require a study of differential impacts of the system on different subgroups of the population, and an obligation to introduce mitigation measures to correct performance differentials. Regulation, not prohibition, is the right approach, but regulation will require us to make a series of explicit choices that we’re not used to making, including the key question of how fair is “fair enough”.</p>
<p class="fine-print"><em><span>The authors do not work for, consult for, own shares in or receive funding from any organisation that would benefit from this article, and have disclosed no affiliations other than their research institutions.</span></em></p>
<p class="fine-print"><em>Winston Maxwell, Director of Studies, Law and Digital Technology, Télécom Paris – Institut Mines-Télécom; Stephan Clémençon, Teacher/researcher in applied maths, Télécom Paris – Institut Mines-Télécom. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Large-scale facial recognition is incompatible with a free society</h1>
<figure><img src="https://images.theconversation.com/files/346762/original/file-20200710-87067-1g5fm9c.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5590%2C3690&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure>
<p>In the US, tireless <a href="https://www.technologyreview.com/2020/06/12/1003482/amazon-stopped-selling-police-face-recognition-fight/">opposition</a> to state use of facial recognition algorithms has recently won some victories.</p>
<p>Some progressive cities have <a href="https://edition.cnn.com/2019/07/17/tech/cities-ban-facial-recognition/index.html">banned</a> some uses of the technology. <a href="https://techcrunch.com/2020/06/08/ibm-ends-all-facial-recognition-work-as-ceo-calls-out-bias-and-inequality/">Three</a> <a href="https://www.usatoday.com/story/news/nation/2020/06/10/george-floyd-protests-amazon-police-use-facial-recognition/5338536002/">tech</a> <a href="https://www.washingtonpost.com/technology/2020/06/11/microsoft-facial-recognition/">companies</a> have pulled facial recognition products from the market. <a href="https://edition.cnn.com/2020/06/25/tech/facial-recognition-legislation-markey/index.html">Democrats have advanced a bill</a> for a moratorium on facial recognition. The Association for Computing Machinery (ACM), a leading computer science organisation, <a href="https://www.acm.org/binaries/content/assets/public-policy/ustpc-facial-recognition-tech-statement.pdf">has also come out against the technology</a>. </p>
<p>Outside the US, however, the tide is heading in the other direction. China is deploying <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">facial recognition on a vast scale</a> in its social credit experiments, policing, and suppressing the Uighur population. It is also exporting facial recognition technology (and norms) to partner countries in the <a href="https://www.lowyinstitute.org/the-interpreter/belt-and-road-means-big-data-facial-recognition-too">Belt and Road initiative</a>. The UK High Court ruled its use by South Wales Police <a href="https://www.bbc.com/news/uk-wales-49565287">lawful</a> last September (though the decision is being appealed).</p>
<p>Here in Australia, despite <a href="https://humanrights.gov.au/about/news/media-releases/commission-calls-accountable-ai">pushback from the Human Rights Commission</a>, the trend is also towards greater use. The government proposed an ambitious plan for a <a href="https://www.itnews.com.au/news/three-states-complete-national-face-matching-database-upload-535352">national face database</a> (including wacky trial balloons about <a href="https://www.nytimes.com/2019/10/29/world/australia/pornography-facial-recognition.html">age-verification on porn sites</a>). Some local councils are <a href="https://www.abc.net.au/news/2020-06-17/facial-surveillance-slowly-being-trialled-around-the-country/12308282">adding facial recognition</a> into their existing surveillance systems. Police officers have <a href="https://www.abc.net.au/news/science/2020-04-14/clearview-ai-facial-recognition-tech-australian-federal-police/12146894">tried out the dystopian services of Clearview AI</a>. </p>
<p>Should Australia be using this technology? To decide, we need to answer fundamental questions about the kind of people, and the kind of society, we want to be.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-regulating-facial-recognition-technology-is-so-problematic-and-necessary-107284">Why regulating facial recognition technology is so problematic - and necessary</a>
</strong>
</em>
</p>
<hr>
<h2>From facial recognition to face surveillance</h2>
<p>Facial recognition has <a href="https://global-uploads.webflow.com/5e027ca188c99e3515b404b7/5ed1002058516c11edc66a14_FRTsPrimerMay2020.pdf">many uses</a>. </p>
<p>It can verify individual identity by comparing a target image with data held on file to confirm a match – this is “one-to-one” facial recognition. It can also compare a target image with a database of subjects of interest. That’s “one-to-many”. The most ambitious form is “all-to-all” matching. This would mean matching every image to a comprehensive database of every person in a given polity. </p>
<p>Each approach can be carried out asynchronously (on demand, after images are captured) or in real time. And they can be applied to separate (disaggregated) data streams, or used to bring together massive surveillance datasets. </p>
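<p>The difference between the first two modes can be stated precisely. In the sketch below, score() stands in for any face matcher and the threshold is an illustrative assumption: verification asks one question of one record, while identification ranks an entire database, so every extra enrolled face is another opportunity for a false positive.</p>
<pre><code>def verify(probe, claimed_id, gallery, score, threshold=0.6) -> bool:
    """One-to-one: compare the probe against a single enrolled record."""
    return score(probe, gallery[claimed_id]) >= threshold

def identify(probe, gallery, score, threshold=0.6):
    """One-to-many: rank every enrolled record and report the best,
    if it clears the threshold. The chance of a false positive grows
    with the size of the gallery."""
    best = max(gallery, key=lambda i: score(probe, gallery[i]))
    return best if score(probe, gallery[best]) >= threshold else None
</code></pre>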
<p>Facial recognition occurring at one end of each of these scales – one-to-one, asynchronous, disaggregated – has well-documented benefits. One-to-one real-time facial recognition can be convenient and relatively safe, like unlocking your phone, or proving your identity at an automated passport barrier. Asynchronous disaggregated one-to-many facial recognition can be useful for law enforcement – analysing CCTV footage to identify a suspect, for example, or finding victims and perpetrators in <a href="https://www.nytimes.com/2020/02/07/business/clearview-facial-recognition-child-sexual-abuse.html">child abuse videos</a>.</p>
<p>However, facial recognition at the other end of these scales – one-to-many or all-to-all, real-time, integrated – amounts to face surveillance, which has less obvious benefits. Several police forces in the UK have trialled real-time one-to-many facial recognition to seek persons of interest, <a href="https://www.ft.com/content/f4779de6-b1e0-11e9-bec9-fdcab53d6959">with mixed results</a>. The benefits of integrated real-time all-to-all face surveillance in China are yet to be seen.</p>
<p>And while the benefits of face surveillance are dubious, it risks fundamentally changing the kind of society we live in.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=403&fit=crop&dpr=1 600w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=403&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=403&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=506&fit=crop&dpr=1 754w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=506&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=506&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Real-time facial recognition applied to crowds amounts to face surveillance.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Face surveillance often goes wrong, but it’s bad even when it works</h2>
<p>Most facial recognition algorithms are accurate with head-on, well-lit portraits, but underperform with “faces in the wild”. They are also <a href="https://dam-prod.media.mit.edu/x/2019/01/24/AIES-19_paper_223.pdf">worse at identifying black faces</a>, and <a href="http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">especially the faces of black women</a>. </p>
<p>The errors tend to be false positives – making incorrect matches, rather than missing correct ones. If face surveillance were used to dole out cash prizes, this would be fine. But a match is almost always used to target interventions (such as arrests) that harm those identified.</p>
<p>More false positives for minority populations means they bear the costs of face surveillance, while any benefits are likely to accrue to majority populations. So using these systems will <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist/">amplify the structural injustices</a> of the societies that produce them.</p>
<p>Even when it works, face surveillance is still harmful. Knowing where people are and what they are doing enables you to predict and control their behaviour.</p>
<p>You might believe the Australian government wouldn’t use this power against us, but the very fact they have it makes us less free. Freedom isn’t only about making it <em>unlikely</em> others will interfere with you. It’s about making it <a href="https://www.cambridge.org/core/books/on-the-peoples-terms/219DF8F7F166B305318CD9D51FAC45DE">impossible</a> for them to do so. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability-132667">Australian police are using the Clearview AI facial recognition system with no accountability</a>
</strong>
</em>
</p>
<hr>
<h2>Face surveillance is intrinsically wrong</h2>
<p>Face surveillance relies on the idea that others are entitled to extract biometric data from you without your consent when you are in public. </p>
<p>This is false. We have a right to control our own biometric data. This is what is called an underived right, like the right to control your own body.</p>
<p>Of course, rights have limits. You can lose the protection of a right – someone who robs a servo may lose their right to anonymity – or the right may be overridden, if necessary, for a good enough cause.</p>
<p>But the great majority of us have committed no crime that would make us lose the right to control our biometric data. And the possible benefits of using face surveillance on any particular occasion must be discounted by their probability of occurring. Certain rights violations are unlikely to be overridden by hypothetical benefits.</p>
<p><a href="https://openreview.net/forum?id=s-e2zaAlG3I">Many prominent algorithms</a> used for face surveillance were also developed in morally compromised ways. They used datasets containing images used without permission of the rightful owners, as well as harmful images and deeply objectionable labels.</p>
<h2>Arguments for face surveillance don’t hold up</h2>
<p>There will of course be counterarguments, but none of them hold up.</p>
<p><em>You’ve already given up your privacy to Apple or Google – why begrudge police the same kind of information?</em> Just because we have sleepwalked into a surveillance society doesn’t mean we should refuse to wake up. </p>
<p><em>Human surveillance is more biased and error-prone than algorithmic surveillance.</em> Human surveillance is indeed morally problematic. Vast networks of CCTV cameras already compromise our civil liberties. Weaponizing them with software that enables people to be tracked across multiple sites only makes them worse. </p>
<p><em>We can always keep a human in the loop.</em> False positive rates can be reduced by human oversight, but human oversight of automated systems is itself <a href="https://doi.org/10.1016/0005-1098(83)90046-8">flawed</a> and <a href="https://arstechnica.com/tech-policy/2019/09/algorithms-should-have-made-courts-more-fair-what-went-wrong/">biased</a>, and this doesn’t address the other objections against face surveillance. </p>
<p><em>Technology is neither good nor bad in itself; it’s just a tool that can be used for good or bad ends.</em> Every tool makes <a href="https://mitpress.mit.edu/books/how-artifacts-afford">some things easier and some things harder</a>. Facial recognition makes it easier to oppress vulnerable populations and violate everyone’s basic rights.</p>
<h2>It’s time for a moratorium</h2>
<p>Face surveillance is based on morally compromised research, violates our rights, is harmful, and exacerbates structural injustice, both when it works and when it fails. Its adoption harms individuals, and makes our society as a whole more unjust, and less free. </p>
<p>A moratorium on its use in Australia is the least we should demand.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Seth Lazar, Professor, Australian National University; Claire Benn, Research Fellow, Humanising Machine Intelligence Grand Challenge, Australian National University; Mario Günther, Research Fellow, Humanising Machine Intelligence Grand Challenge, Australian National University. Licensed as Creative Commons – attribution, no derivatives.</em></p>