<h1>Face recognition technology follows a long analog history of surveillance and control based on identifying physical features</h1>
<figure><img src="https://images.theconversation.com/files/569962/original/file-20240117-29-ri412u.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5272%2C3598&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Today's technology advances what passport control has been doing for more than a century.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/controll-of-passports-at-the-frontiers-between-beuthen-and-news-photo/548866047">ullstein bild via Getty Images</a></span></figcaption></figure>
<p>American Amara Majeed was <a href="https://www.bbc.com/news/world-asia-48061811">accused of terrorism</a> by the Sri Lankan police in 2019. Robert Williams was <a href="https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html">arrested outside his house</a> in Detroit and detained in jail for 18 hours for allegedly stealing watches in 2020. Randal Reid <a href="https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html">spent six days in jail</a> in 2022 for supposedly using stolen credit cards in a state he’d never even visited.</p>
<p>In all three cases, the authorities had the wrong people. In all three, it was face recognition technology that told them they were right. Law enforcement officers in many U.S. states are <a href="https://www.wired.com/story/hidden-role-facial-recognition-tech-arrests/">not required to reveal</a> that they used face recognition technology to identify suspects.</p>
<p>Face recognition technology is the latest and most sophisticated version of <a href="https://www.dhs.gov/biometrics">biometric surveillance</a>: using unique physical characteristics to identify individual people. It stands in a <a href="https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/inspired/history-of-biometric-authentication">long line of technologies</a> – from the fingerprint to the passport photo to iris scans – designed to monitor people and determine who has the right to move freely within and across borders and boundaries.</p>
<p>In my book, “<a href="https://www.press.jhu.edu/books/title/12700/do-i-know-you">Do I Know You? From Face Blindness to Super Recognition</a>,” I explore how the story of face surveillance lies not just in the history of computing but in the history of medicine, of race, of psychology and neuroscience, and in the health humanities and politics.</p>
<p>Viewed as a part of the long history of people-tracking, face recognition technology’s incursions into privacy and limitations on free movement are carrying out exactly what biometric surveillance was always meant to do.</p>
<p>The system works by converting captured faces – either static from photographs or moving from video – into a series of unique data points, which it then compares against the data points drawn from images of faces already in the system. As face recognition technology improves in accuracy and speed, its effectiveness as a means of surveillance becomes ever more pronounced.</p>
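<p>The matching step described above can be sketched in a few lines of code. The snippet below is an illustrative toy, not any vendor’s implementation: it assumes faces have already been converted into numeric embedding vectors, and declares a match when the cosine similarity between a captured face and an enrolled one clears a threshold. The gallery entries, probe vector and threshold value are all hypothetical.</p>

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return the enrolled identity closest to the probe embedding,
    or None if no score clears the decision threshold."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None

# Toy 4-dimensional "embeddings"; real systems use hundreds of dimensions.
gallery = {
    "alice": np.array([0.9, 0.1, 0.3, 0.2]),
    "bob":   np.array([0.1, 0.8, 0.2, 0.4]),
}
probe = np.array([0.85, 0.15, 0.28, 0.22])  # a new capture, close to "alice"
match = best_match(probe, gallery)          # an unknown face would yield None
```

<p>Raising the threshold trades false positives for false negatives – the tuning at the heart of the misidentification cases described above.</p>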
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="faces in a crowd highlighted and annotated with dates and times" src="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569964/original/file-20240117-15-h4ovvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Paired with AI, face recognition technology scans the crowd at a conference.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/live-demonstration-uses-artificial-intelligence-and-facial-news-photo/1080200068">David McNew/AFP via Getty Images</a></span>
</figcaption>
</figure>
<h2>Accuracy improves, but biases persist</h2>
<p>Surveillance is predicated on the idea that <a href="https://theconversation.com/surveillance-is-pervasive-yes-you-are-being-watched-even-if-no-one-is-looking-for-you-187139">people need to be tracked</a> and their movements limited and controlled in a trade-off between privacy and security. The assumption that less privacy leads to more security is built in.</p>
<p>That may be the case for some, but not for the people disproportionately targeted by face recognition technology. <a href="https://www.routledge.com/Histories-of-Surveillance-from-Antiquity-to-the-Digital-Era-The-Eyes-and/Marklund-Skouvig/p/book/9781032021539">Surveillance has always been designed</a> to identify the people whom those in power wish to most closely track.</p>
<p>On a global scale, <a href="https://doi.org/10.1080/21670811.2018.1493938">there are</a> <a href="https://longreads.tni.org/stateofpower/settled-habits-new-tricks-casteist-policing-meets-big-tech-in-india">caste cameras in India</a>, <a href="https://www.theguardian.com/world/2021/sep/30/uyghur-tribunal-testimony-surveillance-china">face surveillance of Uyghurs in China</a> and even <a href="https://mynbc15.com/news/spotlight-on-america/facial-recognition-technology-in-school-hallways-states-face-a-divisive-debate">attendance surveillance</a> <a href="https://dx.doi.org/10.7302/21934">in U.S. schools</a>, often with low-income and majority-Black populations. <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist">Some people are tracked more closely</a> than others.</p>
<p>In addition, the cases of Amara Majeed, Robert Williams and Randal Reid <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist">aren’t anomalies</a>. As of 2019, face recognition technology <a href="https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf">misidentified Black and Asian people</a> at up to <a href="https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software">100 times the rate of white people</a>, including, in 2018, a disproportionate number of the <a href="https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28">28 members of the U.S. Congress</a> who were falsely matched with mug shots on file using Amazon’s Rekognition tool.</p>
<p>When the database against which captured images were compared had only a limited number of mostly white faces upon which to draw, face recognition technology would offer matches based on the closest alignment available, leading to a pattern of highly racialized – and racist – false positives.</p>
<p>With the expansion of images in the database and increased sophistication of the software, <a href="https://www.csis.org/blogs/strategic-technologies-blog/how-accurate-are-facial-recognition-systems-and-why-does-it">the number of false positives</a> – incorrect matches between specific individuals and images of wanted people on file – has <a href="https://bipartisanpolicy.org/blog/frt-accuracy-performance/">declined dramatically</a>. Improvements in pixelation and mapping static images into moving ones, along with increased social media tagging and <a href="https://www.penguinrandomhouse.com/books/691288/your-face-belongs-to-us-by-kashmir-hill/">ever more sophisticated scraping tools</a> like those developed by Clearview AI, have helped decrease the error rates.</p>
<p><a href="https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/">The biases</a>, however, remain deeply embedded into the systems and their purpose, explicitly or implicitly targeting already targeted communities. The technology is not neutral, nor is the surveillance it is used to carry out.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Pen and ink illustration of suited hands using calipers to measure a man's forehead to back of his head" src="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=458&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=458&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=458&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=576&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=576&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569966/original/file-20240117-21-awurl6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=576&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Physiognomy went beyond recognition of an individual and tried to connect physical features with other characteristics.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/head-royalty-free-illustration/1399373778">clu/DigitalVision Vectors via Getty Images</a></span>
</figcaption>
</figure>
<h2>Latest technique in a long history</h2>
<p>Face recognition software is only the most recent manifestation of global systems of tracking and sorting. Precursors are rooted in the now-debunked belief that bodily features offer a unique index to character and identity. This pseudoscience was formalized in the late 18th century under the rubric of the <a href="https://www.hup.harvard.edu/books/9780674036048">ancient practice of physiognomy</a>.</p>
<p>Early systemic applications included anthropometry (body measurement), fingerprinting and iris or retinal scans. They all offered unique identifiers. None of these could be done without the participation – willing or otherwise – of the person being tracked.</p>
<p>The framework of bodily identification was adopted in the 19th century for use in criminal justice detection, prosecution and record-keeping, allowing governments to control their populaces. The intimate relationship between face recognition and border patrol was galvanized by the <a href="http://www.atlasobscura.com/articles/passport-photos-history-development-regulation-mugshots">introduction of photos into passports</a> in some countries, including Great Britain and the United States, in 1914, <a href="https://doi.org/10.1017/9781108664271">a practice that became widespread by 1920</a>.</p>
<p>Face recognition technology provided a way to go stealth on human biometric surveillance. Much early research into face recognition software was <a href="https://www.wired.com/story/secret-history-facial-recognition/">funded by the CIA</a> for the purposes of border surveillance.</p>
<p>This research sought to develop a standardized framework for face segmentation: mapping the distances between a person’s facial features, including eyes, nose, mouth and hairline. Inputting that data into computers let a user search stored photographs for a match. These early scans and maps were limited, and attempts to match them were not successful.</p>
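<p>In spirit, that early approach amounted to a nearest-neighbor search over hand-measured distance vectors, as in this hypothetical sketch (the measurements and photo labels are invented for illustration):</p>

```python
import math

def feature_distance(a, b):
    """Euclidean distance between two hand-measured feature vectors
    (e.g., eye spacing, nose-to-mouth distance, hairline height)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical normalized measurements keyed by stored photograph.
stored = {
    "photo_17": [2.1, 1.4, 3.0, 1.9],
    "photo_42": [2.6, 1.1, 2.7, 2.3],
}
query = [2.15, 1.38, 3.02, 1.88]  # measurements taken from a new photo
closest = min(stored, key=lambda k: feature_distance(query, stored[k]))
```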
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Woman looks at screen with her image on a vending machine" src="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/569967/original/file-20240117-23-u3alzk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A customer pays via facial recognition at a smart store in China.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/nov-6-2018-a-visitor-tries-facial-recognition-payment-in-a-news-photo/1058496364">Huang Zongzhi/Xinhua News Agency via Getty Images</a></span>
</figcaption>
</figure>
<p>More recently, private companies have <a href="https://fortune.com/longform/facial-recognition/">adopted data harvesting techniques</a>, including face recognition, as part of a long practice of <a href="https://theconversation.com/data-brokers-know-everything-about-you-what-ftc-case-against-ad-tech-giant-kochava-reveals-218232">leveraging personal data for profit</a>.</p>
<p>Face recognition technology works not only to unlock your phone or help you board your plane more quickly, but also in promotional store kiosks and, essentially, in any photo taken and shared by anyone, with anyone, anywhere around the world. These photos are stored in a database, creating ever more comprehensive systems of surveillance and tracking.</p>
<p>And while that means that today it is unlikely that Amara Majeed, Robert Williams, Randal Reid and Black members of Congress would be ensnared by a false positive, face recognition technology has invaded everyone’s privacy. It – and the governmental and private systems that design, run, use and capitalize upon it – is watching, and paying particular attention to those whom society and its structural biases deem to be the greatest risk.</p>
<p class="fine-print"><em><span>Sharrona Pearl receives funding from Interfaith America.</span></em></p>
<p class="fine-print"><em>Sharrona Pearl, Associate Professor of Bioethics and History, Drexel University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Your car might be watching you to keep you safe − at the expense of your privacy</h1>
<figure><img src="https://images.theconversation.com/files/563468/original/file-20231204-15-ei72ki.png?ixlib=rb-1.1.0&rect=0%2C0%2C1273%2C714&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Many modern cars watch occupants -- a plus for safety but not so much for privacy.</span> <span class="attribution"><a class="source" href="https://www.lgnewsroom.com/2021/08/how-lgs-enhanced-in-vehicle-cabin-camera-makes-driving-and-riding-safer/">Courtesy LG</a></span></figcaption></figure>
<p>Depending on which late-model vehicle you own, your car <a href="https://www.consumerreports.org/cars/car-safety/driver-monitoring-systems-ford-gm-earn-points-in-cr-tests-a6530426322/">might be watching you</a> – literally and figuratively – as you drive down the road. It’s watching you with cameras that monitor the cabin and track where you’re looking, and with sensors that track your speed, lane position and rate of acceleration.</p>
<p>Your car uses this data to make your ride safe, comfortable and convenient. For example, the cameras <a href="https://www.wired.com/story/cars-that-watch-their-drivers-could-re-teach-the-world-to-drive/">can tell when you’ve been distracted</a> and need to bring your attention back to the road. They can also <a href="https://mycardoeswhat.org/safety-features/high-speed-alert/">identify when you are speeding</a> by verifying the speed limit from your GPS position or traffic signs along the road and warn you to slow down. Some carmakers are also beginning to incorporate similar features for convenience, such as unlocking your car by <a href="https://www.popsci.com/technology/genesis-gv60-facial-recognition/">scanning your face</a> <a href="https://www.techradar.com/news/fingerprint-scanners-are-now-being-used-to-unlock-and-start-your-car">or fingerprint</a>. Your car may also transmit some of this data to the manufacturer’s data centers, where the company uses it to improve your driving experience or provide you with personalized services.</p>
<p>In addition to providing these benefits, this data collection is a potential privacy nightmare. The information can reveal your identity, your habits when you’re in your car, how safely you drive, where you’ve been and where you regularly go. A report by the Mozilla Foundation, a nonprofit technology research and advocacy organization, found that <a href="https://foundation.mozilla.org/en/privacynotincluded/articles/its-official-cars-are-the-worst-product-category-we-have-ever-reviewed-for-privacy/">carmakers’ privacy policies are exceedingly lax</a>. The study identified cars as the “worst category of products for privacy that we have ever reviewed.” U.S. Sen. Ed Markey wrote a <a href="https://www.markey.senate.gov/imo/media/doc/senator_markey_letter_to_automakers_on_privacy.pdf">letter to U.S. automakers</a> on Nov. 30, 2023, asking a lengthy set of questions about their data practices.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/XKQ-uxTw11g?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Cars collect a lot of information about drivers and passengers.</span></figcaption>
</figure>
<p>Today’s smart cars present drivers with a trade-off between convenience and privacy, assuming drivers have the option of improving the data privacy of their cars. As a <a href="https://dblp.org/pid/172/0864.html">computer scientist who studies cybersecurity and resilience in transportation</a>, I see several technological routes to getting the best of both worlds: cars that make use of this collected data while also preserving users’ privacy.</p>
<h2>Driver data</h2>
<p>Today’s cars use a wide range of sensors to understand the environment, analyze the data and ensure the safety of passengers. For instance, cars are equipped with sensors that measure brake pedal position, vehicle speed, driver’s movements, surrounding vehicles and even traffic lights. The collected data is transmitted to the car’s electronic control units, the computers that operate the car’s many systems.</p>
<p>There are two types of sensors that <a href="https://doi.org/10.1016/j.jsr.2009.04.005">continuously monitor and predict a driver’s drowsiness</a>. The first is vehicle status monitoring sensors, such as lane detection and steering wheel position tracking. This data is not directly tied to a specific person and can be treated as non-personally identifiable information unless it is correlated with other data that identifies the driver.</p>
<p>The second type of sensors tracks drivers themselves. This category includes things like cameras to <a href="https://doi.org/10.1007/s11768-010-8043-0">track the driver’s eye movements to predict fatigue</a>. This second group of sensors is directly related to the driver’s privacy because they collect personally identifiable information, such as the driver’s face.</p>
<h2>Protecting privacy</h2>
<p>There is a trade-off between the quality of the driving experience and the privacy of drivers, depending on the level of services and features. Some drivers may prefer to share their biometric data to facilitate accessing a car’s functions and automating a major part of their driving experience. Others may prefer to manually control the car’s systems, sharing less personally identifiable information or none at all.</p>
<p>At first glance, it seems the trade-off of privacy and driver comfort cannot be avoided. Car manufacturers tend to take measures to <a href="https://news.fiu.edu/2023/how-ai-will-protect-your-car-and-your-privacy">protect drivers’ data against data thieves</a>, but they collect a lot of data themselves. And as the Mozilla Foundation report showed, most car companies reserve the right to sell your data. Researchers are working on developing data analytics tools that better protect privacy and make progress on eliminating the trade-off.</p>
<p>For instance, over the past seven years, the concept of <a href="https://doi.org/10.48550/arXiv.1602.05629">federated machine learning</a> has attracted attention because it allows algorithms to learn from the data on your local device without copying the data to a central server. For example, Google’s Gboard keyboard uses federated learning to better guess the next word you’re likely to type <a href="https://support.google.com/gboard/answer/12373137?hl=en#zippy=%2Cfederated-learning">without sharing your private data with a server</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/zqv1eELa7fs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Federated learning is a technique for training AI models that keeps people’s data private.</span></figcaption>
</figure>
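<p>The mechanism can be illustrated with a minimal sketch of federated averaging on synthetic data – not Google’s or any carmaker’s pipeline. Each simulated “client” fits a small linear model on data that never leaves it and shares only its model weights, which the “server” averages; all names and numbers below are invented.</p>

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One client's training pass on its own data (plain linear regression).
    Only the resulting weights leave the device -- never X or y."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w = w - lr * grad
    return w

def federated_average(weights, client_datasets):
    """Server step: average the clients' weights, not their raw data."""
    updates = [local_update(weights, X, y) for X, y in client_datasets]
    return np.mean(updates, axis=0)

# Three simulated "cars," each holding private data generated from the
# same underlying relationship y = X @ true_w.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(10):  # ten communication rounds
    w = federated_average(w, clients)
```

<p>After a few rounds the shared model recovers the underlying weights even though the server never sees any client’s data.</p>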
<p>Research led by Ervin Moore, a Ph.D. student at Florida International University’s <a href="https://solidlab.network">Sustainability, Optimization, and Learning for InterDependent Networks laboratory</a>, and published in the IEEE Internet of Things Journal explored the idea of using <a href="https://doi.org/10.1109/JIOT.2023.3313055">blockchain-based federated machine learning</a> to improve the privacy and security of users and their sensitive data. The technique could be used to protect drivers’ data. There are other techniques to preserve privacy as well, such as <a href="https://doi.org/10.1007/978-3-540-73538-0_4">location obfuscation</a>, which alters the user’s location data to prevent their location from being revealed.</p>
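<p>Location obfuscation can be illustrated with a simple random-perturbation sketch – a generic illustration, not the specific method of the cited work. The reported position is shifted by a random offset within a fixed radius, so coarse services such as weather or traffic still work while the exact location stays hidden; the coordinates and radius below are hypothetical.</p>

```python
import math
import random

def obfuscate_location(lat, lon, radius_m=500.0):
    """Return coordinates displaced by a random offset of at most
    radius_m meters, sampled uniformly over the surrounding disk."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    dist = radius_m * math.sqrt(random.random())  # uniform over the disk
    dlat = (dist * math.cos(angle)) / 111_320     # meters per degree of latitude
    dlon = (dist * math.sin(angle)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

random.seed(7)  # deterministic for the example
true_lat, true_lon = 52.5200, 13.4050  # hypothetical position
fuzzy_lat, fuzzy_lon = obfuscate_location(true_lat, true_lon)
```

<p>The obfuscated point is guaranteed to lie within the chosen radius of the true one, bounding both the utility loss and the privacy gain.</p>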
<p>While there is still a trade-off between user privacy and quality of service, privacy-preserving data analytics techniques could pave the way for using data without leaking drivers’ and passengers’ personally identifiable information. This way, drivers could benefit from a wide range of modern cars’ services and features without paying the high cost of lost privacy.</p>
<p class="fine-print"><em><span>M. Hadi Amini receives funding for researching privacy and security of transportation systems from U.S. Department of Transportation. Opinions expressed represent the author's personal or professional opinions and do not represent or reflect the position of Florida International University.
His work on transportation system cybersecurity is in part supported by the National Center for Transportation Cybersecurity and Resiliency (TraCR). Any opinions, findings, conclusions, and recommendations expressed in this material are those of the author and do not necessarily reflect the views of TraCR or the U.S. Government generally.</span></em></p>
<p class="fine-print"><em>M. Hadi Amini, Assistant Professor of Computing and Information Sciences, Florida International University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Worldcoin is scanning eyeballs to build a global ID and finance system. Governments are not impressed</h1>
<figure><img src="https://images.theconversation.com/files/541601/original/file-20230808-25-mlnz26.jpeg?ixlib=rb-1.1.0&rect=0%2C0%2C2560%2C1708&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Worldcoin</span></span></figcaption></figure><p>Millions of people worldwide are lining up to stare into a silver sphere about the size of a bowling ball so their irises can be scanned in exchange for online identity verification and “free” cryptocurrency. </p>
<p>The silver spheres, known as “Orbs”, are part of the <a href="https://www.technologyreview.com/2023/08/07/1077250/worldcoin-officially-launched-why-its-being-investigated/">Worldcoin platform</a>, which officially launched in July 2023 after an 18-month testing phase. Led by Sam Altman (chief executive of OpenAI, the company behind ChatGPT) and entrepreneur Alex Blania, Worldcoin offers users a “digital passport” known as World ID and small allocations of a cryptocurrency token also called Worldcoin (WLD), “<a href="https://worldcoin.org/cofounder-letter">simply for being human</a>”. </p>
<p>Worldcoin aims to provide a “<a href="https://worldcoin.org/blog/worldcoin/proof-of-personhood-what-it-is-why-its-needed">proof of personhood</a>” to distinguish humans from artificial intelligence (AI) systems online. </p>
<p>However, critics say the company is essentially bribing people to hand over highly sensitive biometric data. Governments are taking note: the Worldcoin platform has already been suspended in Kenya, and is under investigation in several other countries.</p>
<h2>Gaze into the Orb</h2>
<p>Users can download the World App on their mobile phone, then find their “nearest Orb”. The Orb uses iris scans to uniquely identify a person.</p>
<p>Once the person has their iris scanned, they receive a World ID which will function as an online ID much like a Google or Facebook login. World ID is meant to be different because it can prove the user is human – and more private, because it does not link to other personal information about the user. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/541648/original/file-20230808-15-37xtfa.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Worldcoin says an iris scan can be used as ‘proof of personhood’.</span>
<span class="attribution"><span class="source">Worldcoin</span></span>
</figcaption>
</figure>
<p>Despite the “digital passport” label, World ID is not intended to reveal or verify a user’s identity in the conventional sense. It merely establishes the user as “a unique and real person”, rather than a bot. </p>
<p>In most countries, the user is also entitled to units of WLD cryptocurrency once their iris scan is complete.</p>
<p>The Worldcoin website currently lists <a href="https://worldcoin.org/find-orb">60 Orb locations</a> worldwide, particularly in Europe, Asia, North America and South America, and notes there will also be Orb “pop-ups”. </p>
<p>At the time of writing, there appear to be no Orb locations in Australia, so people in Australia cannot earn WLD tokens “for being human”. But they can purchase the WLD cryptocurrency via certain cryptocurrency exchanges and download the World App, which also functions as a cryptocurrency wallet. </p>
<h2>Cash for eyeballs jeopardises human rights</h2>
<p>Altman is a key player in the AI boom that supposedly makes Worldcoin necessary, so critics have <a href="https://www.thestreet.com/cryptocurrency/worldcoin-sam-altman-ai-biometric-data-collection-outlandish-bribe">suggested</a> he is “simply profiting from both AI’s problem and solution”. </p>
<p>When the Worldcoin platform officially launched, after signing up some 2 million users in a testing phase, Altman said the Orbs were scanning a <a href="https://www.cryptopolitan.com/sam-altman-claims-worldcoin-onboarding-1-user-every-8-seconds-despite-skepticism-and-waning-interest/">new user every eight seconds</a>. </p>
<p>In Kenya, the launch saw “tens of thousands of individuals waiting in lines over a three-day period to secure a World ID”, which Worldcoin attributed to <a href="https://time.com/6300522/worldcoin-sam-altman/">“overwhelming” demand</a> for identity verification. </p>
<p>Independent reporting suggests the promise of “free” cryptocurrency was a more common motive. In most locations, Worldcoin offers a “<a href="https://www.coindesk.com/business/2023/07/24/worldcoin-release-tokenomics-report-geofenced-for-some-countries/">genesis grant</a>” of 25 units of its WLD cryptocurrency when users scan their irises. (The value of WLD fluctuates, but the grant has been worth around US$50, or A$75, over the past month.)</p>
<p>People queuing for the Orb in Kenya <a href="https://www.bbc.com/news/world-africa-66383325">told the BBC</a> “I want to register because I’m jobless and I’m broke,” and</p>
<blockquote>
<p>I really like Worldcoin because of the money. I’m not worried about the data. As long as the money comes.</p>
</blockquote>
<p>Orb operators are also <a href="https://worldcoin.org/be-a-worldcoin-operator">paid for each user they sign up</a>.</p>
<p>Critics have labelled this strategy of paying people to scan their irises <a href="https://www.thestreet.com/cryptocurrency/worldcoin-sam-altman-ai-biometric-data-collection-outlandish-bribe">dystopian and equivalent to bribery</a>. </p>
<p>Offering money for sensitive data arguably makes privacy – a human right – a luxury only the wealthy can afford. People experiencing poverty may risk future harms to meet their immediate survival needs. </p>
<h2>‘Cataloguing eyeballs’: the risks of using biometric data</h2>
<p>Worldcoin uses irises for verification because every iris is unique and therefore difficult to fake. But the risks of handing over such data are very high. Unlike a driver’s licence or a passport, you cannot replace your iris if the data is compromised. </p>
<p>Surveillance whistleblower Edward Snowden has criticised Worldcoin for “<a href="https://twitter.com/Snowden/status/1451990496537088000">cataloguing eyeballs</a>”, and <a href="https://twitter.com/Snowden/status/1451993036196618251?ref_src=twsrc%5Etfw">tweeted</a> about the unacceptable risks: </p>
<blockquote>
<p>Don’t use biometrics for anything. […] The human body is not a ticket-punch.</p>
</blockquote>
<p>Worldcoin claims the iris scans are deleted after being converted into a unique iris code, which becomes the user’s World ID. The World ID is then stored on a decentralised blockchain, with the aim of preventing fakes or duplicates.</p>
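The general idea behind an irreversible iris code can be illustrated with a one-way cryptographic hash. Worldcoin’s actual iris-coding algorithm is proprietary and certainly more sophisticated than this (real biometric matching must tolerate noisy captures, which a plain hash cannot), so the following Python is only a simplified sketch of the principle:

```python
import hashlib

def make_iris_code(iris_template: bytes) -> str:
    """Reduce a raw biometric template to an irreversible identifier.

    A one-way hash means the raw scan cannot be reconstructed from the
    stored code, but the same template always maps to the same ID, so
    duplicate sign-ups can be detected.
    """
    return hashlib.sha256(iris_template).hexdigest()

# The same template always yields the same ID (enabling duplicate detection)
assert make_iris_code(b"template-A") == make_iris_code(b"template-A")
# ... while different templates yield different IDs.
assert make_iris_code(b"template-A") != make_iris_code(b"template-B")
```

Because any change to the input completely changes the output, the stored code reveals nothing about the original scan, yet re-processing the same template reproduces the same ID.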
<p>However, the iris scan is only deleted <em>if</em> the user opts for the “Without Data Storage” option (which may mean they need to return to an Orb to re-verify in the future). If the user selects the “<a href="https://worldcoin.org/privacy">With Data Storage</a>” option, Worldcoin states the iris scan is sent via encrypted communication channels to its distributed data stores where it is encrypted at rest.</p>
<p>In either case, the user must <a href="https://www.technologyreview.com/2023/08/07/1077250/worldcoin-officially-launched-why-its-being-investigated/">simply trust</a> the company to delete the biometric data, or appropriately secure it against misuse. </p>
<p>There have been many instances in which Silicon Valley companies have promised to secure data and to strictly limit its use, only to <a href="https://edition.cnn.com/2023/05/03/tech/ftc-meta-younger-users/index.html">break those promises</a> by disclosing the data to other companies or government agencies or failing to secure it against attack.</p>
<p>Journalist Eileen Guo also points out that Worldcoin has not yet clarified <a href="https://www.technologyreview.com/2023/08/07/1077250/worldcoin-officially-launched-why-its-being-investigated/">whether it still uses stored biometric data to train AI models</a> and whether it has deleted biometric data collected during its test phase.</p>
<p>And despite the supposed security of biometric scanning, there have already been reports of fraudulent uses of the Worldcoin system. For example, <a href="https://twitter.com/BlockBeatsAsia/status/1659060950748782594">black market speculators</a> are alleged to have persuaded people in Cambodia and Kenya to sign up for Worldcoin and then sell their World IDs and WLD tokens for cash. </p>
<h2>Regulatory action</h2>
<p>Regulators in several countries are taking action. The Kenyan government has now suspended Worldcoin’s activities, stating regulatory concerns surrounding the project “require urgent action”. </p>
<p>The Communications Authority of Kenya and Office of the Data Protection Commissioner say they are concerned about the offer of money in exchange for consent to data collection; how securely the data are stored; and “<a href="https://www.ca.go.ke/index.php/ca-and-data-commissioner-warn-kenyans-over-worldcoin">massive citizen data in the hands of private actors without an appropriate framework</a>”. </p>
<p>The <a href="https://www.reuters.com/technology/frances-privacy-watchdog-says-worldcoin-legality-seems-questionable-2023-07-28/">German privacy watchdog</a> is investigating Worldcoin’s business practices with support from the French privacy regulator, which called Worldcoin’s data practices “questionable”. The <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/07/ico-statement-on-worldcoin/">UK Information Commissioner’s Office</a> has announced it will investigate Worldcoin, referring to the high risk of processing special category biometric data.</p>
<p>While there are no Orbs in Australia yet, the federal privacy regulator has previously found some companies in <a href="http://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/AICmr/2021/50.html?context=1;query=20initiated20into22;mask_path=">breach of the privacy law</a> for failing to obtain valid consent for the use of biometric data and collecting it when it was not reasonably necessary.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from the UNSW Allens Hub for Technology, Law and Innovation. She is a Member of the Expert Panel of the Consumer Policy Research Centre, and the Australian Privacy Foundation.</span></em></p>Worldcoin wants to provide ‘proof of personhood’ in an AI-filled future, but critics and governments are unimpressedKatharine Kemp, Associate Professor, Faculty of Law & Justice, and Deputy Director, Allens Hub for Technology, Law & Innovation, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2081662023-07-18T12:30:07Z2023-07-18T12:30:07ZRegistering refugees using personal information has become the norm – but cybersecurity breaches pose risks to people giving sensitive biometric data<figure><img src="https://images.theconversation.com/files/537876/original/file-20230717-228004-jmszuz.jpg?ixlib=rb-1.1.0&rect=260%2C116%2C5452%2C3871&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A refugee from the Democratic Republic of Congo registers his fingerprints on a biometric machine in Uganda in 2022. </span> <span class="attribution"><a class="source" href="https://media.gettyimages.com/id/1241304591/photo/uganda-drcongo-conflict-refugees.jpg?s=612x612&w=gi&k=20&c=GIveI_8HesUbGVDN5Hf6zrzolPVHEsfdeTOn98hP4qM=">Badru Katumba/AFP via Getty Images</a></span></figcaption></figure><p>The <a href="https://www.politico.eu/article/worldwide-refugees-reach-all-time-high/">number of refugees worldwide</a> reached record high levels in 2022. More than 108.4 million people have been forced to flee their homes because of violence or persecution. Meanwhile, <a href="https://hir.harvard.edu/new-technologies-that-monitor-displaced-persons/">governments and aid agencies are increasingly using</a> a <a href="https://www.digital-adoption.com/what-is-digital-technology/">controversial method</a> of effectively identifying and tracking many refugees. </p>
<p>This method, known as biometrics, involves collecting someone’s physical or behavioral characteristics, ranging from fingerprints to voice. Organizations that collect the personal physical data can store it to instantly recognize someone after scanning their fingerprints or irises, for example. </p>
<p>The United Nations refugee agency, often known as UNHCR, is among the groups that have grown their <a href="https://www.unhcr.org/blogs/unhcrs-biometric-tools-in-2023/#:%7E:text=UNHCR%2C%20the%20UN%20Refugee%20Agency,in%20countries%20across%20the%20world">biometrics programs</a> over the past several years to <a href="https://foreignpolicy.com/2020/09/02/big-brother-turns-its-eye-on-refugees/">help identify refugees</a> and deliver lifesaving aid and other services. </p>
<p>As a <a href="https://scholar.google.com/citations?user=ITS9Jk4AAAAJ&hl=en&oi=ao">cybersecurity scholar</a>, I think it is important to understand that while identifying people using biometrics might be convenient for organizations collecting the data, the practice comes with inherent privacy risks that can threaten vulnerable people’s safety. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A cellphone showing a woman's face is held up near the same woman's face." src="https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/537873/original/file-20230717-200504-rtj5r8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An election official in Afghanistan scans a voter’s face with a biometric device at a polling center in 2018.</span>
<span class="attribution"><a class="source" href="https://media.gettyimages.com/id/1052593356/photo/an-afghan-independent-election-commission-official-scans-a-voters-face-with-a-biometric.jpg?s=612x612&w=0&k=20&c=52TDw37mZutBK1lSY-OBeiT1MXwqLT_5Jwre4gM15wI=">Hoshang Hashimi/AFP via Getty Images</a></span>
</figcaption>
</figure>
<h2>How it works</h2>
<p>The biometrics data-gathering process begins with <a href="https://iow.eui.eu/wp-content/uploads/sites/18/2013/04/07-Rijpma-Background4-Refugees-and-Biometrics.pdf">enrollment, which involves</a> representatives from a government or organization collecting someone’s personal physical information when they are first registered in the system.</p>
<p>Many people also routinely use biometrics for personal reasons, like recording their own fingerprints so they can unlock and use their phone.</p>
<p>Organizations can use this kind of personal biometric information to authenticate a person’s identity – meaning, confirming that a person is who they say they are. Or they can use it to identify someone – that is, to determine who a person is among everyone in a database. </p>
<p>Authentication works by comparing a person’s previously captured images or recordings – their biometrics – with their recently collected biometrics information.</p>
<p>Identification, on the other hand, compares a person’s recently collected biometrics against all other people’s templates stored in a biometrics database. </p>
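The difference between the two modes can be sketched in a few lines of Python. The similarity function and threshold here are toy stand-ins of my own; real systems compare feature vectors such as iris codes or face embeddings, not raw bytes:

```python
THRESHOLD = 0.9  # illustrative match threshold, not from any real system

def similarity(sample, template):
    # Toy stand-in: fraction of matching bytes. Real matchers use
    # e.g. Hamming distance on iris codes or embedding distances.
    matches = sum(a == b for a, b in zip(sample, template))
    return matches / max(len(sample), len(template))

def authenticate(sample, enrolled_template):
    """1:1 check: is this person who they claim to be?"""
    return similarity(sample, enrolled_template) >= THRESHOLD

def identify(sample, database):
    """1:N search: who, if anyone, is this person?"""
    best_id, best_score = None, 0.0
    for person_id, template in database.items():
        score = similarity(sample, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None
```

Authentication is a single one-to-one comparison against one enrolled template; identification searches the entire database, which is one reason false matches become more likely as a biometric database grows.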
<p><a href="https://www.cbp.gov/travel/biometrics">U.S. law enforcement</a> and international <a href="https://www.cntraveler.com/story/how-airports-are-using-biometrics-so-you-can-spend-less-time-waiting-in-lines">travel-related companies</a> alike tend to use biometrics in their work. <a href="https://le.fbi.gov/science-and-lab/biometrics-and-fingerprints/biometrics/next-generation-identification-ngi">That ranges from identifying</a> re-offending criminals across multiple jurisdictions to quickly identifying people as they <a href="https://www.tsa.gov/biometrics-technology#:%7E:text=During%20the%20tests%2C%20TSA%20will,between%20TSA%20officers%20and%20passengers.">pass through an airport</a> or cross an international border. </p>
<h2>Cybersecurity challenges</h2>
<p>For groups of people like refugees who might not be carrying passports or other forms of identification, biometrics provides a convenient and reliable way to verify their identities while reducing the risk of fraud.</p>
<p>Aid workers can also use <a href="https://www.asisonline.org/security-management-magazine/monthly-issues/security-technology/archive/2021/december/reaching-the-remote-with-fingerprint-biometrics/">biometrics systems in remote areas</a> with limited cell service or internet, which is common in refugee processing centers in poor countries. </p>
<p>More than <a href="https://www.abc.net.au/news/science/2019-06-21/biometric-data-is-being-collected-from-refugees-asylum-seekers/11209274">80% of the refugees</a> registered with UNHCR have a biometric record. In most cases, this is considered a standard practice that is necessary for refugees to receive aid. </p>
<p>In Jordan, for instance, <a href="https://help.unhcr.org/jordan/wp-content/uploads/sites/46/2022/04/Biometrics-EN_Final_April2022.pdf">UNHCR uses</a> iris scans to identify refugees and distribute monthly allowances. </p>
<h2>Human rights concerns</h2>
<p>But refugees and advocacy groups alike have voiced <a href="https://odi.org/en/publications/digital-identity-biometrics-and-inclusion-in-humanitarian-responses-to-refugee-crises/">human rights concerns</a>, arguing that collecting refugees’ biometric data can put an already vulnerable group at risk. That can happen if a militant group or government that pushed people to become refugees gets hold of their personal information and is able to potentially identify them if they are in hiding. </p>
<p>Unlike passwords and PINs, fingerprints and facial features are unique and cannot be changed if there is a security breach. </p>
<p>Ukrainians in need of aid following Russia’s invasion of Ukraine <a href="https://www.thenewhumanitarian.org/opinion/2023/07/11/you-dont-need-demand-sensitive-biometric-data-give-aid-ukraine-response-shows">have pushed back on UNHCR</a> and other U.N. agencies using biometrics. As a result, it has become more common there for people to be registered in other ways, such as by using their Ukrainian national tax identity numbers or their passports. </p>
<p>Another concern observers have raised is that if a biometric database is breached, <a href="https://www.ibm.com/topics/cyber-attack">cybercriminals can take</a> people’s data and try to impersonate them and steal their identities. </p>
<p>Security breaches can be <a href="https://www.crowdstrike.com/cybersecurity-101/threat-actor/">particularly dangerous</a> for refugees.</p>
<p>Researchers at the University of North Carolina exposed vulnerabilities in biometric systems in 2016 <a href="https://www.wired.com/2016/08/hackers-trick-facial-recognition-logins-photos-facebook-thanks-zuck/">when they designed</a> an experiment to spoof facial recognition systems. The researchers downloaded social media photos of volunteers and used the images to construct three-dimensional replicas of faces. The 3D-developed faces successfully tricked four of the five facial recognition systems. </p>
<h2>Things have gone wrong</h2>
<p>Refugees and other people in vulnerable positions have experienced devastating consequences after <a href="https://www.theguardian.com/global-development/2021/jun/15/un-put-rohingya-at-risk-by-sharing-data-without-consent-says-rights-group">having their biometric data breached</a>. </p>
<p>For instance, the Taliban in Afghanistan seized the <a href="https://theconversation.com/the-taliban-may-have-access-to-the-biometric-data-of-civilians-who-helped-the-u-s-military-166475">U.S. military’s biometric collection and identification</a> devices in August 2021 after the U.S. withdrew its final troops from Afghanistan. The U.S. collected and used this <a href="https://theintercept.com/2021/08/17/afghanistan-taliban-military-biometrics/">data to track terrorists</a> and other potential insurgents. </p>
<p>Human rights activists expressed <a href="https://www.nbcnews.com/tech/security/us-built-biometric-system-sparks-concerns-afghans-rcna1829">concern that the Taliban could use the biometric</a> data to identify – and target – Afghans who helped the U.S. coalition forces by serving as translators and in other positions after the U.S. withdrawal. </p>
<p>The <a href="https://www.army.mil/article/51768/troopers_deploy_hiide_system_at_border_crossing_point">biometric devices</a> contained Afghans’ biometric data, including iris scans and fingerprints. </p>
<p>While the Taliban have said that they will not retaliate against Afghans who had worked with the U.S. and other Western coalition forces, the U.N. has tied <a href="https://www.technologyreview.com/2021/08/30/1033941/afghanistan-biometric-databases-us-military-40-data-points/">reports of civilians and Afghan soldiers being executed</a> to compromised U.S. biometrics databases. </p>
<p>Similarly, in 2021 news reports revealed that the U.N. shared the <a href="https://www.thenewhumanitarian.org/opinion/2021/6/21/rohingya-data-protection-and-UN-betrayal">biometric data of more than 800,000 Rohingya refugees</a> living in Bangladesh with the government there. The Bangladeshi government then shared the information with the Myanmar government – the same government that Rohingya refugees feared would hurt or kill them. </p>
<p>The U.S.-based advocacy group Human Rights Watch reported that the U.N. <a href="https://www.hrw.org/news/2021/06/15/un-shared-rohingya-data-without-informed-consent">had informed Rohingya refugees</a> that they needed to give their biometrics information in order to receive lifesaving aid and other services from the U.N. Some people interviewed in refugee camps said that they went into hiding after they learned that their information had been shared. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman wears a face mask and stands next to a computer next to a small child. A man in a green uniform and a mask holds her finger down near the computer." src="https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/537878/original/file-20230717-248129-moawde.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A migrant and her daughter have their biometric information entered at a Texas immigrant detention center in 2021.</span>
<span class="attribution"><a class="source" href="https://media.gettyimages.com/id/1232022951/photo/topshot-us-texas-border-immigration-detention.jpg?s=612x612&w=gi&k=20&c=0S_KzPxMiQEr4qn3baY5OH_LMolPLhMwb9H3j6wpXUY=">Dario Lopez-Mills/AFP via Getty Images</a></span>
</figcaption>
</figure>
<h2>A need for reform</h2>
<p>I believe that there is a need to consider whether and how refugees are giving consent for the recording <a href="https://www.theguardian.com/global-development/2021/jun/15/un-put-rohingya-at-risk-by-sharing-data-without-consent-says-rights-group">of their personal information</a> – and whether refugees are fully informed of the inherent risks associated with biometric system use. </p>
<p>At a minimum, I think that UNHCR and other groups collecting biometric data should set up stronger <a href="https://www.crowdstrike.com/cybersecurity-101/zero-trust-security/">security models</a> and undertake routine <a href="https://www.cisa.gov/sites/default/files/2023-02/22_1201_safecom_guide_to_cybersecurity_risk_assessment_508-r1.pdf">cyber risk assessments</a> to understand evolving threats. </p>
<p>Without the necessary money and technological ability to respond to cyberthreats, U.N. agencies and others will remain vulnerable to cyberattacks, which can undermine people’s rights and ability to find safe refuge.</p>
<p class="fine-print"><em><span>Joseph K. Nwankpa does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Capturing biometric data helps UN agencies and other groups avoid the risk of fraud and increase efficiency. But the practice is complicated and has created security risks for vulnerable groups.Joseph K. Nwankpa, Associate Professor of Information Systems & Analytics, Miami UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1912632022-11-29T19:10:32Z2022-11-29T19:10:32ZAs more biometric data is collected in schools, parents need to ask these 10 questions<figure><img src="https://images.theconversation.com/files/497797/original/file-20221128-26-1ysvrd.jpg?ixlib=rb-1.1.0&rect=28%2C28%2C4805%2C2493&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A Sydney high school recently introduced fingerprint technology to “<a href="https://www.abc.net.au/news/2022-09-06/moorebank-high-school-fingerprints-students-going-to-toilet/101410544">help narrow down</a>” students who were vandalising school toilets. </p>
<p>Under the plan, students needed to scan their fingerprints to get access to the toilets or <a href="https://www.9news.com.au/national/moorebank-high-school-introduces-fingerprint-scanning-technology-to-stop-graffiti-and-anti-social-behaviour/e9fd3dc4-3420-4a58-a04d-e40f88c2f91d">pick up a swipe card</a> if they opted out. </p>
<p>Some parents were supportive, but other parents and <a href="https://digitalrightswatch.org.au/2022/09/07/nswdet-letter-biometric-surveillance/">digital rights advocates</a> raised privacy and security concerns. The NSW Education Department <a href="https://www.news.com.au/lifestyle/parenting/school-life/sydney-high-school-backs-down-on-fingerprint-scanning-for-students-to-use-toilets/news-story/818cdce419c79fef50f6fbe9fe06da49">has since noted</a> the school is still considering how it will handle anti-social behaviour and the community will be “consulted”. </p>
<p>While the fingerprint plan appears to have stalled, it shows how easily biometric technology can be introduced into schools. </p>
<p>This debate may seem new to Australian parents but it is set to become an increasing issue, thanks to a rapidly growing education technology (“edtech”) sector. </p>
<p>How is biometric data being used in schools and what questions do parents need to ask? </p>
<h2>Biometric data in schools</h2>
<p>Biometrics measure a person’s unique physical or behavioural characteristics to identify them. This could be a fingerprint, face, iris, voice, or the way you walk, type, behave, or express an emotion.</p>
<p>Biometric technology was first introduced in schools in the United Kingdom around 2000. It has since become a <a href="https://defenddigitalme.org/research/state-biometrics-2022/">routine part</a> of school life. Fingerprints and facial recognition are used for things like canteen payments, library borrowing, door access, photocopying, locker access, vending machines and laptop access.</p>
<figure class="align-center ">
<img alt="A student walks through library shelves." src="https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/497799/original/file-20221129-12-nq66tz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Biometric technology is used for regular school activities like borrowing books, buying canteen food and registering attendance.</span>
<span class="attribution"><span class="source">Redd F/ Unsplash</span></span>
</figcaption>
</figure>
<p>It is <a href="https://www.tandfonline.com/doi/full/10.1080/17439884.2020.168601">often argued</a> these technologies save money, time, are efficient and can respond to students’ individual needs.</p>
<p>In the United States, fingerprint technology was introduced in some schools around 2006. Schools also use palm scanning and facial recognition technology, although a small number of states have laws regulating the use of biometric technology in schools and Florida has banned it completely. </p>
<p>We don’t yet have a clear sense about the extent to which biometric data is collected in Australian schools. But in 2018 <a href="https://www.theage.com.au/national/victoria/minority-report-crackdown-on-facial-recognition-technology-in-schools-20181005-p5080p.html">concerns were raised</a> over trials of facial recognition technology to mark the roll in some Victorian schools. In 2015, parents <a href="https://www.adelaidenow.com.au/news/south-australia/east-para-primary-school-pupils-to-have-fingerprints-scanned-as-part-of-new-student-attendance-recordkeeping-program/news-story/6623d38216455a7d4db7482c8b695aad">raised privacy concerns</a> when a South Australian primary school asked students for a fingerprint to “register” for the day.</p>
<h2>Why is this a problem?</h2>
<p>The UK’s <a href="https://www.gov.uk/government/people/fraser-sampson">commissioner for biometric material</a> Fraser Sampson is calling for a <a href="https://defenddigitalme.org/research/state-biometrics-2022/#foreword">ban of biometrics in UK schools</a>. As he said in a report this year: </p>
<blockquote>
<p>Harm is already very real […] Further risks to the rights and freedoms, and full and free development of the child, may not be fully realised yet.</p>
</blockquote>
<p>This is similar to other calls in France and Sweden. We do not have enough independent research or a broad enough understanding of potential harms, which could range from privacy and security breaches to identity theft and infringements upon children’s rights and freedoms.</p>
<h2>The rise of edtechs</h2>
<p>Meanwhile, biometrics are part of a booming edtech sector, which aims to use technology to improve teaching and learning outcomes. These technologies can be used for school management as well as in the classroom. </p>
<p><a href="https://www.pwc.com.au/government/government-matters/education-tech-edtech-revolutionise-education-institutions.html">According to PwC</a>, edtech is the second largest startup community in Australia (behind financial technology) and has more than doubled since 2017. Globally it is estimated to be worth US$250 billion (A$376 billion). </p>
<p>But while edtech companies collect information about students, we still don’t have a good understanding of how this is then used. Or adequate regulations to protect this information. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/edtech-is-treating-students-like-products-heres-how-we-can-protect-childrens-digital-rights-184312">Edtech is treating students like products. Here's how we can protect children's digital rights</a>
</strong>
</em>
</p>
<hr>
<p>Earlier this year in New York, about 820,000 public school students had <a href="https://www.k12dive.com/news/data-breach-exposes-820k-new-york-city-students-information/621352/">personal information exposed</a> after a cyber attack on a company that provides software to track grades and attendance. </p>
<p>Biometric technology can easily be integrated into everyday edtech and school operations to manage things like <a href="https://link.springer.com/article/10.1007/s12008-021-00760-6">attendance, exams and how students learn</a>. </p>
<p>Reports by the UK <a href="https://digitalfuturescommission.org.uk/beneficial-uses-of-education-data/">Digital Futures Commission</a> highlight the intense pressures and uncertainties schools, students, and parents/caregivers face in a rapidly expanding edtech system. Many school community members <a href="https://digitalfuturescommission.org.uk/wp-content/uploads/2021/06/Governance-of-data-for-children-learning.pdf">struggle to make informed choices</a>. </p>
<h2>10 questions to ask about these issues</h2>
<p>Australia lags behind other countries in understanding the short and long-term repercussions of biometrics in schools. But we can catch up. </p>
<p>Going forward we need more community education about different biometric technologies and a public register, so there is transparency about where technologies are being used, introduced and refused. </p>
<p>Parents, teachers and school communities need to be better equipped to scrutinise the potential benefits and harms. In most cases this will also need technical, ethical, policy and legal expertise. </p>
<p>During a recent <a href="https://education-futures-studio.sydney.edu.au/2022/09/story-five/">workshop</a> between universities, industry and <a href="https://www.techforsocialgood.org/about">advocacy groups</a>, we developed information to help parents, schools and policymakers think about these issues and work together to discuss them. Next year we will release a resource for people to learn about edtech, register specific cases across schools, and critically evaluate technologies. </p>
<p>In the meantime, here are some basic questions parents can ask if a biometric technology is being used or proposed in their child’s school:</p>
<p><strong>1.</strong> Exactly what information is being collected, when and why?</p>
<p><strong>2.</strong> How is the data being stored, processed and analysed?</p>
<p><strong>3.</strong> Who has access to the system, and how will it be maintained over time?</p>
<p><strong>4.</strong> What data privacy and security provisions are in place?</p>
<p><strong>5.</strong> What happens if I or my child opts out?</p>
<p><strong>6.</strong> What are the implications for the time and expertise of teachers and other school staff?</p>
<p><strong>7.</strong> Is there enough independent evidence to support claims that a new technology will improve learning or school operations?</p>
<p><strong>8.</strong> How will funding this technology affect other school budget and resourcing priorities?</p>
<p><strong>9.</strong> Is there another way to address this issue, rather than using a biometric solution?</p>
<p><strong>10.</strong> Has my school community had a meaningful opportunity to learn about and discuss this change?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-is-tech-giant-apple-trying-to-teach-our-teachers-186752">Why is tech giant Apple trying to teach our teachers?</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Kalervo Gulson receives funding from the Australian Research Council </span></em></p><p class="fine-print"><em><span>Terry Flew receives funding from the Australian Research Council and the Canadian Social Science and Humanities Research Council. </span></em></p><p class="fine-print"><em><span>Fiona Suwana and Teresa Swist do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Biometric data can be used in schools to track everything from attendance to exam behaviour and what students buy from the canteen.Teresa Swist, Researcher, University of SydneyFiona Suwana, Lecturer, University of SydneyKalervo Gulson, Professor and ARC Future Fellow, Education & Social Work, Education Futures Studio, University of SydneyTerry Flew, Professor of Digital Communications and Culture, The University of Sydney, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1834472022-05-24T06:01:41Z2022-05-24T06:01:41ZPay ‘with a smile or a wave’: why Mastercard’s new face recognition payment system raises concerns<figure><img src="https://images.theconversation.com/files/464421/original/file-20220520-19-yabkx8.jpeg?ixlib=rb-1.1.0&rect=0%2C31%2C3943%2C2787&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Mastercard’s <a href="https://www.mastercard.com/news/press/2022/may/with-a-smile-or-a-wave-paying-in-store-just-got-personal/">“smile to pay”</a> system, announced last week, is supposed to save time for customers at checkouts. It is being trialled in Brazil, with future pilots planned for the Middle East and Asia.</p>
<p>The company argues touchless technology will help speed up transaction times, shorten lines in shops, heighten security and improve hygiene in businesses. But it raises concerns relating to customer privacy, data storage, crime risk and bias.</p>
<h2>How will it work?</h2>
<p>Mastercard’s biometric checkout system will let customers pay via facial recognition, by linking the biometric authentication systems of a number of third-party companies with Mastercard’s own payment systems.</p>
<p>A Mastercard spokesperson told The Conversation it had already partnered with NEC, Payface, Aurus, Fujitsu Limited, PopID and PayByFace, with more providers to be named. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="The 'Fujitsu' logo in red is displayed on a building's side" src="https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/464953/original/file-20220524-22-ga0v7l.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Mastercard has partnered with Fujitsu, a massive information and communications technology firm offering many different products and services.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>They said “providers need to go through independent laboratory certification against the program criteria to be considered” – but details of these criteria aren’t yet publicly available.</p>
<p>According to <a href="https://www.siliconrepublic.com/business/mastercard-facial-recognition-biometric-payments">media</a> reports, customers will have to install an app, which will take their picture and collect their payment information. This information will be stored on the third-party provider’s servers.</p>
<p>At the checkout, the customer’s face will be matched with the stored data. And once their identity is verified, funds will be deducted automatically. The “wave” option is a bit of a trick: as the customer watches the camera while waving, the camera still scans their face – not their hand.</p>
<p>Similar authentication technologies are used on smartphones (face ID) and in many airports around the world, including “<a href="https://www.abf.gov.au/entering-and-leaving-australia/smartgates/arrivals">smartgates</a>” in Australia.</p>
<p><a href="https://www.theverge.com/2017/9/4/16251304/kfc-china-alipay-ant-financial-smile-to-pay">China</a> started using biometrics-based checkout technology back in 2017. But Mastercard is among the first to launch such a system in Western markets – competing with the “pay with your palm” <a href="https://techcrunch.com/2020/09/29/amazon-introduces-the-amazon-one-a-way-to-pay-with-your-palm-when-entering-stores/">system</a> used at cashier-less Amazon Go and Whole Foods brick-and-mortar stores in the United States.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-facial-analysis-is-scientifically-questionable-should-we-be-using-it-for-border-control-155474">AI facial analysis is scientifically questionable. Should we be using it for border control?</a>
</strong>
</em>
</p>
<hr>
<h2>What we don’t know</h2>
<p>Much about the precise functioning of Mastercard’s system isn’t clear. How accurate will the facial recognition be? Who will have access to the databases of biometric data? </p>
<p>A Mastercard spokesperson told The Conversation customers’ data would be stored with the relevant biometric service provider in encrypted form, and removed when the customer “indicates they want to end their enrolment”. But how will the removal of data be enforced if Mastercard itself can’t access it?</p>
<p>Obviously, privacy protection is a major concern, especially when there are many potential third-party providers involved.</p>
<p>On the bright side, Mastercard’s <a href="https://www.investopedia.com/articles/markets/032615/how-mastercard-makes-its-money-ma.asp">customers</a> will have a choice as to whether or not they use the biometrics checkout system. However, it will be at retailers’ discretion whether they offer it at all, or offer it as the only payment option.</p>
<p>Similar face-recognition technologies used in airports, and <a href="https://www.brookings.edu/research/police-surveillance-and-facial-recognition-why-data-privacy-is-an-imperative-for-communities-of-color/">by police</a>, often offer no choice. </p>
<p>We can assume Mastercard and the biometrics provider with whom they partner will require customer consent, as per most privacy laws. But will customers know what they are consenting to? </p>
<p>Ultimately, the biometric service providers Mastercard teams up with will decide how they use the data, for how long, where they store it, and who can access it. Mastercard will merely decide what providers are “good enough” to be accepted as partners, and the minimum standards they must adhere to. </p>
<p>Customers who want the convenience of this checkout service will have to consent to all the related data and privacy terms. And as reports have noted, there is potential for Mastercard to integrate the feature with loyalty schemes and make personalised recommendations <a href="https://www.cnbc.com/2022/05/17/mastercard-launches-tech-that-lets-you-pay-with-your-face-or-hand.html">based on purchases</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/fingerprint-login-should-be-a-secure-defence-for-our-data-but-most-of-us-dont-use-it-properly-127442">Fingerprint login should be a secure defence for our data, but most of us don't use it properly</a>
</strong>
</em>
</p>
<hr>
<h2>Accuracy is a problem</h2>
<p>While the accuracy of face recognition technologies has previously been challenged, the current <em>best</em> facial authentication algorithms have an error rate of just 0.08%, according to tests by the <a href="https://github.com/usnistgov/frvt/blob/nist-pages/reports/1N/frvt_1N_report_2020_03_27.pdf">National Institute of Standards and Technology</a>. In some countries, even banks have <a href="https://techhq.com/2020/09/biometrics-the-most-secure-solution-for-banking/">become comfortable</a> relying on it to log users into their accounts.</p>
<p>Yet we can’t know how accurate the technologies used in Mastercard’s biometric checkout system will be. The algorithms underpinning a technology can work almost perfectly when trialled in a lab, but perform <a href="https://www.csis.org/blogs/technology-policy-blog/how-accurate-are-facial-recognition-systems-%E2%80%93-and-why-does-it-matter">poorly</a> in real-life settings, where lighting, angles and other parameters vary.</p>
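To get a feel for what even the best-case laboratory figure means at payment-network scale, here is a rough back-of-envelope sketch. The transaction volume is a hypothetical illustration, and the calculation assumes, simplistically, that errors are independent and that the lab rate holds in the field (in practice, field rates are typically higher):

```python
# Back-of-envelope: expected face-matching errors at scale.
# The 0.08% figure is the best-case lab result cited above; treat
# the output as a lower bound on real-world errors.

def expected_errors(transactions: int, error_rate: float) -> int:
    """Expected number of mis-matches, assuming independent errors."""
    return round(transactions * error_rate)

lab_rate = 0.0008               # 0.08%, per NIST's best lab algorithms
daily_transactions = 1_000_000  # hypothetical volume, for illustration only

print(expected_errors(daily_transactions, lab_rate))  # 800 errors per million attempts
```

Even under these generous assumptions, a million daily checkout attempts would yield hundreds of failed or mistaken matches every day.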
<h2>Bias is another problem</h2>
<p>In a 2019 study, NIST <a href="https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf#page=5">found</a> that out of 189 facial recognition algorithms, the majority were biased. Specifically, they were less accurate on people from racial and ethnic minorities. </p>
<p>Even if the technology has improved in the past few years, it’s not foolproof. And we don’t know the extent to which Mastercard’s system has overcome this challenge.</p>
<p>If the software fails to recognise a customer at the checkout, they might end up disappointed, or even become irate – which would completely undo any promise of speed or convenience.</p>
<p>But if the technology misidentifies a person (for instance, John is recognised as Peter – or <a href="https://www.youtube.com/watch?v=e8-yupM-6Oc">twins are confused</a> for each other), then money could be taken from the wrong person’s account. How would such a situation be dealt with?</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=617&fit=crop&dpr=1 600w, https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=617&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=617&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=776&fit=crop&dpr=1 754w, https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=776&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/464424/original/file-20220520-19-5hfuvx.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=776&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">There’s no evidence facial recognition technology is infallible. These systems can misidentify and also have biases.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Is the technology secure?</h2>
<p>We often hear about software and databases being hacked, even in <a href="https://www.csoonline.com/article/2130877/the-biggest-data-breaches-of-the-21st-century.html">cases of</a> supposedly very “secure” organisations. Despite Mastercard’s <a href="https://www.cnbc.com/2022/05/17/mastercard-launches-tech-that-lets-you-pay-with-your-face-or-hand.html">efforts</a> to ensure security, there’s no guarantee the third-party providers’ databases – with potentially millions of people’s biometric data – won’t be hacked.</p>
<p>In the wrong hands, this data could lead to <a href="https://www.comparitech.com/identity-theft-protection/identity-theft-statistics/">identity theft</a>, which is one of the fastest growing types of crime, and financial fraud. </p>
<h2>Do we want it?</h2>
<p>Mastercard suggests 74% of customers are in favour of using such technology, referencing a stat from its <a href="https://www.mastercard.com/news/ap/en/newsroom/press-releases/en/2020/april/mastercard-study-shows-consumers-moving-to-contactless-payments-for-everyday-purchases/">own study</a> – also used by <a href="https://www.mastercard.com/news/ap/en/newsroom/press-releases/en/2020/october/mastercard-idemia-and-matchmove-pilot-fingerprint-biometric-card-in-asia-to-enhance-security-and-safety-of-contactless-payments">business partner</a> Idemia (a company that sells biometric identification products). </p>
<p>But the report cited is vague and brief. Other studies show entirely different results. For example, <a href="https://www.getapp.com/resources/facial-recognition-technology/#how-comfortable-are-consumers-with-facial-recognition-technology">this study</a> suggests 69% of customers aren’t comfortable with face recognition tech being used in retail settings. And <a href="https://www.securitymagazine.com/articles/93521-are-consumers-comfortable-with-facial-recognition-it-depends-says-new-study">this one</a> shows only 16% trust such tech.</p>
<p>Also, if consumers knew the risks the technology poses, the number of those willing to use it might drop even lower.</p>
<p class="fine-print"><em><span>Rita Matulionyte receives funding from Lithuanian Research Council for the research project 'Government Use of Facial Recognition Technologies: Legal Challenges and Possible Solutions' (2021-2023). She is affiliated with Australian Society for Computers and Law (AUSCL). </span></em></p>The technology is currently being trialled outside of Australia. It’s one of the first major attempts to bring it to western markets on a large scale.Rita Matulionyte, Senior Lecturer in Law, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1758172022-02-01T13:16:12Z2022-02-01T13:16:12ZGovernment agencies are tapping a facial recognition company to prove you’re you – here’s why that raises concerns about privacy, accuracy and fairness<figure><img src="https://images.theconversation.com/files/443239/original/file-20220128-19-ghy893.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C8000%2C5317&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Beginning this summer, you might need to upload a selfie and a photo ID to a private company, ID.me, if you want to file your taxes online.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/young-woman-using-smartphone-while-working-with-royalty-free-image/1224140562">Oscar Wong/Moment via Getty Images</a></span></figcaption></figure><p>The U.S. Internal Revenue Service is planning to <a href="https://www.irs.gov/newsroom/irs-unveils-new-online-identity-verification-process-for-accessing-self-help-tools">require citizens to create accounts</a> with a private facial recognition company in order to file taxes online. The IRS is joining a growing number of federal and state agencies that have contracted with <a href="https://www.id.me/">ID.me</a> to authenticate the identities of people accessing services.</p>
<p>The IRS’s move is aimed at cutting down on identity theft, a crime that <a href="https://www.ftc.gov/system/files/documents/reports/consumer-sentinel-network-data-book-2020/csn_annual_data_book_2020.pdf">affects millions of Americans</a>. The IRS, in particular, has reported a number of tax filings from people claiming to be others, and <a href="https://www.cnbc.com/2021/12/21/criminals-have-stolen-nearly-100-billion-in-covid-relief-funds-secret-service.html">fraud in many of the programs</a> that were administered as part of the <a href="https://www.whitehouse.gov/american-rescue-plan/">American Rescue Plan</a> has been a major concern to the government.</p>
<p>The IRS decision has prompted a backlash, in part over concerns about requiring citizens to use facial recognition technology and in part over difficulties some people have had in using the system, particularly with some state agencies that provide unemployment benefits. The reaction has prompted the IRS to <a href="https://www.bloomberg.com/news/articles/2022-01-28/treasury-weighing-id-me-alternatives-over-privacy-concerns?sref=Hjm5biAW">revisit its decision</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a webpage with the IRS logo in the top left corner and buttons for creating or logging into an account" src="https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=309&fit=crop&dpr=1 600w, https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=309&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=309&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=388&fit=crop&dpr=1 754w, https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=388&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/443053/original/file-20220127-9782-2f0nex.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=388&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Here’s what greets you when you click the link to sign into your IRS account. If current plans remain in place, the blue button will go away in the summer of 2022.</span>
<span class="attribution"><a class="source" href="https://sa.www4.irs.gov/secureaccess/ui/?TYPE=33554433&REALMOID=06-0006b18e-628e-1187-a229-7c2b0ad00000&GUID=&SMAUTHREASON=0&METHOD=GET&SMAGENTNAME=-SM-u0ktItgVFneUJDzkQ7tjvLYXyclDooCJJ7%2bjXGjg3YC5id2x9riHE98hoVgd1BBv&TARGET=-SM-http%3a%2f%2fsa%2ewww4%2eirs%2egov%2fola%2f">Screenshot, IRS sign-in webpage</a></span>
</figcaption>
</figure>
<p>As a <a href="https://scholar.google.com/citations?user=JNPbTdIAAAAJ&hl=en">computer science researcher</a> and the chair of the <a href="https://www.acm.org/public-policy/tpc">Global Technology Policy Council of the Association for Computing Machinery</a>, I have been involved in exploring some of the issues with government use of facial recognition technology, both its use and its potential flaws. There have been a great number of concerns raised over the general <a href="https://theconversation.com/feds-are-increasing-use-of-facial-recognition-systems-despite-calls-for-a-moratorium-145913">use of this technology in policing and other government functions</a>, often focused on whether the accuracy of these algorithms can have discriminatory effects. In the case of ID.me, there are other issues involved as well.</p>
<h2>ID dot who?</h2>
<p>ID.me is a private company that <a href="https://www.bloomberg.com/news/features/2022-01-20/cybersecurity-company-id-me-is-becoming-government-s-digital-gatekeeper?sref=Hjm5biAW">formed as TroopSwap</a>, a site that offered retail discounts to members of the armed forces. As part of that effort, the company created an ID service so that military staff who qualified for discounts at various companies could prove they were, indeed, service members. In 2013, the company renamed itself ID.me and started to market its ID service more broadly. The U.S. Department of Veterans Affairs began using the technology in 2016, the company’s first government use.</p>
<p>To use ID.me, a user loads a mobile phone app and takes a selfie – a photo of their own face. ID.me then compares that image to various IDs that it obtains either through open records or through information that applicants provide through the app. If it finds a match, it creates an account and uses face recognition to verify the user’s identity from then on. If it cannot find a match, users can contact a “trusted referee” and have a video call to fix the problem.</p>
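The enrollment flow described above can be sketched as a simple decision procedure. ID.me’s actual implementation is proprietary and not public, so every name, score and threshold below is a hypothetical stand-in for illustration only:

```python
# Hypothetical sketch of the selfie-to-ID matching flow described above.
# The similarity score and threshold are invented for illustration; a real
# system would compute the score with a face-matching model.

MATCH_THRESHOLD = 0.9  # hypothetical minimum similarity for an automatic match

def enroll(selfie_similarity: float) -> str:
    """Return the outcome of an enrollment attempt.

    selfie_similarity stands in for how closely the selfie matches the
    ID photos on record (open records or applicant-provided documents).
    """
    if selfie_similarity >= MATCH_THRESHOLD:
        # Automatic match: account created, face used for later logins.
        return "account created"
    # No automatic match: fall back to a human "trusted referee" video call.
    return "escalated to trusted referee"

print(enroll(0.95))  # account created
print(enroll(0.42))  # escalated to trusted referee
```

The design choice worth noting is the human fallback: because no matcher is perfectly accurate, any automated pipeline of this kind needs an escalation path, and much of the reported user frustration concerns exactly that fallback step.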
<p>A number of companies and <a href="https://www.usnews.com/news/technology/articles/2021-07-22/factbox-states-using-idme-rival-identity-check-tools-for-jobless-claims">states</a> have been using ID.me for several years. News reports have documented <a href="https://www.cpr.org/2021/05/10/unemployment-payouts-have-dropped-40-percent-is-id-me-stopping-scams-or-blocking-benefits/">problems people have had with ID.me</a> failing to authenticate them, and with the company’s customer support in resolving those problems. Also, the system’s technology requirements <a href="https://www.usnews.com/news/best-states/colorado/articles/2021-05-02/system-for-unemployment-benefits-exposes-digital-divide">could widen the digital divide</a>, making it harder for many of the people who need government services the most to access them. </p>
<p>But much of the concern about the IRS and other federal agencies using ID.me revolves around its use of facial recognition technology and collection of biometric data.</p>
<h2>Accuracy and bias</h2>
<p>To start with, there are a number of general concerns about the accuracy of facial recognition technologies and whether there are <a href="https://theconversation.com/ai-technologies-like-police-facial-recognition-discriminate-against-people-of-colour-143227">discriminatory biases</a> in their accuracy. These have led the Association for Computing Machinery, among other organizations, to <a href="https://theconversation.com/feds-are-increasing-use-of-facial-recognition-systems-despite-calls-for-a-moratorium-145913">call for a moratorium on government use</a> of facial recognition technology. </p>
<p>A study of commercial and academic facial recognition algorithms by the National Institute of Standards and Technology found that U.S. facial-matching algorithms generally have <a href="https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software">higher false positive rates for Asian and Black faces</a> than for white faces, although recent results have improved. ID.me claims that there is <a href="https://insights.id.me/viewpoint/no-identity-left-behind-american-increased-access-online-services/">no racial bias</a> in its face-matching verification process. </p>
<p>There are many other conditions that can also cause inaccuracy – physical changes caused by illness or an accident, hair loss due to chemotherapy, changes in coloring due to aging, gender transitions and others. How any company, including ID.me, handles such situations is unclear, and this is one issue that has raised concerns. Imagine having a disfiguring accident and not being able to log into your medical insurance company’s website because of damage to your face.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/BqQT4sIOYA0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Facial recognition technology is spreading fast. Is the technology – and society – ready?</span></figcaption>
</figure>
<h2>Data privacy</h2>
<p>There are other issues that go beyond the question of just how well the algorithm works. As part of its process, ID.me collects a very large amount of personal information. It has a very long and difficult-to-read privacy policy, but essentially, while ID.me doesn’t share most personal information, it does share various information about internet use and website visits with other partners. The nature of these exchanges is not immediately apparent.</p>
<p>So one question that arises is what level of information the company shares with the government, and whether the information can be used to track U.S. citizens beyond the regulated boundaries that apply to government agencies. Privacy advocates on both the left and right have long opposed any form of a mandatory uniform government identification card. Does handing off the identification to a private company allow the government to essentially achieve this through subterfuge? It’s not difficult to imagine that some states – and maybe eventually the federal government – could insist on an identification from ID.me or one of its competitors to access government services, get medical coverage and even to vote.</p>
<p>As Joy Buolamwini, an MIT AI researcher and founder of the <a href="https://www.ajl.org/">Algorithmic Justice League</a>, argued, beyond accuracy and bias issues is the question of <a href="https://www.theatlantic.com/ideas/archive/2022/01/irs-should-stop-using-facial-recognition/621386/">the right not to use biometric technology</a>. “Government pressure on citizens to share their biometric data with the government affects all of us — no matter your race, gender, or political affiliations,” she wrote.</p>
<h2>Too many unknowns for comfort</h2>
<p>Another issue is who audits ID.me for the security of its applications? While no one is accusing ID.me of bad practices, security researchers are worried about how the company may protect the incredible level of personal information it will end up with. Imagine a security breach that released the IRS information for millions of taxpayers. In the fast-changing world of cybersecurity, with threats ranging from individual hacking to international criminal activities, experts would like assurance that a company provided with so much personal information is using state-of-the-art security and keeping it up to date. </p>
<p>Much of the questioning of the IRS decision comes because these are early days for government use of private companies to provide biometric security, and some of the details are still not fully explained. Even if you grant that the IRS use of the technology is appropriately limited, this is potentially the start of what could quickly snowball to many government agencies using commercial facial recognition companies to get around regulations that were put in place specifically to rein in government powers. </p>
<p>The U.S. stands at the edge of a slippery slope, and while that doesn’t mean facial recognition technology shouldn’t be used at all, I believe it does mean that the government should put a lot more care and due diligence into exploring the terrain ahead before taking those critical first steps.</p>
<p class="fine-print"><em><span>James Hendler receives funding from IBM, DARPA, and the NSF. He is a Professor at Rensselaer Polytechnic Institute, affiliated with the Association for Computing Machinery (ACM) and consults or has consulted for a number of government agencies. The opinions expressed in this piece are solely those of the author and do not necessarily represent the opinions of the ACM or any of the other organizations with which he is affiliated.</span></em></p>Federal and state governments are turning to a facial recognition company to ensure that people accessing services are who they say they are. The move promises to cut down on fraud, but at what cost?James Hendler, Professor of Computer, Web and Cognitive Sciences, Rensselaer Polytechnic InstituteLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1459132021-09-01T18:49:39Z2021-09-01T18:49:39ZFeds are increasing use of facial recognition systems – despite calls for a moratorium<figure><img src="https://images.theconversation.com/files/418725/original/file-20210831-17-1g1kwc7.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C1962%2C1467&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Government agencies are increasingly using facial recognition technology, including through security cameras like this one being installed on the Lincoln Memorial in 2019.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/technician-installs-a-security-camera-atop-of-the-lincoln-news-photo/1159591338">Mark Wilson/Getty Images</a></span></figcaption></figure><p>Despite growing opposition, the U.S. government is on track to increase its use of controversial facial recognition technology.</p>
<p>The U.S. Government Accountability Office released <a href="https://www.gao.gov/assets/gao-21-526.pdf">a report</a> on Aug. 24, 2021, detailing current and planned use of facial recognition technology by federal agencies. The GAO surveyed <a href="https://www.cfo.gov/about-the-council/">24 departments and agencies</a> – from the Department of Defense to the Small Business Administration – and found that 18 reported using the technology and 10 reported plans to <a href="https://www.technologyreview.com/2021/08/24/1032967/us-government-agencies-plan-to-increase-their-use-of-facial-recognition-technology/">expand their use of it</a>.</p>
<p>The report comes more than a year after the <a href="https://www.acm.org/public-policy/ustpc">U.S. Technology Policy Committee</a> of the Association for Computing Machinery, the world’s largest educational and scientific computing society, called for <a href="https://www.acm.org/binaries/content/assets/public-policy/ustpc-facial-recognition-tech-statement.pdf">an immediate halt</a> to virtually all government use of facial recognition technology. </p>
<p>The U.S. Technology Policy Committee is one of numerous groups and prominent figures, including the <a href="https://www.aclu.org/news/topic/stopping-face-recognition-surveillance/?redirect=facerecognition">ACLU</a>, the <a href="https://www.ala.org/advocacy/intfreedom/facialrecognitionresolution">American Library Association</a> and the United Nations <a href="https://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=24736">Special Rapporteur on Freedom of Opinion and Expression</a>, to call for curbs on use of the technology. A common theme of this opposition is the lack of standards and regulations for facial recognition technology.</p>
<p>A year ago, Amazon, IBM and Microsoft also announced that they would <a href="https://www.vox.com/recode/2020/6/10/21287194/amazon-microsoft-ibm-facial-recognition-moratorium-police">stop selling facial recognition technology</a> to police departments pending federal regulation of the technology. Congress is <a href="https://www.markey.senate.gov/news/press-releases/senators-markey-merkley-lead-colleagues-on-legislation-to-ban-government-use-of-facial-recognition-other-biometric-technology">weighing a moratorium</a> on government use of the technology. Some cities and states, notably <a href="https://slate.com/technology/2021/07/maine-facial-recognition-government-use-law.html">Maine</a>, have introduced restrictions.</p>
<h2>Why computing experts say no</h2>
<p>The Association for Computing Machinery’s U.S. Technology Policy Committee, which issued the call for a moratorium, includes computing professionals from academia, industry and government, a number of whom were actively involved in the development or analysis of the technology. As chair of the committee at the time the statement was issued and as a <a href="https://scholar.google.com/citations?user=JNPbTdIAAAAJ&hl=en">computer science researcher</a>, I can explain what prompted our committee to recommend this ban and, perhaps more significantly, what it would take for the committee to rescind its call.</p>
<p>If your cellphone doesn’t recognize your face and makes you type in your passcode, or if the photo-sorting software you’re using misidentifies a family member, no real harm is done. On the other hand, if you face arrest or are denied entrance to a facility because the recognition algorithms are imperfect, the impact can be drastic.</p>
<p>The statement we wrote outlines principles for the use of facial recognition technologies in these consequential applications. The first and most critical of these is the need to understand the accuracy of these systems. One of the key problems with these algorithms is that they <a href="https://theconversation.com/ai-technologies-like-police-facial-recognition-discriminate-against-people-of-colour-143227">perform differently for different ethnic groups</a>. </p>
<p>An <a href="https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software">evaluation of facial recognition vendors</a> by the U.S. National Institute of Standards and Technology found that the majority of the systems tested had clear differences in their ability to match two images of the same person when one ethnic group was compared with another. Another study found the algorithms are <a href="http://proceedings.mlr.press/v81/buolamwini18a.html?mod=article_inline">more accurate for lighter-skinned males</a> than for darker-skinned females. Researchers are also exploring how other features, such as age, disease and <a href="https://sheribyrnehaber.medium.com/disability-and-ai-bias-cced271bd533">disability status</a>, affect these systems. These studies are also <a href="https://www.csis.org/blogs/technology-policy-blog/how-accurate-are-facial-recognition-systems-%E2%80%93-and-why-does-it-matter">turning up disparities</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/-_ydGhdYd0M?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">MIT’s Joy Buolamwini explains her study finding racial and gender bias in facial recognition technology.</span></figcaption>
</figure>
<p>A number of other features affect the performance of these algorithms. Consider the difference between how you might look in a nice family photo you have shared on social media versus a picture of you taken by a grainy security camera, or a moving police car, late on a misty night. Would a system trained on the former perform well in the latter context? How <a href="https://www.csis.org/blogs/technology-policy-blog/how-accurate-are-facial-recognition-systems-%E2%80%93-and-why-does-it-matter">lighting, weather, camera angle and other factors affect these algorithms</a> is still an open question. </p>
<p>In the past, systems that matched <a href="http://onin.com/fp/ridgeology.pdf">fingerprints</a> or <a href="https://www.ncbi.nlm.nih.gov/books/NBK232607/">DNA traces</a> had to be formally evaluated, and standards set, before they were trusted for use by the police and others. Until facial recognition algorithms can meet similar standards – and researchers and regulators truly understand how the context in which the technology is used affects its accuracy – the systems shouldn’t be used in applications that can have serious consequences for people’s lives.</p>
<h2>Transparency and accountability</h2>
<p>It’s also important that organizations using facial recognition provide some form of meaningful advance and ongoing public notice. If a system can result in you losing your liberty or your life, you should know it is being used. In the U.S., this has been a principle for the use of many potentially harmful technologies, from speed cameras to <a href="https://www.law.berkeley.edu/files/Video_surveillance_guidelines.pdf">video surveillance</a>, and the USTPC’s position is that facial recognition systems should be held to the same standard.</p>
<p>To get transparency, there also must be rules that govern the collection and use of the personal information that underlies the training of facial recognition systems. The company Clearview AI, which now has software <a href="https://www.theverge.com/2020/8/26/21402978/clearview-ai-ceo-interview-2400-police-agencies-facial-recognition">in use by police agencies around the world</a>, is a <a href="https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html">case in point</a>. The company collected its data – photos of individuals’ faces – with no notification. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/sxQXARMJcys?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">PBS Nova explains Clearview AI’s massive database of images of people.</span></figcaption>
</figure>
<p>Clearview AI collected data from many different applications, vendors and systems, taking advantage of the <a href="https://www.nytimes.com/2018/04/01/opinion/facebook-lax-privacy-rules.html">lax laws controlling such collection</a>. Kids who post videos of themselves on TikTok, users who tag friends in photos on Facebook, consumers who make purchases with Venmo, people who upload videos to YouTube and many others all create images that can be linked to their names and scraped from these applications by companies like Clearview AI. </p>
<p>Are you in the dataset Clearview uses? You have no way to know. The ACM’s position is that you should have a right to know, and that governments should put limits on how this data is collected, stored and used.</p>
<p>In 2017, the Association for Computing Machinery U.S. Technology Policy Committee and its European counterpart released a <a href="https://www.acm.org/binaries/content/assets/public-policy/2017_joint_statement_algorithms.pdf">joint statement</a> on algorithms for automated decision-making about individuals that can result in harmful discrimination. In short, we called for policymakers to hold institutions using analytics to the same standards as for institutions where humans have traditionally made decisions, whether it be traffic enforcement or criminal prosecution.</p>
<p>This includes understanding the trade-offs between the risks and benefits of powerful computational technologies when they are put into practice and having clear principles about who is liable when harms occur. Facial recognition technologies are in this category, and it’s important to understand how to measure their risks and benefits and who is responsible when they fail.</p>
<h2>Protecting the public</h2>
<p>One of the primary roles of governments is to manage technology risks and protect their populations. The principles the Association for Computing Machinery’s USTPC has outlined have been used in regulating transportation systems, medical and pharmaceutical products, food safety practices and many other aspects of society. The USTPC is, in short, asking that governments recognize the potential for facial recognition systems to cause significant harm to many people, through errors and bias. </p>
<p>These systems are still in an early stage of maturity, and there is much that researchers, government and industry don’t understand about them. Until facial recognition technologies are better understood, their use in consequential applications should be halted until they can be properly regulated.</p>
<p class="fine-print"><em><span>James Hendler receives funding from the US Defense Advanced Research Projects Agency, the US National Science Foundation and IBM Corporation. </span></em></p>Politicians of all stripes, computer professionals and even big-tech executives are calling on government to hit the brakes on using these algorithms. The feds are hitting the gas.James Hendler, Professor of Computer, Web and Cognitive Sciences, Rensselaer Polytechnic InstituteLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1664652021-08-30T12:27:57Z2021-08-30T12:27:57ZAfghanistan’s Taliban reportedly have control of US biometric devices – a lesson in life-and-death consequences of data privacy<figure><img src="https://images.theconversation.com/files/417480/original/file-20210823-22-pzodwj.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C4043%2C2625&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A U.S. Army soldier scans the irises of an Afghan civilian in 2012 as part of an effort by the military to collect biometric information from much of the Afghan population.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/an-american-isaf-solider-from-team-apache-of-task-force-news-photo/149781425">Jose Cabezas/AFP via GettyImages</a></span></figcaption></figure><p>In the wake of the Taliban’s takeover of Kabul and the ouster of the Afghan national government in August 2021, <a href="https://www.reuters.com/article/afghanistan-tech-conflict/afghans-scramble-to-delete-digital-history-evade-biometrics-idUSL8N2PO1FH">alarming reports</a> indicated that the insurgents had potentially accessed biometric data collected by the U.S. to track Afghans, including people who worked for U.S. and coalition forces. </p>
<p>Afghans who once supported the U.S. have been attempting to <a href="https://www.theguardian.com/world/2021/aug/15/an-afghan-woman-in-kabul-now-i-have-to-burn-everything-i-achieved">hide</a> or <a href="https://timesofindia.indiatimes.com/world/south-asia/i-am-burning-my-id-card-and-fleeing-my-house-the-future-of-afghans-in-jeopardy-as-taliban-regains-control/articleshow/85422687.cms">destroy</a> physical and digital evidence of their identities. Many Afghans fear that the identity <a href="https://www.bbc.com/news/technology-58245121">documents</a> and <a href="https://www.politico.com/news/2021/08/24/taliban-afghan-data-target-allies-506638">databases</a> storing personally identifiable data could be transformed into <a href="https://www.wsj.com/articles/afghanistan-veterans-in-congress-trying-to-prevent-a-death-warrant-for-helping-america-11629299971">death warrants</a> in the hands of the Taliban, and a March 30, 2022, report from Human Rights Watch indicated the Taliban have been <a href="https://www.hrw.org/news/2022/03/30/new-evidence-biometric-data-systems-imperil-afghans">collecting biometric data</a> to potentially match against captured U.S. and Afghan government databases. U.S. military devices and the data they contain have since <a href="https://www.nytimes.com/2022/12/27/technology/for-sale-on-ebay-a-military-database-of-fingerprints-and-iris-scans.html">turned up on the open market</a>.</p>
<p>This data breach underscores that data protection in zones of <a href="https://theconversation.com/the-taliban-may-have-access-to-the-biometric-data-of-civilians-who-helped-the-u-s-military-166475">conflict</a>, especially biometric data and databases that connect online activity to physical locations, can be a matter of life and death. My <a href="https://pennstatelaw.psu.edu/faculty/hu">research</a> and the work of <a href="https://anniejacobsen.com">journalists</a> and <a href="https://dx.doi.org/10.2139/ssrn.2134481">privacy advocates</a> who study biometric cybersurveillance anticipated these data privacy and security risks.</p>
<h2>Biometric-driven warfare</h2>
<p>Investigative journalist Annie Jacobsen documented the birth of biometric-driven warfare in Afghanistan following the terrorist attacks on Sept. 11, 2001, in her book “<a href="https://www.penguinrandomhouse.com/books/624446/first-platoon-by-annie-jacobsen/">First Platoon</a>.” The U.S. Department of Defense quickly viewed biometric data and what it called “identity dominance” as the cornerstone of multiple counterterrorism and counterinsurgency strategies. Identity dominance means being able to keep track of people the military considers a potential threat regardless of aliases, and ultimately denying organizations the ability to use anonymity to hide their activities.</p>
<p>By 2004, thousands of U.S. military personnel had been trained to collect biometric data to support the wars in Afghanistan and Iraq. By 2007, U.S. forces were collecting biometric data primarily through mobile devices such as the <a href="https://www.nist.gov/system/files/documents/2021/03/23/ansi-nist_archived_vermury-bat-hiide.pdf">Biometric Automated Toolset</a> (BAT) and <a href="https://www.nist.gov/system/files/documents/2021/03/23/ansi-nist_archived_vermury-bat-hiide.pdf">Handheld Interagency Identity Detection Equipment</a> (HIIDE). BAT includes a laptop, fingerprint reader, iris scanner and camera. HIIDE is a single small device that incorporates a fingerprint reader, iris scanner and camera. Users of these devices can collect iris and fingerprint scans and facial photos, and match them to entries in military databases and biometric watchlists.</p>
<p>In addition to biometric data, the system includes biographic and contextual data such as criminal and terrorist watchlist records, enabling users to determine if an individual is flagged in the system as a suspect. Intelligence analysts can also use the system to monitor people’s movements and activities by tracking biometric data recorded by troops in the field.</p>
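Iris matching of the kind these devices perform is commonly implemented by encoding each iris as a bit string and comparing strings by fractional Hamming distance, the approach pioneered by John Daugman. The sketch below illustrates that generic technique only; it is not the actual BAT or HIIDE software, and the tiny codes, subject IDs and threshold are invented (real systems use roughly 2,048-bit codes with occlusion masks):

```python
# Generic sketch of iris-code matching by fractional Hamming distance.
# All codes, IDs and the 0.32 threshold are illustrative values only.

def hamming_fraction(code_a, code_b):
    """Fraction of bit positions that differ between two equal-length bit strings."""
    assert len(code_a) == len(code_b)
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

def match_against_watchlist(probe, watchlist, threshold=0.32):
    """Return the IDs whose enrolled code is within the distance threshold."""
    return [pid for pid, code in watchlist.items()
            if hamming_fraction(probe, code) <= threshold]

watchlist = {
    "subject-001": "1100101011010010",
    "subject-002": "0011010100101101",
}
probe = "1100101011010110"  # differs from subject-001 in a single bit

hits = match_against_watchlist(probe, watchlist)
```

The design choice worth noting is the distance threshold: lowering it reduces false matches against the watchlist but increases the chance a genuine enrollee goes unrecognized, a trade-off every deployment of such a system must set explicitly.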
<p>By 2011, a decade after 9/11, the Department of Defense <a href="https://www.gao.gov/assets/a317375.html">maintained approximately 4.8 million biometric records</a> of people in Afghanistan and Iraq, with about 630,000 of the records collected using HIIDE devices. Also by that time, the U.S. Army and its military partners in the Afghan government were using <a href="https://info.publicintelligence.net/CALL-AfghanBiometrics.pdf">biometric-enabled intelligence</a> or <a href="https://dx.doi.org/10.2139/ssrn.2886575">biometric cyberintelligence</a> on the battlefield to identify and track insurgents. </p>
<p>In 2013, the U.S. Army and Marine Corps used the <a href="https://www.marcorsyscom.marines.mil/News/News-Article-Display/Article/509568/new-biometrics-device-helps-marines-determine-friend-or-foe/">Biometric Enrollment and Screening Device</a>, which enrolled the iris scans, fingerprints and digital face photos of “persons of interest” in Afghanistan. That device was replaced by the <a href="https://www.marines.mil/News/News-Display/Article/1394036/marine-corps-fields-game-changer-biometric-data-collection-system/utm_content/bufferec10a/utm_medium/social/utm_campaign/buffer/?utm_source=plus.google.com">Identity Dominance System-Marine Corps</a> in 2017, which uses a laptop with biometric data-collection sensors, <a href="https://arstechnica.com/information-technology/2015/10/military-looks-to-upgrade-its-tactical-biometrics-with-identity-dominance-system-2/">known as the Secure Electronic Enrollment Kit</a>.</p>
<p>Over the years, to support these military objectives, the Department of Defense aimed to create a biometric database on <a href="https://www.npr.org/2021/01/14/956705029/first-platoon-examines-how-war-on-terror-birthed-pentagons-biometrics-id-system">80% of the Afghan population</a>, approximately 32 million people at today’s population level. It is unclear how close the military came to this goal. </p>
<h2>More data equals more people at risk</h2>
<p>In addition to the use of biometric data by the U.S. and Afghan military for security purposes, the Department of Defense and the Afghan government eventually adopted the technologies for a range of day-to-day governmental uses. These included <a href="https://www.fbi.gov/news/stories/mission-afghanistan-biometrics#:%7E:text=The%20Afghan%20biometrics%20program%20was%20barely%20off%20the,insurgents%20from%20infiltrating%20the%20army%20and%20police%20force.">evidence</a> for criminal prosecution, <a href="https://www.afcea.org/content/us-defense-department-expands-biometrics-technologies-information-sharing">clearing</a> Afghan workers for employment and <a href="https://www.reuters.com/article/us-afghanistan-election-technology/biometric-machines-in-afghan-vote-improve-after-last-years-glitches-idUSKBN1WD0DM">election security</a>. </p>
<p>In addition, the Afghan National ID system and voter registration databases contained sensitive data, including <a href="https://www.politico.com/news/2021/08/24/taliban-afghan-data-target-allies-506638">ethnicity data</a>. The Afghan ID, the <a href="https://www.loc.gov/item/global-legal-monitor/2018-07-19/afghanistan-distribution-of-controversial-electronic-identity-cards-launched/">e-Tazkira</a>, is an <a href="https://www.justice.gov/sites/default/files/eoir/legacy/2014/04/03/afg104742.e.pdf">electronic identification document that includes biometric data</a>, which increases the privacy risks posed by Taliban access to the National ID system.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A computer screen shows an enlarged image of a pair of eyes as an arm holds a boxlike object in front of the eyes of a woman wearing a headscarf and facemask" src="https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/418115/original/file-20210826-15-1mh4vcb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Before falling to the Taliban, the Afghan government made extensive use of biometric security, including scanning the irises of people like this woman who applied for passports.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/AfghanistanAnxiousAfghans/ed3c480aa7be4e11998a0d472e625ad7/photo">AP Photo/Rahmat Gul</a></span>
</figcaption>
</figure>
<p>We do not yet know the extent to which the Taliban have been able to commandeer the biometric data once held by the U.S. military. One report suggested that the Taliban may not be able to access the biometric data collected through HIIDE because they <a href="https://theintercept.com/2021/08/17/afghanistan-taliban-military-biometrics/">lack the technical capacity to do so</a>. However, it’s possible the Taliban could turn to longtime ally Inter-Services Intelligence, Pakistan’s intelligence agency, for help getting at the data. Like many national intelligence services, ISI likely has the necessary technology. </p>
<p>Another report indicated that the Taliban <a href="https://www.reuters.com/article/afghanistan-tech-conflict/afghans-scramble-to-delete-digital-history-evade-biometrics-idUSL8N2PO1FH">have already started to deploy a “biometrics machine”</a> to conduct “house-to-house inspections” to identify former Afghan officials and security forces. This is consistent with prior Afghan news reports that described the Taliban subjecting <a href="https://pajhwok.com/2017/02/14/taliban-subject-passengers-biometric-screening/">bus passengers</a> to biometric screening and using biometric data to <a href="https://tolonews.com/afghanistan/taliban-used-biometric-system-during-kunduz-kidnapping">target</a> Afghan security forces for kidnapping and assassination.</p>
<h2>Concerns about collecting biometric data</h2>
<p>For years following 9/11, researchers, activists and policymakers raised concerns that the mass collection, storage and analysis of sensitive biometric data posed dangers to <a href="https://ssrn.com/abstract=2041946">privacy rights</a> and <a href="https://www.humanrightsfirst.org/resource/steps-protect-your-online-identity-taliban-digital-history-and-evading-biometrics-abuses">human rights</a>. Reports of the Taliban potentially accessing U.S. biometric data stored by the military show that those concerns were not unfounded. They reveal potential cybersecurity vulnerabilities in the U.S. military’s biometric systems. In particular, the situation raises questions about the security of the mobile biometric data-collection devices used in Afghanistan. </p>
<p>The data privacy and cybersecurity concerns surrounding Taliban access to U.S. and former Afghan government databases are a warning for the future. In building biometric-driven warfare technologies and protocols, it appears that the <a href="https://nsarchive.gwu.edu/document/24571-department-defense-directive-8521-01e-department-defense-biometrics-january-13-2016">Department of Defense assumed</a> the Afghan government would have the minimum level of stability needed to protect the data. </p>
<p>The U.S. military should assume that any <a href="https://www.politico.com/news/2021/08/24/taliban-afghan-data-target-allies-506638">sensitive data</a> – biometric and biographical data, wiretap data and communications, geolocation data, government records – could potentially fall into enemy hands. In addition to building robust security to protect against unauthorized access, the Pentagon should use this as an opportunity to question whether it was necessary to collect the biometric data in the first instance.</p>
<p>Understanding the unintended consequences of the U.S. experiment in biometric-driven warfare and biometric cyberintelligence is critically important for determining <a href="https://privacyinternational.org/sites/default/files/2021-06/Biometrics%20for%20Counter-Terrorism-%20Case%20study%20of%20the%20U.S.%20military%20in%20Iraq%20and%20Afghanistan%20-%20Nina%20Toft%20Djanegara%20-%20v6.pdf">whether and how</a> the military should collect biometric information. In the case of Afghanistan, the biometric data that the U.S. military and the Afghan government had been using to track the Taliban could one day soon – if it’s not already – be used by the Taliban to track Afghans who supported the U.S.</p>
<p><em>This article has been updated to include news that biometric data from Afghanistan has been sold on the open market.</em></p>
<p class="fine-print"><em><span>Margaret Hu is affiliated with the Future of Privacy Forum, a non-profit think tank that provides policy guidance on data privacy. Some of Hu's research assistants receive funding from Microsoft Research. She received an honorarium for speaking at an event hosted by Microsoft Research.</span></em></p>The potential failure of the US military to protect information that can identify Afghan citizens raises questions about whether and how biometric data should be collected in war zones.Margaret Hu, Professor of Law, William & Mary Law SchoolLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1664752021-08-23T16:17:20Z2021-08-23T16:17:20ZThe Taliban may have access to the biometric data of civilians who helped the U.S. military<figure><img src="https://images.theconversation.com/files/417265/original/file-20210820-19-qdck6z.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6413%2C4224&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Taliban fighters stand guard at a checkpoint in Kabul, Afghanistan, on Aug. 18, 2021.</span> <span class="attribution"><span class="source">(AP Photo/Rahmat Gul) </span></span></figcaption></figure>
<p>In 2007, the United States military began using a small, handheld device to collect and match the iris, fingerprint and facial scans of over <a href="https://www.nytimes.com/2011/07/14/world/asia/14identity.html">1.5 million Afghans</a> against a database of biometric data. The device, known as Handheld Interagency Identity Detection Equipment (HIIDE), was initially developed by the U.S. government as a means to <a href="https://www.army.mil/article/32609/bats_helps_id_insurgents_hostages">locate insurgents</a> and other wanted individuals. Over time, for the sake of efficiency, the system came to include the data of Afghans assisting the U.S. during the war. </p>
<p>Today, HIIDE provides access to a database of biometric and biographic data, including of those who aided coalition forces. Military equipment and devices — including the collected data — are speculated to have been <a href="https://www.reuters.com/article/afghanistan-tech-conflict/afghans-scramble-to-delete-digital-history-evade-biometrics-idUSL8N2PO1FH">captured by the Taliban</a>, who have taken over Afghanistan. </p>
<p>This development is the latest in many incidents that exemplify why governments and international organizations cannot yet securely collect and use biometric data in conflict zones and in their crisis responses.</p>
<h2>Building biometric databases</h2>
<p>Biometric data, or simply biometrics, are unique physical or behavioural characteristics that can be used to identify a person. These include facial features, voice patterns, fingerprints or iris features. Often described as the most secure method of verifying an individual’s identity, biometric data are being used by <a href="https://www.thalesgroup.com/en/markets/digital-identity-and-security/banking-payment/cards/biometrics-in-banking">governments and organizations</a> to verify and grant citizens and clients access to personal information, finances and accounts. </p>
<p>According to a <a href="https://www.nist.gov/system/files/documents/2021/03/23/ansi-nist_archived_vermury-bat-hiide.pdf">2007 presentation</a> by the <a href="https://www.army.mil/article/21940/biometrics_on_the_ground_and_in_the_dod">U.S. Army’s Biometrics Task Force</a>, HIIDE collected and matched fingerprints, iris images, facial photos and biographical contextual data of persons of interest against an internal database. </p>
<p>In a <a href="https://privacyinternational.org/sites/default/files/2021-06/Biometrics%20for%20Counter-Terrorism-%20Case%20study%20of%20the%20U.S.%20military%20in%20Iraq%20and%20Afghanistan%20-%20Nina%20Toft%20Djanegara%20-%20v6.pdf">May 2021 report</a>, anthropologist Nina Toft Djanegara illustrates how the collection and use of biometrics by the U.S. military in Iraq set the precedent for similar efforts in Afghanistan. There, the “U.S. Army Commander’s Guide to Biometrics in Afghanistan” advised officials to “<a href="https://info.publicintelligence.net/CALL-AfghanBiometrics.pdf">be creative and persistent in their efforts to enrol as many Afghans as possible</a>.” The guide recognized that people may hesitate to provide their personal information and therefore, officials should “frame biometric enrolment as a matter of ‘protecting their people.’”</p>
<p>Inspired by <a href="https://www.dhs.gov/biometrics">the U.S. biometrics system</a>, the Afghan government began work to establish <a href="https://www.nytimes.com/2011/11/20/world/asia/in-afghanistan-big-plans-to-gather-biometric-data.html?pagewanted=all">a national ID card</a>, collecting biometric data from university students, soldiers and passport and driver license applications. </p>
<p>Although it remains uncertain at this time whether the Taliban has captured HIIDE devices and whether it can access the aforementioned biometric information of individuals, the risk to those whose data is stored on the system is high. In 2016 and 2017, the Taliban stopped passenger buses across the country to <a href="https://www.planetbiometrics.com/article-details/i/5529/desc/taliban-subject-passengers-to-biometric-screening/">conduct biometric checks of all passengers to determine whether there were government officials on the bus</a>. These stops sometimes resulted in <a href="https://tolonews.com/afghanistan/taliban-used-biometric-system-during-kunduz-kidnapping">hostage situations and executions</a> carried out by the Taliban.</p>
<h2>Placing people at increased risk</h2>
<p>We are familiar with biometric technology through mobile features like <a href="https://support.apple.com/en-ca/HT201371">Apple’s Touch ID</a> or <a href="https://www.samsung.com/ca/support/mobile-devices/galaxy-phone-fingerprint-sensor/">Samsung’s fingerprint scanner</a>, or by engaging with facial recognition systems while passing through international borders. For many people located in conflict zones or reliant on humanitarian aid in the Middle East, Asia and Africa, biometrics are presented as a secure measure for accessing resources and services to fulfil their most basic needs.</p>
<p>In 2002, the United Nations High Commissioner for Refugees (UNHCR) introduced iris-recognition technology during the repatriation of more than 1.5 million Afghan refugees from Pakistan. The technology was used to identify individuals who sought funds “<a href="https://www.unhcr.org/news/latest/2002/10/3d9c57708/afghan-recyclers-under-scrutiny-new-technology.html">more than once</a>.” If the algorithm matched a new entry to a pre-existing iris record, the claimant was refused aid. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A UNHCR employee hands an Afghan woman sacks of supplies" src="https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/417266/original/file-20210820-15-itiets.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An Afghan internally displaced refugee receives winter necessities from the UNHCR in 2017.</span>
<span class="attribution"><span class="source">(AP Photo/Rahmat Gul)</span></span>
</figcaption>
</figure>
<p>The UNHCR was so confident in the technology that it decided not to allow refugees to dispute the results at all. From March to October 2002, 396,000 people were turned away from aid as alleged false claimants. However, as communications scholar Mirca Madianou argues, <a href="https://www.doi.org/10.1177/1527476419857682">iris recognition has an error rate of two to three per cent</a>, suggesting that roughly 11,800 of those alleged false claimants may have been wrongly denied aid.</p>
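That figure can be checked with a back-of-the-envelope calculation. This is only a sketch: it assumes the cited two-to-three per cent error rate applies uniformly to all 396,000 refusals, which the source does not guarantee.

```python
# Rough estimate of wrongly refused claimants, assuming the cited
# 2-3% iris-recognition error rate applies uniformly to the
# 396,000 people turned away.
refused = 396_000

for error_rate in (0.02, 0.03):
    wrongly_denied = refused * error_rate
    print(f"At {error_rate:.0%} error: ~{wrongly_denied:,.0f} wrongly denied")
```

At a three per cent error rate the estimate is 11,880, consistent with the article’s “roughly 11,800”; at two per cent it would be 7,920.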
<p>Additionally, since 2018, the UNHCR has collected biometric data from Rohingya refugees. However, reports recently emerged that <a href="https://www.hrw.org/news/2021/06/15/un-shared-rohingya-data-without-informed-consent">the UNHCR shared this data with the government of Bangladesh, who subsequently shared it with the Myanmar government to identify individuals for possible repatriation</a> (all without the Rohingya’s consent). The Rohingya, like the Afghan refugees, were instructed to <a href="https://odi.org/en/insights/although-shocking-the-rohingya-biometrics-scandal-is-not-surprising-and-could-have-been-prevented/">register their biometrics to receive and access aid in conflict areas</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/8D7HBDXy9aI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The UNHCR collects the biometric data of refugees in Uganda.</span></figcaption>
</figure>
<p>In 2007, as the U.S. government was introducing HIIDE in Afghanistan, <a href="https://www.wired.com/2007/08/fallujah-pics/">the U.S. Marine Corps was walling off Fallujah in Iraq</a> to supposedly deny insurgents freedom of movement. To get into Fallujah, individuals needed a badge, obtained in exchange for their biometric data. After the U.S. retreated from Iraq in 2020, the database remained in place, including all the biometric data of those who worked on bases. </p>
<h2>Protecting privacy over time</h2>
<p>Registering in a biometric database means trusting not just the current organization requesting the data but any future organization that may come into power or have access to the data. Additionally, the collection and use of biometric data in conflict zones and crisis response present heightened risks for already vulnerable groups. </p>
<p>While collecting biometric data is useful in specific contexts, this must be done carefully. Ensuring the security and privacy of those most at risk of being compromised or made more vulnerable is critical. If that security and privacy cannot be ensured, then biometric data collection and use should not be deployed in conflict zones and crisis response.</p><img src="https://counter.theconversation.com/content/166475/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Lucia Nalbandian previously received funding from the Canada Excellence Research Chair in Migration and Integration.</span></em></p>The U.S. military collected biometric data on Afghan civilians. The information may have fallen into the hands of the Taliban, highlighting why collecting the data is too risky in the first place.Lucia Nalbandian, Researcher, Canada Excellence Research Chair in Migration and Integration, Toronto Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1658612021-08-11T13:49:48Z2021-08-11T13:49:48ZPaying with a palm print? We’re victims of our own psychology in making privacy decisions<figure><img src="https://images.theconversation.com/files/415618/original/file-20210811-19-13frcrt.jpg?ixlib=rb-1.1.0&rect=44%2C98%2C5946%2C3889&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/identification-biometrics-cyber-security-concepts-1636395238">Natasa Adzic/Shutterstock</a></span></figcaption></figure><p>The online retail giant Amazon has moved from our screens to our streets, with the introduction of Amazon grocery and book stores. With this expansion came the introduction of Amazon One – a service that lets customers use their handprint to pay, rather than tapping or swiping a card. According to recent reports, Amazon is now <a href="https://techcrunch.com/2021/08/02/amazon-credit-palm-biometrics/">offering promotional credit</a> to users who enroll.</p>
<p>In the UK we’re quickly becoming used to biometric-based identification. Many of us use a thumbprint or facial recognition to access our smartphones, authorise payments or cross international borders.</p>
<p>Using a biometric (part of your body) rather than a credit card (something you own) to make a purchase might offer a lot more convenience for what feels like very little cost. But there are several complex issues involved in giving up your biometric data to another party, which is why we should be wary of companies such as Amazon incentivising us to use biometrics for everyday transactions.</p>
<p>Amazon’s handprint incentive adds to an ongoing academic and policy debate about when and where to use biometrics to “authenticate” yourself to a system (to prove that you are who you say you are). </p>
<p>On the benefits side, you’re never without your biometric identifier – your face, hand or finger travels with you. Biometrics are pretty hard to steal (modern fingerprint systems typically include a <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8288857&casa_token=4pLXNO5rYzgAAAAA:QAAWsZeISkZffxVgzxvGlkbq95xnXkWA7EOGUrQKduyLd7_X4leii97QIjdYCkmUHdkMH5ZGXA&tag=1">“liveness” test</a> so that no attacker would be tempted to chop a finger off or make latex copies). They’re also easy to use – gone are the problems of remembering multiple passwords to access different systems and services. </p>
<p>What about the costs? You don’t have many hands – and you can’t get a new one – so one biometric will have to serve as an entry point to multiple systems. That becomes a real problem if a biometric is hacked. </p>
<p>Biometrics can also be discriminatory. Many facial recognition systems fail ethnic minorities (because the systems have been trained with <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9209125&casa_token=UFqEAflmuV0AAAAA:pgFUQb0n1uvaGctCfNEfZla50Z9JpdfKtE4wziZ_elJtJs4HgVHoxb1L7SgKPgh5yWBTt0dhSg">predominantly white faces</a>). Fingerprint systems may <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8509614">fail older adults</a>, who have thinner skin and less marked whorls, and all systems would fail those with certain disabilities – arthritis, for example, could make it difficult to <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5721811&casa_token=8mm6kbiJvFYAAAAA:XIibuZWoPKxsE41TU2uic6N1e_qK8M-xsFaJujnJsZ0RYgoT9mjTLPMxnYPzYVdjCVD2V_wTvw">yield a palm print</a>.</p>
<h2>Who should we trust?</h2>
<p>A key issue for biometric “identity providers” is whether they can be trusted. This means that they will keep the data secure and be “proportional” in their use of biometrics as a means of identification. In other words, they will use biometrics when necessary – say, for security purposes – but not simply because it seems convenient. </p>
<p>The UK government is currently <a href="https://www.gov.uk/government/publications/uk-digital-identity-attributes-trust-framework-updated-version">consulting on a new</a> digital identity and attributes trust framework where firms can be certified to offer biometric and other forms of identity management services. </p>
<p>As the number of daily digital transactions we make grows, so does the need for simple, seamless authentication, so it is not surprising that Amazon might want to become a major player in this space. Offering to pay for you to use a biometric sign-in is a quick means of getting you to choose Amazon as your trusted identity provider … but are you sure you want to do that?</p>
<h2>Privacy paradox</h2>
<p>Unfortunately we’re victims of our own psychology in this process. We will often say we value our privacy and want to protect our data, but then, with the promise of a quick reward, we will simply click on that link, accept those cookies, login via Facebook, offer up that fingerprint and buy into that shiny new thing.</p>
<p>Researchers have a name for this: the <a href="https://www.sciencedirect.com/science/article/pii/S0167404815001017?casa_token=G49KiKJxhI4AAAAA:LBLbp3PbJ2OhePoYcucU4HamwFvVeOtwXswO6AkZZf91VK_6e01XH99JksNpv9h3bDzA3LMr2Q">privacy paradox</a>. In survey after survey, people will argue that they care deeply about privacy, data protection and digital security, but these attitudes are not supported in their behaviour. Several explanations exist for this, with some researchers arguing that people employ a privacy calculus to assess the costs and benefits of disclosing particular information. </p>
<p>The problem, as always, is that certain types of cognitive or social bias begin to creep into this calculus. We know, for example, that people will underestimate the risks associated with things they like and overestimate the risks associated with things they dislike (something known as the <a href="https://www.sciencedirect.com/science/article/pii/S0377221705003577?casa_token=YTy8mFBCKVcAAAAA:EumjAvyFQYVRJX7zjUQS-VJzomWVQw9n-e6vADtZHuyUvI2JHNkLEI8IrLmQrO-iIy1WPFTfhQ">“affect heuristic”</a>). </p>
<p>As a consequence, people tend to share more personal data than they should, and the amount of such data in circulation grows exponentially. The same is true for biometrics. People will say that only trusted organisations should hold biometric data, but then go on to give their biometrics up with a small incentive. In <a href="https://dl.acm.org/doi/pdf/10.1145/2778972?casa_token=wtG1zO9QeAQAAAAA:htr-861bqneZm40HDDSo8ezNGQc3xqGRC6-RDH8MEiDFajPCYSM3ba7NRS5uuzKzQU0hMfNNJwFg">my own research</a>, I’ve linked this behavioural paradox to the fact that security and privacy are things we need to do, but they don’t give us any joy, so our motivation to act is low.</p>
<p>Any warnings about the longer-term risks of taking the Amazon shilling might be futile, but I leave you with this: your biometrics don’t just confirm your identity, they are more revealing than that. They say something very clearly about ethnicity and age, but may also unknowingly reveal information about disability or even mood (in the example of, say, a voice biometric). </p>
<p>Biometric analysis can be done without permission (state regulations permitting) and, in some cases, <a href="https://www.jtl.columbia.edu/bulletin-blog/a-face-in-the-crowd-facial-recognition-technology-and-the-value-of-anonymity">at scale</a>. China leads the way in the use of face recognition to identify individuals in a crowd, <a href="https://www.reuters.com/article/us-health-coronavirus-facial-recognition-idUSKBN20W0WL">even when wearing masks</a>. Exchanging a palm print for the equivalent of a free book may seem like a vastly different thing, but it is the thin end of the biometric wedge.</p><img src="https://counter.theconversation.com/content/165861/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Pam Briggs receives funding from the Economic and Social Research Council (ESRC).</span></em></p>Amazon is offering an incentive to pay with our palm prints. Why is it so difficult to make decisions about biometric privacy?Pam Briggs, Research Chair in Applied Psychology, Northumbria University, NewcastleLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1612572021-05-31T16:41:53Z2021-05-31T16:41:53ZThe United Nations needs to start regulating the ‘Wild West’ of artificial intelligence<figure><img src="https://images.theconversation.com/files/402388/original/file-20210524-21-1y5xpmu.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6270%2C3750&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Global governance of artificial intelligence is necessary to regulate AI industries.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>The European Commission recently published a <a href="https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-laying-down-harmonised-rules-artificial-intelligence">proposal for a regulation on artificial intelligence (AI)</a>. This is the first document of its kind to attempt to tame the multi-tentacled beast that is artificial intelligence. </p>
<p>“<a href="https://fortune.com/2021/04/27/the-sun-is-setting-on-a-i-s-wild-west/">The sun is starting to set on the Wild West days of artificial intelligence</a>,” writes Jeremy Kahn. He may have a point.</p>
<p>When this regulation comes into effect, it will change the way that we conduct AI research and development. In the last few years of AI, there were few rules or regulations: if you could think it, you could build it. That is no longer the case, at least in the European Union. </p>
<p>There is, however, a notable exception in the regulation: it does not apply to international organizations like the United Nations.</p>
<p>Naturally, the European Union does not have jurisdiction over the United Nations, which is <a href="https://www.un.org/en/our-work/uphold-international-law">governed by international law</a>. The exclusion therefore does not come as a surprise, but it does point to a gap in AI regulation. The United Nations needs its own regulation for artificial intelligence, and urgently so.</p>
<h2>AI in the United Nations</h2>
<p>Artificial intelligence technologies have been used increasingly by the United Nations. Several research and development labs, including the <a href="https://www.unglobalpulse.org/">Global Pulse Lab</a>, <a href="https://jetson.unhcr.org/">the Jetson initiative by the UN High Commissioner for Refugees </a>, <a href="https://www.unicef.org/innovation/topics/innovation-labs">UNICEF’s Innovation Labs</a> and <a href="https://centre.humdata.org/">the Centre for Humanitarian Data</a> have focused their work on developing artificial intelligence solutions that would support the UN’s mission, notably in terms of anticipating and responding to humanitarian crises.</p>
<p>United Nations agencies have also used biometric identification to manage humanitarian logistics and refugee claims. The UNHCR developed a biometrics database which <a href="https://www.unhcr.org/blogs/data-millions-refugees-securely-hosted-primes/">contained the information of 7.1 million refugees</a>. The World Food Programme has also used biometric identification in aid distribution to refugees, <a href="https://www.thenewhumanitarian.org/opinion/2019/07/17/head-head-biometrics-and-aid">coming under some criticism in 2019 for its use of this technology in Yemen</a>.</p>
<p>In parallel, the United Nations has partnered with private companies that provide analytical services. A notable example is the World Food Programme, which in 2019 signed a <a href="https://slate.com/technology/2019/02/palantir-un-world-food-programme-data-humanitarians.html">contract worth US$45 million with Palantir</a>, an American firm specializing in data collection and artificial intelligence modelling. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/FDOptbuz_fg?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A UNESCO video on its applications of AI.</span></figcaption>
</figure>
<h2>No oversight, regulation</h2>
<p>In 2014, U.S. Immigration and Customs Enforcement (ICE) awarded a US$20-billion contract to Palantir to <a href="https://theintercept.com/2017/03/02/palantir-provides-the-engine-for-donald-trumps-deportation-machine/">track undocumented immigrants in the U.S.</a>, especially family members of children who had crossed the border alone. Several human rights watchdogs, <a href="https://www.amnesty.org/en/documents/amr51/3124/2020/en/">including Amnesty International</a>, have raised concerns about Palantir for human rights violations.</p>
<p>Like most AI initiatives developed in recent years, this work has happened largely without regulatory oversight. There have been many attempts to set up ethical modes of operation, such as the Office for the Co-ordination of Humanitarian Affairs’ <a href="https://data.humdata.org/dataset/2048a947-5714-4220-905b-e662cbcd14c8/resource/76e488d9-b69d-41bd-927c-116d633bac7b/download/peer-review-framework-2020.pdf">Peer Review Framework</a>, which sets out a method for overseeing the technical development and implementation of AI models. </p>
<p>In the absence of regulation, however, tools such as these, without legal backing, are merely best practices with no means of enforcement.</p>
<p>In the European Commission’s AI regulation proposal, developers of high-risk systems must go through an authorization process before going to market, just like a new drug or car. They are required to put together a detailed package before the AI is available for use, involving a description of the models and data used, along with an explanation of how accuracy, privacy and discriminatory impacts will be addressed.</p>
<p>The AI applications in question include biometric identification, categorization and evaluation of the eligibility of people for public assistance benefits and services. They may also be used to dispatch emergency first-response services — all of these are current uses of AI by the United Nations.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Yemeni men carrying sacks of food for distribution." src="https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=407&fit=crop&dpr=1 600w, https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=407&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=407&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=511&fit=crop&dpr=1 754w, https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=511&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/403244/original/file-20210527-23-1t1mzqk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=511&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The UN World Food Program distributed food in Yemen, as shown in this photo from Sept. 2018. The United Nations later faced criticism for its use of biometrics in aid distribution.</span>
<span class="attribution"><span class="source">(AP Photo/Hammadi Issa)</span></span>
</figcaption>
</figure>
<h2>Building trust</h2>
<p>Conversely, the lack of regulation at the United Nations can be considered a challenge for agencies seeking to adopt more effective and novel technologies. As a result, many systems seem to have been developed and later abandoned without being integrated into actual decision-making systems.</p>
<p>An example of this is the Jetson tool, which was developed by UNHCR to predict the arrival of internally displaced persons to refugee camps in Somalia. The tool <a href="https://jetson.unhcr.org/">does not appear to have been updated</a> since 2019, and seems unlikely to transition into the humanitarian organization’s operations. Unless, that is, it can be properly certified by a new regulatory system. </p>
<p>Trust in AI is difficult to obtain, particularly in United Nations work, which is highly political and affects very vulnerable populations. The onus has largely been on data scientists to develop the credibility of their tools. </p>
<p>A regulatory framework like the one proposed by the European Commission would take the pressure off data scientists in the humanitarian sector to individually justify their activities. Instead, agencies or research labs who wanted to develop an AI solution would work within a regulated system with built-in accountability. This would produce more effective, safer and more just applications and uses of AI technology.</p><img src="https://counter.theconversation.com/content/161257/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Eleonore Fournier-Tombs has recently consulted for the United Nations and the World Bank. </span></em></p>The new EU regulation is about to change the way we do artificial intelligence. The United Nations needs to follow suit.Eleonore Fournier-Tombs, Senior Researcher, Data and Technology, Institute in Macau (UNU-Macau), United Nations UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1594052021-05-19T15:12:27Z2021-05-19T15:12:27ZWhy we need to seriously reconsider COVID-19 vaccination passports<figure><img src="https://images.theconversation.com/files/401456/original/file-20210519-21-1olrz49.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5400%2C3023&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Vaccine passports may soon be required for travelling amid the COVID-19 pandemic. Like biometrics, they'll likely become a permanent part of our daily lives — and there's barely been any debate about them.</span> <span class="attribution"><span class="source"> (AP Photo/Rick Bowmer)</span></span></figcaption></figure><p>In 2003, Canada’s immigration and citizenship minister, Denis Coderre, declared that “<a href="https://www.yumpu.com/en/document/read/48843828/biometrics-public-policy-forum">the biometrics train has left the station</a>,” making reference to <a href="https://searchsecurity.techtarget.com/definition/biometrics">new technologies</a> like facial recognition and retina scans. </p>
<p>Coderre’s statement demonstrated the perceived inevitability, along with the innocent embrace, of new <a href="https://searchsecurity.techtarget.com/definition/biometrics">biometric technologies</a>.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Coderre gestures while speaking." src="https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=848&fit=crop&dpr=1 600w, https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=848&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=848&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1066&fit=crop&dpr=1 754w, https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1066&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/401468/original/file-20210519-17-1y8hpoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1066&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Denis Coderre at a news conference in Ottawa in 2002.</span>
<span class="attribution"><span class="source">(CP PHOTO/Fred Chartrand)</span></span>
</figcaption>
</figure>
<p>It’s eerily similar <a href="https://www.macleans.ca/opinion/vaccine-passports-are-inevitable-and-canada-should-prepare/">to contemporary statements about vaccine passports</a>. And, much like the rollout of biometrics, the promise of these technologies has outpaced the public’s appetite for debate. So what’s changed in the past 20 years, and why should we care?</p>
<p>Proposed vaccine passports are moving forward with little scrutiny due to their promise to solve many <a href="https://globalnews.ca/news/7850798/covid-vaccine-passport-canadians-support-poll/">travel-related challenges</a> during and after the COVID-19 pandemic. The emergence of biometrics and surveillance in <a href="https://www.routledge.com/Security-Risk-and-the-Biometric-State-Governing-Borders-and-Bodies/Muller/p/book/9780415484404">post-9/11 border security</a> tells a similar story. </p>
<p>Currently, vaccine passports <a href="https://www.ctvnews.ca/politics/canada-will-align-policy-on-vaccine-passports-with-international-allies-trudeau-1.5413894">are presented as a relatively simple technological solution to our current travel woes</a>. However, like biometrics, vaccine passports will likely become permanent parts of our daily lives. That means meaningful public debate and discussion about their merits and problems is essential. </p>
<h2>‘Function creep’</h2>
<p>There is growing scholarly trepidation with “<a href="https://iep.utm.edu/surv-eth/#H10">function creep</a>” — the way technologies are gradually used for much more than their originally intended purposes. </p>
<p>These concerns dovetail with related fears about the rapid erosion of privacy. They should not be ignored, nor should they be considered trade-offs for political promises of safer and more efficient travel. Regardless of how effective vaccine passports may be, concerns about their use demand public conversation. </p>
<p><a href="https://dx.doi.org/10.2139/ssrn.677563">Intensified security at borders and in airports</a> was believed to be a necessary evil of the post-9/11 world. Biometrics and surveillance provided a “sorting” function that improved travellers’ experiences. They promised to streamline interactions with reinforced border security. This positive dividend overlooked the wider social sorting functions of these technologies. </p>
<p>Largely ignored was the way travellers and populations were categorized along lines of race, gender and class. Similarly, in the face of nationwide lockdowns, the promise of a return to safe and efficient travel quiets criticism. </p>
<h2>Personal privacy</h2>
<p>Such technologies also challenge how we negotiate personal privacy. They contribute to <a href="https://www.publichealthontario.ca/-/media/documents/ncov/phm/2021/03/covid-19-environmental-scan-immunity-passports.pdf?la=en">enhanced law enforcement powers</a>, and <a href="https://doi.org/10.1016/S0969-4765(20)30009-6">are increasingly presented</a> as acceptable trade-offs for <a href="https://www.kotatv.com/2021/04/27/vaccine-passports-lets-travel/">rediscovered mobility</a>. </p>
<p>The pandemic, together with related government responses, have exposed the inequities in our society. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/with-covid-19s-third-wave-were-far-from-all-in-this-together-159178">With COVID-19's third wave, we're far from 'all in this together'</a>
</strong>
</em>
</p>
<hr>
<p>As a result, we should be troubled by the open embrace of vaccine passports. The lessons of the past two decades of surveillance in society have shown us that identification technologies such as biometrics have consequences that <a href="https://thetyee.ca/News/2021/04/28/RCMP-Secret-Facial-Recognition-Tool-Looked-Matches-Terrorists/">go well beyond their intended use</a>.</p>
<p>Contemporary vaccine passports will bear little resemblance to the handwritten vaccination cards of the past. Instead, they will likely reside on our smartphones. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A person holds a phone that says COVID-19 Digital Immune Passport." src="https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/398686/original/file-20210504-15-1fqgcki.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Vaccine passports will probably live on our smartphones.</span>
<span class="attribution"><span class="source">Wuestenigel/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<h2>Responsibility rests with us</h2>
<p>That means the responsibility for them rests squarely with the citizen. Decidedly different from the responses to security challenges after 9/11, vaccine passports are not products of large transnational corporations.</p>
<p>Instead, regular citizens with programming skills who engage in “participatory democracy” on GitHub, an internet platform that hosts collaborative software development, <a href="https://github.com/vaccine-passport/docs">are proposing solutions</a>. In the months following the first media mentions of vaccine passports, more than 40 related projects <a href="https://github.com/search?q=covid+passport">were launched on GitHub</a>. </p>
<p>The majority of them are apps that use a smartphone to collect sensitive data such as name, date of birth, vaccine brand, dosage and mailing address. As one volunteer <a href="https://github.com/alexandrutatarciuc/Covid19PassportApp">programmer writes</a>: “I decided to stop enduring the effects of the pandemic and start to act.” </p>
<p>A trend is emerging: programming-savvy citizens who code for corporations by day now do so for public safety by night. The political significance of this cannot be overstated.</p>
<p>The next generation of entrepreneurs are technologically savvy. These citizen-programmers imagine a future where safety, mobility, freedom and the dream of the return to pre-pandemic normalcy may intersect. But this intersection will be on the smartphone. </p>
<h2>Post 9/11 consequences</h2>
<p>The biometrics and surveillance rolled out in response to the security challenges of the post-9/11 world <a href="https://hedgehogreview.com/issues/fear-itself/articles/fear-surveillance-and-consumption">had widespread consequences</a>. Similarly, leveraging smartphones as the vehicle for vaccine passports will be fraught with rights and civil liberties violations. </p>
<figure class="align-center ">
<img alt="A police officer's hand rests on his gun as politicians speak in the background." src="https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=425&fit=crop&dpr=1 600w, https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=425&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=425&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=534&fit=crop&dpr=1 754w, https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=534&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/401469/original/file-20210519-23-hr6ivt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=534&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A police officer stands by as federal government officials answer questions during a news conference on airport security a month after the 9/11 terrorist attacks in 2001.</span>
<span class="attribution"><span class="source">(CP PHOTO/Aaron Harris)</span></span>
</figcaption>
</figure>
<p><a href="https://global.oup.com/academic/product/surveillance-studies-9780190297817?cc=ca&lang=en&">Research over the past two decades</a> into surveillance is clear — it threatens individual freedoms and amplifies social differences. Social sorting technologies like biometrics not only verify that “you are who you say you are,” they also assess risk and <a href="https://www.surveillanceincanada.org/">categorize each of us in the process</a>. </p>
<p>Though proposed as a way to make travel more secure and efficient, vaccine passports will have much broader consequences. Surveillance and biometrics assign worth and opportunity. They also assign differential access to goods, services and places. </p>
<p>Vaccine passports provide the opportunity to add health data to our mobile personal data devices. While the promise of improved pandemic travel will likely be kept, there will also be a series of policy challenges, privacy concerns and troubling consequences of social sorting.</p>
<h2>Real debate is needed</h2>
<p>The absence of meaningful debate about turning to consumer technology as a vehicle for vaccine passports is serious. In the early 2000s, <a href="https://www.ifsecglobal.com/access-control/truth-is-the-key-addressing-criticism-of-biometrics/">questioning the reliance on biometrics and surveillance</a> was often regarded as suspicious, speculative and even anti-modern.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/before-we-introduce-vaccine-passports-we-need-to-know-how-theyll-be-used-156197">Before we introduce vaccine passports we need to know how they'll be used</a>
</strong>
</em>
</p>
<hr>
<p>Today, public criticism and deliberation about vaccine passports is also overlooked and even discredited. Concerns over vaccine passports <a href="https://www.latimes.com/california/story/2021-05-12/how-digital-vaccine-passports-became-a-rally-cry-for-anti-mask-movement">are sometimes conflated with anti-mask and anti-vaccination sentiments</a>. </p>
<p>Safe and efficient travel is the coveted prize. However, failure to have full public conversations about the long-term societal impact of vaccine passports will leave our privacy and civil liberties exposed.</p><img src="https://counter.theconversation.com/content/159405/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Benjamin Muller receives funding from King's University College Internal Research Grant and Social Science and Humanities Research Council of Canada. </span></em></p><p class="fine-print"><em><span>Tommy Cooke does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>COVID-19 vaccine passports are being presented as a relatively simple technological solution to our current travel woes. But meaningful public debate about their merits and problems is essential.Tommy Cooke, SSHRC Postdoctoral Researcher, Digital Privacy, Queen's University, OntarioBenjamin Muller, Associate Professor in Political Science and Sociology, Western UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1577892021-03-29T14:16:00Z2021-03-29T14:16:00ZAI-driven CCTV upgrades are coming to the ‘world’s most watched’ streets – will they make Britain safer?<figure><img src="https://images.theconversation.com/files/391440/original/file-20210324-21-1n7zubv.jpeg?ixlib=rb-1.1.0&rect=113%2C0%2C881%2C625&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">CCTV technology has evolved in the decades since it was first introduced.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/old-rough-dirty-security-video-cameras-1456529840">Orso/Shutterstock</a></span></figcaption></figure><p>Renewed concern about the safety of public streets, <a href="https://theconversation.com/survey-shows-32-of-british-women-dont-feel-safe-walking-alone-at-night-compared-to-just-13-of-men-157446">especially for women</a>, has prompted the UK government to announce the doubling of a “Safer Streets” fund <a 
href="https://metro.co.uk/2021/03/15/ministers-vow-45m-fund-for-safer-streets-including-lighting-and-cctv-14249940/">to £45 million</a>, with planned measures including <a href="https://www.bbc.co.uk/news/uk-56410943">more CCTV</a> in public places such as parks. </p>
<p>This would be to add to a street surveillance ecosystem that is already extensive in the UK – often referred to as the <a href="https://www.bloomsbury.com/uk/the-maximum-surveillance-society-9781847881069/">most surveilled</a> nation on Earth. The first wave of surveillance cameras went in <a href="https://dspace.stir.ac.uk/bitstream/1893/12083/1/Webster_2004_The_Diffusion_Regulation_and_Governance%2520of_CCTV.pdf">30 years ago</a>, and by 2013 an estimated <a href="https://www.telegraph.co.uk/technology/10172298/One-surveillance-camera-for-every-11-people-in-Britain-says-CCTV-survey.html">5.9 million units</a> were watching UK streets. That figure is likely far higher today, driven in part by the new availability of compact cameras like <a href="https://wiredsmart.io/dash-cams/evolution-and-history/">dashcams</a>, <a href="https://lerablog.org/technology/electronics/the-rise-of-the-body-worn-camera/">bodycams</a> and <a href="https://nsjonline.com/article/2019/09/doorbell-cameras-and-privacy-who-is-watching/">doorbell cameras</a>. </p>
<p>But the overall picture of the UK’s street surveillance ecosystem is muddled, with some cameras too old to produce quality images, others aimed at entryways rather than streets, and some smaller cameras, like those attached to bodies and vehicles, not suited to general public safety. </p>
<p>Enlarging that ecosystem still further may be a seductive policy solution to street safety concerns, but there’s limited evidence of their effectiveness at <a href="https://library.college.police.uk/docs/what-works/What-works-briefing-effects-of-CCTV-2013.pdf">reducing</a> and <a href="https://library.college.police.uk/HeritageScripts/Hapi.dll/search2?searchterm=c46511&Fields=Z&Media=%23&Bool=AND&searchterm=c46511&Fields=Z&Media=%23&Bool=AND">deterring</a> crime. And, as women’s groups have recently pointed out, the focus on street surveillance neglects the <a href="https://www.nytimes.com/2021/03/21/world/europe/sarah-everard-police-uk.html">wider societal change</a> required in order to make women feel safer in public places.</p>
<h2>CCTV ecosystem</h2>
<p>Most CCTV cameras in the UK are actually privately owned – either put up by businesses looking to protect their premises, or attached to private residences for security. According to some estimates, just <a href="https://www.infologue.com/industry/just-1-in-70-cctv-cameras-are-state-owned-bsia-survey-reveals/">1 in 70</a> CCTV cameras are state-owned, and many of these are placed in and around public buildings.</p>
<p>This has resulted in a <a href="https://uk.rs-online.com/web/generalDisplay.html?id=i%2Fcctv-hotspots-uk">disparate and fragmented</a> CCTV ecosystem, with cameras concentrated in commercial districts rather than in residential neighbourhoods. This disparity has led to concerns that cameras may serve to <a href="https://journals.sagepub.com/doi/abs/10.1177/1748895809102554">displace crime</a> from central, surveilled areas into residential ones.</p>
<p>Even in commercial areas, many cameras were initially installed to monitor entryways into buildings – not to enhance street safety – and the angle at which they’re positioned reflects this function. Meanwhile, a certain proportion of cameras are broken and out of use – some are too old to offer <a href="https://www.telegraph.co.uk/news/politics/6088086/Worthless-CCTV-camera-footage-is-not-good-enough-to-fight-crime-leading-QC-warns.html">reliable footage</a> in criminal prosecutions, while others have been switched off entirely due to <a href="https://www.theguardian.com/world/2016/nov/16/cctv-cameras-being-switched-off-to-save-money-watchdog-warns">funding issues</a>. There are concerns that such cameras merely offer <a href="https://www.schneier.com/blog/archives/2009/11/beyond_security.html">the illusion</a> of safety and security, without the capacity to record crime on our streets.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/survey-shows-32-of-british-women-dont-feel-safe-walking-alone-at-night-compared-to-just-13-of-men-157446">Survey shows 32% of British women don't feel safe walking alone at night – compared to just 13% of men</a>
</strong>
</em>
</p>
<hr>
<p>Public support for CCTV, which is still relatively <a href="https://securitynewsdesk.com/cctv-enjoys-86-public-support-call-better-monitoring-information/">strong</a>, is based on the premise that <a href="https://www.scotsman.com/news/cctv-does-it-actually-work-2507086">cameras work</a> – and that they can be used in the public interest. While there are millions of cameras watching UK streets, they’re only watching select parts of them in what is a fragmented patchwork that may have little effect on street safety.</p>
<h2>New cameras</h2>
<p>But the CCTV ecosystem is also evolving. Old cameras have been replaced by new digital ones with significantly improved <a href="https://theconversation.com/surveillance-cameras-will-soon-be-unrecognisable-time-for-an-urgent-public-conversation-118931">surveillance capabilities</a>. Sharper recordings now offer clearer pictures that could be used as trustworthy evidence in legal proceedings. And the growing profusion of internet-connected “smart cameras” offers a new way to analyse footage via artificial intelligence (AI), both in real time and via recordings after incidents have occurred.</p>
<figure class="align-center ">
<img alt="The shoulder of a police officer in the UK with a bodycam on it" src="https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/392262/original/file-20210329-21-sa5uv9.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">In the UK, bodycams that also record audio are worn by some police officers.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/london-uk-26th-june-2019-body-1456475102">John Gomez/Shutterstock</a></span>
</figcaption>
</figure>
<p>Such AI, already in use across some CCTV ecosystems, can automatically analyse unfolding situations, potentially enhancing public safety. These systems are proving useful for identifying objects on train tracks, monitoring crowd size, <a href="https://www.wired.com/insights/2014/08/the-new-eyes-of-surveillance-artificial-intelligence-and-humanizing-technology/">recognising unusual behaviour</a>, and identifying known suspects in a dragnet of recordings from a certain area. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/surveillance-cameras-will-soon-be-unrecognisable-time-for-an-urgent-public-conversation-118931">Surveillance cameras will soon be unrecognisable – time for an urgent public conversation</a>
</strong>
</em>
</p>
<hr>
<p>But new, AI-driven surveillance technology is <a href="https://videosurveillance.blog.gov.uk/2019/03/21/the-debate-on-automatic-facial-recognition-continues/">fiercely contested</a>. For instance, facial recognition software, which is seen as desirable for policing, has been criticised for being unreliable and <a href="https://www.nature.com/articles/d41586-020-03186-4">racially biased</a>. Police access to personal surveillance footage, like that from a doorbell camera which records everyone who visits your home, could also become a contentious privacy issue in the near future.</p>
<h2>Governing CCTV</h2>
<p>New technology within and behind cameras has the potential to enhance the reliability of street surveillance. If it’s leveraged correctly, it could deter crime and facilitate the successful prosecution of criminals caught on CCTV. But to operate effectively and legally, this new ecosystem will require <a href="https://www.gov.uk/government/publications/domestic-cctv-using-cctv-systems-on-your-property/domestic-cctv-using-cctv-systems-on-your-property">new forms of governance</a> and coordination that weren’t needed a decade ago.</p>
<p>Earlier this month, the UK government appointed a new <a href="https://www.gov.uk/government/news/new-biometrics-and-surveillance-camera-commissioner-appointed">Surveillance Camera Commissioner</a>, who has been tasked with governing the fast-moving world of surveillance cameras. Noticeably, this office has been combined with that of the <a href="https://www.gov.uk/government/organisations/biometrics-commissioner">Biometrics Commissioner</a> – a possible indicator of the direction of travel for the UK’s CCTV ecosystem, which may be set to merge with biometrics and advanced surveillance software.</p>
<p>Still, the UK’s Safer Streets initiative does also look beyond CCTV: funding improved street lighting and increased street patrols. This points to a recognition that CCTV technology is no silver bullet solution for public safety issues – even within the limited scope of <a href="https://www.architectsjournal.co.uk/news/opinion/better-street-lighting-alone-wont-make-our-cities-safer-for-women?tkn=1">urban design</a>.</p>
<p>In this context, and given existing flaws in the UK’s patchy CCTV ecosystem, faith in street surveillance as an effective public safety provision may be misplaced. Real street safety, extending far beyond the reach of CCTV cameras, won’t be achieved by technology – it’ll be achieved by social change.</p><img src="https://counter.theconversation.com/content/157789/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>William Webster is an unpaid advisor to the Surveillance Camera Commissioner and is responsible for leading the Civil Engagement element of the National Surveillance Camera Strategy.</span></em></p>The CCTV ecosystem is evolving – but it’s still a sparse patchwork with limited efficacy in reducing or prosecuting crime.William Webster, Professor and Director, Centre for Research into Information, Surveillance and Privacy, University of StirlingLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1444172020-09-06T20:13:11Z2020-09-06T20:13:11ZFace masks and facial recognition will both be common in the future. How will they co-exist?<figure><img src="https://images.theconversation.com/files/356521/original/file-20200904-20-1h9t8oj.jpg?ixlib=rb-1.1.0&rect=22%2C14%2C1894%2C833&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Pixabay</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>It’s surprising how quickly public opinion can change. Winding the clocks back 12 months, many of us would have looked at a masked individual in public with suspicion. </p>
<p>Now, some countries have enshrined face mask use <a href="https://www.aljazeera.com/news/2020/04/countries-wearing-face-masks-compulsory-200423094510867.html">in law</a>. They’ve also been made <a href="https://www.health.gov.au/news/health-alerts/novel-coronavirus-2019-ncov-health-alert/how-to-protect-yourself-and-others-from-coronavirus-covid-19/masks">compulsory in Victoria</a> and are recommended in several other states.</p>
<p>One consequence of this is that facial recognition systems in place for security and crime prevention may no longer be able to fulfil their purpose. In Australia, most agencies are silent about the use of facial recognition. </p>
<p>But documents leaked earlier this year revealed <a href="https://www.buzzfeednews.com/article/ryanmac/clearview-ai-fbi-ice-global-law-enforcement">Australian Federal Police</a> and state police in <a href="https://theconversation.com/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability-132667">Queensland, Victoria and South Australia</a> all use Clearview AI, a commercial facial recognition platform. New South Wales police <a href="https://www.abc.net.au/news/2020-01-23/australian-founder-of-clearview-facial-recognition-interview/11887112">also admitted</a> using a biometrics tool called PhotoTrac.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/your-face-is-part-of-australias-national-security-weapon-should-you-be-concerned-47364">Your face is part of Australia's 'national security weapon': should you be concerned?</a>
</strong>
</em>
</p>
<hr>
<h2>What is facial recognition?</h2>
<p><a href="https://us.norton.com/internetsecurity-iot-how-facial-recognition-software-works.html">Facial recognition</a> involves using computer algorithms to detect human faces in images or videos, and then measuring specific facial characteristics. These can include the distance between the eyes, and the relative positions of the nose, chin and mouth. </p>
<p>This information is combined to create a <a href="https://www.eff.org/pages/face-recognition">facial signature, or profile</a>. When used for individual recognition – such as to unlock your phone – an image from the camera is compared to a recorded profile. This process of facial “verification” is relatively simple.</p>
<p>However, when facial recognition is used to identify faces in a crowd, it requires a significant database of profiles against which to compare the main image. </p>
<p>These profiles can be legally collected by enrolling large numbers of users <a href="https://immi.homeaffairs.gov.au/help-support/meeting-our-requirements/biometrics">into systems</a>. But they’re sometimes collected through <a href="https://news.miami.edu/stories/2020/02/new-facial-recognition-software-scrapes-inventory-from-social-media.html">covert means</a>.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=500&fit=crop&dpr=1 600w, https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=500&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=500&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=628&fit=crop&dpr=1 754w, https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=628&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/353221/original/file-20200817-24-10qf4oa.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=628&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Facial ‘verification’ (the method used to unlock smartphones) compares the main image with a single pre-saved facial signature. Facial ‘identification’ requires examining the image against an entire database of facial signatures.</span>
<span class="attribution"><span class="source">teguhjatipras/pixabay</span></span>
</figcaption>
</figure>
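The difference between 1:1 verification and 1:N identification can be sketched in a few lines of Python. This is a toy illustration only: the “facial signatures” here are made-up two-dimensional vectors, and the threshold and profile names are invented for the example — real systems compare learned embeddings with hundreds of dimensions.

```python
import math

def distance(sig_a, sig_b):
    """Euclidean distance between two facial-signature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def verify(probe, enrolled, threshold=0.6):
    """1:1 verification (e.g. unlocking a phone): compare against one stored profile."""
    return distance(probe, enrolled) < threshold

def identify(probe, database, threshold=0.6):
    """1:N identification: search a whole database of profiles for the closest match."""
    best_id = min(database, key=lambda pid: distance(probe, database[pid]))
    return best_id if distance(probe, database[best_id]) < threshold else None

profiles = {"person_a": [0.10, 0.20], "person_b": [0.90, 0.80]}
print(identify([0.12, 0.18], profiles))  # → person_a
```

Note how identification must compare the probe against every enrolled profile, which is why it needs the large databases described above, and why its error rates scale with database size.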
<h2>The problem with face masks</h2>
<p>As facial signatures are based on mathematical models of the relative positions of facial features, anything that reduces the visibility of key characteristics (such as the nose, mouth and chin) interferes with facial recognition.</p>
<p>There are already many ways to <a href="https://www.businessinsider.com.au/clothes-accessories-that-outsmart-facial-recognition-tech-2019-10">evade or interfere</a> with facial recognition technologies. Some of these evolved from techniques designed to evade number plate recognition systems.</p>
<p>Although the coronavirus pandemic has escalated concerns around the evasion of facial recognition systems, <a href="https://theintercept.com/2020/07/16/face-masks-facial-recognition-dhs-blueleaks/">leaked US documents</a> show these <a href="https://www.documentcloud.org/documents/6989376-U-FOUO-in-Violent-Adversaries-Likely-to-Use.html">discussions</a> taking place back in 2018 and 2019, too. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Xpu2MSmZkmU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">This clip shows how fashion designers are outsmarting facial recognition surveillance / YouTube.</span></figcaption>
</figure>
<p>And while the debate on the <a href="https://theconversation.com/large-scale-facial-recognition-is-incompatible-with-a-free-society-126282">use</a> and <a href="https://theconversation.com/facial-recognition-technology-is-expanding-rapidly-across-australia-are-our-laws-keeping-pace-141357">legality</a> of facial recognition continues, the focus has recently shifted to the challenges presented by mask-wearing in public.</p>
<p>On this front, the US National Institute of Standards and Technology (NIST) coordinated a <a href="https://www.nist.gov/news-events/news/2020/07/nist-launches-studies-masks-effect-face-recognition-software">major research project</a> to evaluate how masks impacted the performance of various facial recognition systems used across the globe.</p>
<p>Its <a href="https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8311.pdf">report</a>, published in July, found some algorithms struggled to correctly identify mask-wearing individuals <a href="https://www.vox.com/recode/2020/7/28/21340674/face-masks-facial-recognition-surveillance-nist">up to 50% of the time</a>. This was a significant error rate compared to when the same algorithms analysed unmasked faces.</p>
<p>Some algorithms even <a href="https://edition.cnn.com/2020/07/28/tech/face-masks-facial-recognition/index.html">struggled to locate a face</a> when a mask was covering too much of it. </p>
<h2>Finding ways around the problem</h2>
<p>There are currently no usable photo data sets of mask-wearing people that can be used to train and evaluate facial recognition systems. </p>
<p>The NIST study addressed this problem by <a href="https://www.theverge.com/2020/7/28/21344751/facial-recognition-face-masks-accuracy-nist-study">superimposing</a> masks (of various colours, sizes and positions) over images of faces, as seen here:</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1287876259586822144&quot;}"></div></p>
<p>While this may not be a realistic portrayal of a person wearing a mask, it’s effective enough to study the effects of mask-wearing on facial recognition systems. </p>
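The superimposition approach can be mimicked in miniature. In this hypothetical sketch, a grayscale image is just a list of pixel rows, and a synthetic “mask” blanks out the lower portion of the face; the function name and image representation are invented for illustration and are not NIST’s actual tooling.

```python
def apply_synthetic_mask(image, coverage=0.5):
    """Return a copy of a grayscale image (a list of pixel rows) with the
    lower `coverage` fraction of rows blanked, mimicking a superimposed
    mask covering the nose, mouth and chin."""
    unmasked_rows = int(len(image) * (1 - coverage))
    return [row[:] if i < unmasked_rows else [0] * len(row)
            for i, row in enumerate(image)]

face = [[10, 12], [14, 16], [18, 20], [22, 24]]    # tiny 4x2 "face"
print(apply_synthetic_mask(face, coverage=0.5))     # lower two rows zeroed
```

Feeding such occluded images through a recognition pipeline lets researchers measure how error rates climb as more of the face is covered, without collecting new photos of real masked people.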
<p>It’s possible images of real masked people would allow more details to be extracted to improve recognition systems – perhaps by estimating the nose’s position based on visible protrusions in the mask.</p>
<p>Many facial recognition technology vendors are already <a href="https://www.cnet.com/health/facial-recognition-firms-are-scrambling-to-see-around-face-masks/">preparing for</a> a future where mask use will continue, or even increase. <a href="https://www.dezeen.com/2020/02/27/face-recognition-masks-resting-risk-face/">One US company</a> offers masks with customers’ faces printed on them, so they can unlock their smartphones without having to remove the mask. </p>
<h2>Growing incentives for wearing masks</h2>
<p>Even <a href="https://qz.com/299003/a-quick-history-of-why-asians-wear-surgical-masks-in-public/">before the coronavirus pandemic</a>, masks were a common defence against air pollution and viral infection in countries including China and Japan. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ive-always-wondered-why-many-people-in-asian-countries-wear-masks-and-whether-they-work-90178">I've always wondered: why many people in Asian countries wear masks, and whether they work</a>
</strong>
</em>
</p>
<hr>
<p>Political <a href="https://www.bbc.com/news/world-asia-china-49939173">activists</a> also wear masks to evade detection on the streets. Both the <a href="https://www.nytimes.com/2019/07/26/technology/hong-kong-protests-facial-recognition-surveillance.html">Hong Kong</a> and <a href="https://www.washingtonpost.com/technology/2020/06/12/facial-recognition-ban/">Black Lives Matter</a> protests have reinforced protesters’ desire to dodge facial recognition by <a href="https://www.theverge.com/2020/7/17/21328287/face-masks-facial-recognition-privacy-security-protests">authorities and government agencies</a>. </p>
<p>As experts forecast a future with more <a href="https://theconversation.com/this-isnt-the-first-global-pandemic-and-it-wont-be-the-last-heres-what-weve-learned-from-4-others-throughout-history-136231">pandemics</a>, <a href="https://www.theguardian.com/environment/2016/may/12/air-pollution-rising-at-an-alarming-rate-in-worlds-cities">rising levels</a> of <a href="https://www.who.int/health-topics/air-pollution#tab=tab_1">air pollution</a>, persisting <a href="https://freedomhouse.org/report/freedom-world/2020/leaderless-struggle-democracy">authoritarian regimes</a> and a projected <a href="https://theconversation.com/how-climate-change-is-increasing-the-risk-of-wildfires-99056">increase</a> in <a href="https://theconversation.com/climate-change-is-bringing-a-new-world-of-bushfires-123261">bushfires</a> producing dangerous smoke – it’s likely mask-wearing will become the norm for at least a proportion of us.</p>
<p>Facial recognition systems will need to adapt. Detection will be based on features that remain visible, such as the eyes, eyebrows, hairline and general shape of the face. </p>
<p>Such technologies are already under development. Several suppliers are offering <a href="https://www.facewatch.co.uk/2020/05/11/facewatch-launches-facemask-recognition-upgrade/">upgrades</a> and <a href="https://www.prnewswire.com/news-releases/finally-a-biometric-solution-that-recognizes-users-wearing-face-masks-and-doesnt-require-touch-301069400.html">solutions</a> that claim to deliver reliable results with mask-wearing subjects.</p>
<p>For those who oppose the use of facial recognition and wish to go undetected, a plain mask may suffice for now. But in the future they might have to consider alternatives, such as a mask printed with a fake <a href="https://thispersondoesnotexist.com/">computer-generated face</a>.</p><img src="https://counter.theconversation.com/content/144417/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Paul Haskell-Dowland does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>With face masks now compulsory or recommended in various parts of the country, how are facial recognition systems functioning?Paul Haskell-Dowland, Associate Dean (Computing and Security), Edith Cowan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1446822020-09-03T20:01:33Z2020-09-03T20:01:33ZCan I still be hacked with 2FA enabled?<figure><img src="https://images.theconversation.com/files/356028/original/file-20200902-20-1ogicca.jpg?ixlib=rb-1.1.0&rect=119%2C29%2C4872%2C3712&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Cybersecurity is like a game of whack-a-mole. As soon as the good guys put a stop to one type of attack, another pops up. </p>
<p>Usernames and passwords were once good enough to keep an account secure. But before long, cybercriminals figured out how to get around this. </p>
<p>Often they’ll use “<a href="https://www.kaspersky.com/resource-center/definitions/brute-force-attacks">brute force attacks</a>”, bombarding a user’s account with various password and login combinations in a bid to guess the correct one.</p>
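As a toy illustration of why rate limits and second factors matter, here is a hypothetical brute-force sketch against a 4-digit PIN (the `check` callback and PIN value are invented for the example):

```python
from itertools import product

def brute_force_pin(check, digits=4):
    """Try every possible PIN, in order, until the check function accepts one."""
    for attempt in product("0123456789", repeat=digits):
        pin = "".join(attempt)
        if check(pin):
            return pin
    return None

# A 4-digit PIN falls within at most 10,000 guesses -- which is why
# services lock accounts after a few failures and demand a second factor.
secret = "7345"
print(brute_force_pin(lambda p: p == secret))  # → 7345
```

Even a longer password only raises the guess count; without throttling or a second factor, an attacker can keep bombarding the login endpoint indefinitely.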
<p>To deal with such attacks, a second layer of security was added in an approach known as two-factor authentication, or 2FA. It’s widespread now, but does 2FA also leave room for loopholes cybercriminals can exploit?</p>
<iframe src="https://giphy.com/embed/IgLIVXrBcID9cExa6r" width="100%" height="480" frameborder="0" class="giphy-embed" allowfullscreen=""></iframe>
<h2>2FA via text message</h2>
<p>There are various types of 2FA. The most common method is to be sent a single-use code as an SMS message to your phone, which you then enter following a prompt from the website or service you’re trying to access. </p>
<p>Most of us are familiar with this method as it’s favoured by major social media platforms. However, while it may seem safe enough, it isn’t necessarily. </p>
<p>Hackers have been known to <a href="https://www.youtube.com/watch?v=kHI90LbBwaQ">trick</a> mobile phone carriers (such as Telstra or Optus) into transferring a victim’s phone number to their own phone.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/2-5-billion-lost-over-a-decade-nigerian-princes-lose-their-sheen-but-scams-are-on-the-rise-141289">$2.5 billion lost over a decade: 'Nigerian princes' lose their sheen, but scams are on the rise</a>
</strong>
</em>
</p>
<hr>
<p>Pretending to be the intended victim, the hacker contacts the carrier with a story about losing their phone, requesting a new SIM with the victim’s number to be sent to them. Any authentication code sent to that number then goes directly to the hacker, granting them access to the victim’s accounts.</p>
<p>This method is called <a href="https://securelist.com/large-scale-sim-swap-fraud/90353/">SIM swapping</a>. It’s probably the easiest of <a href="https://www.forbes.com/sites/forbestechcouncil/2020/08/21/how-threat-actors-are-bypassing-two-factor-authentication-for-privileged-access/#50278f2b649e">several types</a> of scams that can circumvent 2FA.</p>
<p>And while carriers’ verification processes for SIM requests are improving, a competent trickster can talk their way around them. </p>
<h2>Authenticator apps</h2>
<p>The authenticator method is more secure than 2FA via text message. It works on a principle known as TOTP, or “time-based one-time password”. </p>
<p>TOTP is more secure than SMS because a code is generated on your device rather than being sent across the network, where it might be intercepted. </p>
<p>The authenticator method uses apps such as Google Authenticator, LastPass, 1Password, Microsoft Authenticator, Authy and Yubico.</p>
<p>However, while it’s safer than 2FA via SMS, there have been <a href="https://www.zdnet.com/article/android-malware-can-steal-google-authenticator-2fa-codes/">reports</a> of hackers stealing authentication codes from Android smartphones. They do this by tricking the user into installing <a href="https://au.pcmag.com/security/65791/android-malware-can-steal-2fa-codes-from-google-authenticator-app#:%7E:text=To%20steal%20the%20Google%20Authenticator,be%20advertised%20by%20Cerberus's%20creators.">malware</a> (software designed to cause harm) that copies and sends the codes to the hacker. </p>
<p>Android devices are generally more exposed to this kind of attack than iPhones. Apple’s iOS is a closed ecosystem that tightly controls which apps can be installed, while Android permits apps from outside the official store, making it easier for malware to find its way onto a device.</p>
<h2>2FA using details unique to you</h2>
<p>Biometric methods are another form of 2FA. These include fingerprint login, face recognition, retinal or iris scans, and voice recognition. Biometric identification is becoming popular for its ease of use. </p>
<p>Most smartphones today can be unlocked by placing a finger on the scanner or letting the camera scan your face – much quicker than entering a password or passcode. </p>
<p>However, biometric data can be hacked, too, either from the servers where it is stored or from the software that processes it. </p>
<p>One case in point is last year’s <a href="https://www.theverge.com/2019/8/14/20805194/suprema-biostar-2-security-system-hack-breach-biometric-info-personal-data">BioStar 2 data breach</a>, in which nearly 28 million biometric records were exposed. BioStar 2 is a security system that uses facial recognition and fingerprinting technology to help organisations secure access to buildings.</p>
<p>There can also be false negatives and false positives in biometric recognition. Dirt on the fingerprint reader or on the person’s finger can lead to false negatives. Also, faces can sometimes be similar enough to <a href="https://www.wired.co.uk/article/avoid-facial-recognition-software">fool facial recognition systems</a>.</p>
<iframe src="https://giphy.com/embed/jnEQ1YoSLy9gSic7Qv" width="100%" height="480" frameborder="1" class="giphy-embed" allowfullscreen=""></iframe>
<p>Another type of 2FA comes in the form of personal security questions such as “what city did your parents meet in?” or “what was your first pet’s name?”</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/dont-be-phish-food-tips-to-avoid-sharing-your-personal-information-online-138613">Don't be phish food! Tips to avoid sharing your personal information online</a>
</strong>
</em>
</p>
<hr>
<p>Only the most determined and resourceful hacker will be able to find answers to these questions. It’s unlikely, but still possible, especially as more of us adopt public online profiles.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Person looks at a social media post from a woman, on their mobile." src="https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/356182/original/file-20200903-14-1hxkata.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Often when we share our lives on the internet, we fail to consider what kinds of people may be watching.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>2FA remains best practice</h2>
<p>Despite all of the above, the biggest vulnerability to being hacked is still the human factor. Successful hackers have a bewildering array of psychological tricks in their arsenal.</p>
<p>A cyber attack could come as a polite request, a scary warning, a message ostensibly from a friend or colleague, or an intriguing “clickbait” link in an email.</p>
<p>The best way to protect yourself from hackers is to develop a healthy amount of scepticism. If you carefully check websites and links before clicking through and also use 2FA, the chances of being hacked become vanishingly small. </p>
<p>The bottom line is that 2FA is effective at keeping your accounts safe. However, try to avoid the less secure SMS method when given the option. </p>
<p>Just as burglars in the real world focus on houses with poor security, hackers on the internet look for weaknesses. </p>
<p>And while any security measure can be overcome with enough effort, a hacker won’t make that investment unless they stand to gain something of greater value.</p><img src="https://counter.theconversation.com/content/144682/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>David Tuffley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Two-factor authentication is certainly an added layer of security as we traverse the online world. But it comes in various forms, and they’re not all equally protective.David Tuffley, Senior Lecturer in Applied Ethics & CyberSecurity, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1443992020-08-17T13:47:42Z2020-08-17T13:47:42ZCountries around the world are using border surveillance systems against their own citizens<figure><img src="https://images.theconversation.com/files/353151/original/file-20200817-24-1p2wthl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Klein Ongaki</span></span></figcaption></figure><p>Hamdi was just an infant when his relatives first brought him to the Dadaab refugee camp in north-eastern Kenya to be registered as a refugee – despite the fact he was a Kenyan citizen. Like many ethnic Somali citizens of Kenya living in the vicinity of the camp, they were drawn to the prospect of obtaining food aid for their family. </p>
<p>Since the outbreak of the Somali civil war in the early 1990s, north-eastern Kenya has experienced periodic droughts, propelling many Kenyan Somalis to slip into the refugee system. At Dadaab, they could access free education, food and medical services that, as citizens of one of the country’s most neglected and marginalised regions, were often out of their reach. </p>
<p>Hamdi’s relatives did not anticipate that a seemingly harmless lie would hound him for almost half a decade. Over the last few years, tens of thousands of Kenyan citizens like Hamdi who have tried to obtain a Kenyan national ID <a href="https://www.codastory.com/authoritarian-tech/kenya-biometrics-double-registration/">have been turned away</a> because their fingerprints are captured in the refugee database.</p>
<p>I met many of these people last December while conducting research in the town of Garissa in the eastern part of the country. This was part of an <a href="https://www.kerenweitzberg.com/">ongoing project</a> into the <a href="https://doi.org/10.1017/S002185372000002X">troubled history</a> of biometrics and identification in Kenya. I then returned to London to news of a <a href="https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/874022/6.5577_HO_Windrush_Lessons_Learned_Review_WEB_v2.pdf">newly published inquiry</a> into the Windrush scandal. It detailed how hundreds of British Commonwealth citizens had been wrongly detained, deported and mistreated under the government’s harsh, data-driven immigration rules.</p>
<p>I was struck by the similarities between the UK and Kenya (two countries bound together by colonial history). The Windrush scandal, which came to public light in 2017, has attracted comparatively more attention than the plight faced by victims of double registration in Kenya. However, both cases reveal the harms caused by data-driven technologies in the service of anti-immigrant and exclusionary policies. </p>
<p>Migrant surveillance systems affect citizens and residents, particularly those who <a href="https://www.ohioswallow.com/book/We+Do+Not+Have+Borders">do not fit</a> neatly into legal categories or typical ideas of who “belongs”. Civil rights activists have been <a href="https://www.aclu.org/issues/privacy-technology/surveillance-technologies/what-lurks-behind-all-immigration-data">raising alarms</a> for years about the impacts of data-driven and biometric surveillance on irregular migrants. But it’s equally important to keep citizens and legal migrants in mind when we think about the risks of increasingly digitised border-control regimes.</p>
<h2>Reliance on biometrics</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Man has fingerprint scanned with woman set nearby" src="https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=901&fit=crop&dpr=1 600w, https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=901&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=901&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1132&fit=crop&dpr=1 754w, https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1132&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/353147/original/file-20200817-18-56gxta.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1132&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Biometric technology has become routine.</span>
<span class="attribution"><span class="source">Klein Ongaki</span></span>
</figcaption>
</figure>
<p>Hamdi’s troubles can be traced to the growing use of biometric data (such as fingerprint and iris scanning) within the humanitarian sector. In the years after 9/11, the United Nations High Commissioner for Refugees (UNHCR) began <a href="https://www.jstor.org/stable/26292335">experimenting with iris scans</a> along the Afghanistan-Pakistan frontier. With the backing of the US, the agency developed a standardised biometric registration system, now in use in Kenya and <a href="https://blogs.lse.ac.uk/internationaldevelopment/2019/07/18/biometric-refugee-registration-between-benefits-risks-and-ethics/">over 50 host countries</a> worldwide. </p>
<p>The Kenyan government soon followed suit, building its own biometric refugee database with support from the UNHCR. Today, Kenyan officials are <a href="http://documents1.worldbank.org/curated/en/575001469771718036/pdf/Kenya-ID4D-Diagnostic-WebV42018.pdf">able to cross-check</a> the fingerprints of those who apply for a national ID, effectively shutting the door of citizenship to anyone found on the refugee database. Without a national ID in Kenya, many basic political and economic rights are <a href="https://www.aljazeera.com/indepth/opinion/2014/06/politics-identity-belonging-ken-201462354628329892.html">out of reach</a>. </p>
<p>These developments in Kenya mirror broader global trends. States, refugee aid organisations, and international border-control agencies increasingly rely upon centralised biometric databases to better track migration. States and intergovernmental bodies also stress the <a href="https://www.codastory.com/authoritarian-tech/eu-border-patrol-technology/">value of interoperability</a>, which is the ability to exchange and make use of data across different systems. </p>
<p>EU member states, for example, use <a href="https://ec.europa.eu/knowledge4policy/dataset/ds00008_en">EURODAC</a>, an asylum fingerprint database, when processing the applications of asylum seekers. Beginning in 2008, migrants in the UK who originate from outside the European Economic Area have been required to carry <a href="https://www.legislation.gov.uk/ukdsi/2008/9780110818382/contents">biometric residence permits</a>. </p>
<h2>Hostile environment</h2>
<p>The gathering and indefinite storage of biometric data has gone hand in hand with other forms of data collection. In 2012, the UK government launched its <a href="https://theconversation.com/hostile-environment-the-uk-governments-draconian-immigration-policy-explained-95460">hostile environment policy</a>, which put in place measures designed to make life more challenging for irregular migrants. Under this new regime, the Home Office can <a href="https://www.freemovement.org.uk/briefing-what-is-the-hostile-environment-where-does-it-come-from-who-does-it-affect/">access data</a> from hospitals, banks, employers and landlords.</p>
<p>Though intended to root out “illegal” immigrants, these policies punished many legal migrants of Caribbean descent and their children. Known collectively as the <a href="https://theconversation.com/windrush-generation-latest-to-be-stripped-of-their-rights-in-the-name-of-migration-control-95158">Windrush generation</a> (after <a href="https://doi.org/10.1080/17449850902819920">the celebrated ship</a> that carried Caribbean people to Britain in 1948), Commonwealth citizens who immigrated before 1973 have the <a href="https://globalcit.eu/the-windrush-generation-and-citizenship/">legal right</a> to live in the UK. </p>
<p>Many members of such British-Caribbean families were nevertheless harassed by Home Office staff, denied services, illegally detained and, in extreme cases, deported to countries they barely knew. <a href="https://www.theguardian.com/uk-news/2018/apr/09/special-needs-teacher-uk-50-years-loses-job-immigration-status">Michael Braithwaite</a>, who had lived in the UK for over 55 years, lost his job when a routine immigration check revealed that he did not have a biometric residence permit.</p>
<h2>Data dangers</h2>
<p>While scholars and privacy advocates are rightly concerned about big data <a href="https://www.theguardian.com/uk-news/2017/mar/14/public-faces-mass-invasion-of-privacy-as-big-data-and-surveillance-merge">expanding the scope</a> of government oversight, we should also consider the blindness built into many data-heavy surveillance systems. Collecting ever more <a href="https://theconversation.com/after-paris-its-traditional-detective-work-that-will-keep-us-safe-not-mass-surveillance-50830">fine-grained sensitive data</a> does not necessarily give states or international bodies a more accurate picture of individuals’ lives. </p>
<p>Trust in automated decision-making can also lead to blunt, inaccurate assessments of people’s legal status. Growing demands for identification can <a href="https://doi.org/10.1017/asr.2016.39">increase opportunities for gatekeeping</a>, which may exclude those whose legal documents or biometric data do not tell the “correct” bureaucratic story. </p>
<p>In addition, officials frequently privilege easily accessible digital records and biometric data over older, often more straightforward paper documents. Many Kenyans denied IDs had ample proof of citizenship, including birth certificates, letters from local chiefs attesting to their parentage and school records. In 2010, the Home Office negligently <a href="https://www.theguardian.com/uk-news/2018/apr/17/home-office-destroyed-windrush-landing-cards-says-ex-staffer">destroyed the</a> paper landing cards of Windrush migrants, which had been used in deciding immigration cases.</p>
<p>Proponents of enhanced border security often <a href="https://spheres-journal.org/contribution/the-body-border-governing-irregular-migration-through-biometric-technology/">argue that</a> technologies like biometrics are more objective, neutral, and non-discriminatory. The hardships faced by thousands of British Caribbeans and Kenyan Somalis, however, tell a different story. </p>
<p>Individuals and groups who share the same ethnic or national background as targeted migrants may find themselves the unwitting victims of data-driven border enforcement. A callous, shortsighted faith in digital border controls has rendered many ethnic-minority citizens and legal migrants effectively stateless.</p><img src="https://counter.theconversation.com/content/144399/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Research for this piece was funded by a grant from Privacy International. Keren Weitzberg's research into biometrics and identification in East Africa has also been funded by the British Academy/Leverhulme Trust, the American Council of Learned Societies, the Fulbright US Scholar program, and the British Institute in Eastern Africa.</span></em></p>Biometric data is being used to target those deemed unwanted aliens.Keren Weitzberg, Teaching Fellow in History, UCLLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1421572020-07-07T19:54:21Z2020-07-07T19:54:21ZChina could be using TikTok to spy on Australians, but banning it isn’t a simple fix<figure><img src="https://images.theconversation.com/files/345991/original/file-20200707-194396-1ealmrs.jpg?ixlib=rb-1.1.0&rect=35%2C71%2C5955%2C3296&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>In an age of isolation, video sharing platform TikTok has emerged as a bonding force for many. But recent headlines allege the service, owned by Beijing-based company ByteDance, is feeding users’ data to the Chinese Communist Party.</p>
<p>Earlier this week, the <a href="https://www.heraldsun.com.au/news/victoria/calls-for-tiktok-app-to-be-banned-in-australia/news-story/5b8b294b0cc2679b76221de89e4e7202">Herald Sun reported</a> an unnamed federal MP was pushing for the app to be banned.</p>
<p>Following suit, Liberal senator Jim Molan <a href="https://www.theguardian.com/technology/2020/jul/06/tiktok-may-be-data-collection-service-disguised-as-social-media-liberal-senator-says">said</a> TikTok was being “used and abused” by the Chinese government, while Labor senator Jenny McAllister <a href="https://www.businessinsider.com.au/tiktok-australia-data-privacy-china-concerns-2020-7">called on</a> TikTok’s representatives to face the Select Committee on Foreign Interference Through Social Media. </p>
<p>TikTok has <a href="https://thenewdaily.com.au/news/2020/07/07/tiktoks-australia-future/">denied</a> the accusations and rebuffed suggestions it should be banned in Australia. </p>
<p>But why is the federal government <a href="https://www.smh.com.au/technology/tiktok-on-thin-ice-as-parliament-prepares-to-pore-over-app-20200706-p559d9.html">examining this app so closely</a>? And could it really be a tool used by the Chinese government to spy on us?</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1161600933974794240&quot;}"></div></p>
<h2>A growing following</h2>
<p>With a reported <a href="https://www.theverge.com/2020/4/29/21241788/tiktok-app-download-numbers-update-2-billion-users">two billion downloads</a> worldwide, TikTok’s <a href="http://www.roymorgan.com/findings/8289-launch-of-tiktok-in-australia-december-2019-202002240606">Australian market</a> is also significant. It has an estimated 1.6 million Aussie users, mostly aged 16-24 but with a growing number of <a href="https://slate.com/technology/2018/09/tiktok-app-musically-guide.html">older users too</a>.</p>
<p>Simply, users generate short videos that are shared in the app, with many celebrities also <a href="https://socialblade.com/tiktok/top/50/most-followers">signing up</a>. But although TikTok seems to offer carefree entertainment, is there a darker side?</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/1WB4fG3OmqA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Australian television presenter Andrew Probyn had an unexpected TikTok moment.</span></figcaption>
</figure>
<h2>What information is collected?</h2>
<p>When installed, TikTok <a href="https://www.proofpoint.com/us/corporate-blog/post/understanding-information-tiktok-gathers-and-stores">asks users to grant</a> several permissions, including the use of the camera, microphone and contact list. However, it may also collect location data, along with information from other apps on the device.</p>
<p>Last year, a proposed class action <a href="https://www.courthousenews.com/wp-content/uploads/2019/12/Tiktok.pdf">lawsuit</a> <a href="https://www.cnet.com/news/tiktok-accused-of-secretly-gathering-user-data-and-sending-it-to-china/">filed against</a> TikTok in California claimed the company gathered users’ data, including phone numbers, emails, location, IP addresses, and social network contacts. </p>
<p>The lawsuit also stated TikTok concealed the transfer of data (including biometric data), and continued to harvest it even after the app was closed. This would mean when a user shoots a video and clicks the “next” button, the video could be automatically transferred to servers – without the user’s knowledge. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/tiktok-the-worlds-most-valuable-startup-that-youve-never-heard-of-109302">TikTok: the world’s most valuable startup that you've never heard of</a>
</strong>
</em>
</p>
<hr>
<h2>Where is the data stored?</h2>
<p>While TikTok’s headquarters are in Beijing, <a href="https://www.smh.com.au/technology/tiktok-on-thin-ice-as-parliament-prepares-to-pore-over-app-20200706-p559d9.html">Australian general manager</a> Lee Hunter recently claimed Australian users’ data was stored in Singapore.</p>
<p>A major challenge in sorting truth from fiction lies in how we define “data”. While TikTok users’ details and videos may be stored in Singapore, there’s still potential for data to be extracted from that video content and from the device itself, then sent to China’s servers (although this hasn’t been proven to have happened).</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/huaweis-window-of-opportunity-closes-how-geopolitics-triumphed-over-technology-142158">Huawei's window of opportunity closes: how geopolitics triumphed over technology</a>
</strong>
</em>
</p>
<hr>
<p>Hypothetically, it would then be possible for Chinese authorities to use <a href="https://theconversation.com/fingerprint-and-face-scanners-arent-as-secure-as-we-think-they-are-112414">biometric data</a> to identify people using facial recognition. It would also be possible to map rooms and locations by using “<a href="https://deepai.org/machine-learning-glossary-and-terms/feature-extraction#:%7E:text=Feature%20extraction%20is%20a%20process,of%20computing%20resources%20to%20process.">feature extraction</a>” (a machine learning method) on videos. </p>
<p>This could then aid the creation of new, advanced <a href="https://theconversation.com/detecting-deepfake-videos-in-the-blink-of-an-eye-101072">deepfake videos</a> potentially targeting specific people. </p>
<p>While this may seem far-fetched, there have already been preemptive TikTok bans within major organisations to ensure sensitive information isn’t leaked. </p>
<p>For instance, the app has been banned from devices used by the <a href="https://www.abc.net.au/news/science/2020-01-16/defence-ban-tiktok-china-security-fears/11869512">Australian Defence Department</a>, the <a href="https://www.nytimes.com/2020/01/04/us/tiktok-pentagon-military-ban.html">US Department of Defence</a>, and even entire countries – with the <a href="https://techcrunch.com/2020/06/29/india-bans-tiktok-dozens-of-other-chinese-apps/">Indian government</a> announcing a nationwide ban last month.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1280372829494501376&quot;}"></div></p>
<h2>Privacy issues</h2>
<p>ByteDance <a href="https://datacenterfrontier.com/growth-of-tiktok-boosts-data-center-appetite-for-bytedance/#:%7E:text=%E2%80%9CWe%20store%20all%20TikTok%20US,to%20serve%20the%20Chinese%20market.//">claims</a> its data is stored in servers in the US and Singapore:</p>
<blockquote>
<p>Our data centers are located entirely outside of China, and none of our data is subject to Chinese law. </p>
</blockquote>
<p>TikTok’s privacy policy is ambiguous. As of January, <a href="https://www.tiktok.com/legal/privacy-policy?lang=en">it states</a>:</p>
<blockquote>
<p>You should understand that no data storage system or transmission of data over the Internet or any other public network can be guaranteed to be 100% secure.</p>
</blockquote>
<p>From a user privacy perspective, TikTok has access to a device’s location and a user’s personal information. Although TikTok’s servers may be located outside China, it’s very difficult (if not impossible) to confirm where this data could end up, or what it could be used for. </p>
<p>While the location of servers can be important, possession of data is more relevant. Once data is obtained, it can be used. If data is stored on a server in Australia, for instance, Australian jurisdiction applies. But once it is sent to another country, that country’s laws take precedence. </p>
<p>And if a TikTok user decides to delete their content from their device, or if there is a government-imposed ban, data can’t be retrospectively erased. Once information is transferred, it’s impossible to retract without the cooperation of the organisation or agency concerned (in this case, TikTok).</p>
<h2>Can the government actually ban TikTok?</h2>
<p>The fact is, enforcing an Australia-wide ban on TikTok isn’t a simple prospect. While the federal government <em>could</em> request the app’s removal from the Apple App Store and Google Play Store, it could only do this for Australian regions and marketplaces. </p>
<p>Users in Australia would still be able to download TikTok from another region’s store, or via a third-party source. Also, banning the app <a href="https://www.indiatvnews.com/technology/apps-chinese-apps-banned-will-they-vanish-from-phones-what-happens-next-tells-cyber-expert-630322">won’t automatically remove it</a> from devices on which it is already installed.</p>
<p>Blocking access to TikTok’s servers would need to be done in conjunction with internet service providers (such as Telstra and Optus), as they can block access to apps and websites. But users could still use proxies or <a href="https://us.norton.com/internetsecurity-privacy-what-is-a-vpn.html">Virtual Private Networks</a> (VPNs) to circumvent these controls.</p>
<p>And even if TikTok was banned, citizen data already handed over would remain stored, and could be accessed for the foreseeable future.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/restricting-underage-access-to-porn-and-gambling-sites-a-good-idea-but-technically-tricky-133153">Restricting underage access to porn and gambling sites: a good idea, but technically tricky</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/142157/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The US is also ‘looking at’ banning the Chinese social media app.Paul Haskell-Dowland, Associate Dean (Computing and Security), Edith Cowan UniversityJames Jin Kang, Lecturer, Computing and Security, Edith Cowan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1320472020-02-26T14:55:03Z2020-02-26T14:55:03ZFacial recognition is spreading faster than you realise<figure><img src="https://images.theconversation.com/files/317329/original/file-20200226-24701-1rco3gv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/iot-machine-learning-human-object-recognition-794528230">Monopoly919/Shutterstock</a></span></figcaption></figure><p>The UK is currently witnessing a tug of war over facial recognition. On the streets <a href="http://news.met.police.uk/news/met-begins-operational-use-of-live-facial-recognition-lfr-technology-392451">of London</a> and in <a href="https://www.south-wales.police.uk/en/news-room/introduction-of-facial-recognition-into-south-wales-police/">South Wales</a>, live systems have been deployed by the police, <a href="https://homeofficemedia.blog.gov.uk/2019/09/06/6607">supported by</a> the UK government. But in the Scottish parliament, the Justice Sub-Committee on Policing is trying to <a href="https://digitalpublications.parliament.scot/Committees/Report/JSP/2020/2/11/Facial-recognition--how-policing-in-Scotland-makes-use-of-this-technology#Executive-Summary">halt use of the technology</a>.</p>
<p>I recently <a href="https://www.parliament.scot/S5_JusticeSubCommitteeOnPolicing/Inquiries/JS519FR12_Dr_Benjamin.pdf">gave evidence</a> to the Scottish sub-committee’s inquiry, highlighting the cost of this technology in terms of its damage to freedom, trust and inclusivity in society. This comes not just from the use of facial recognition but also from the ways it is designed and tested. And yet the benefits are often exaggerated, or have yet to be proven.</p>
<p>Facial recognition systems have already been tested and deployed across the UK. Investigative journalist Geoff White has created <a href="https://facialrecognitionmap.com/">a map</a> to show where systems are being, or have been, used, identifying dozens of sites across the country. Another <a href="https://www.banfacialrecognition.com/map/">map for the US</a> shows a similar situation. If you spot facial recognition technology in use somewhere, you can notify these sites so they can add the location and details. The results can be surprising.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=260&fit=crop&dpr=1 600w, https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=260&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=260&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=327&fit=crop&dpr=1 754w, https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=327&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/317330/original/file-20200226-24680-1p2ijyl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=327&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">From security…</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/deep-machine-learing-concept-smart-hospitaly-1416861809">Monopoly919/Shutterstock</a></span>
</figcaption>
</figure>
<p>Airports are among the most common places to encounter facial recognition, typically in automatic border control machines. <a href="http://www.alphr.com/security/1008321/british-airways-facial-tracking-heathrow">Airlines have also been testing the systems at the gate</a>, expanding the data collection beyond government to private companies. Meanwhile, advertising screens in <a href="https://news.sky.com/story/piccadilly-circus-lights-facial-detection-system-incredibly-intrusive-11087020">Piccadilly Circus</a> in London, as well as <a href="https://www.manchestereveningnews.co.uk/news/greater-manchester-news/gmp-trafford-centre-camera-monitored-15278943">Manchester</a>, <a href="https://nottstv.com/screen-outside-victoria-centre-to-have-personalised-adverts-based-on-persons-age-gender-and-mood/">Nottingham</a> and <a href="https://www.vice.com/en_uk/article/yvxbdb/sam-kriss-all-seeing-eyes-in-birmingham">Birmingham</a>, reportedly use the technology to target ads according to the age, gender and mood of people in the crowd.</p>
<p>Shopping centres and public spaces such as museums in <a href="https://www.theguardian.com/technology/2019/aug/16/privacy-campaigners-uk-facial-recognition-epidemic">cities across the UK</a> have used the technology for security purposes. <a href="https://www.theguardian.com/technology/2020/jan/12/anger-over-use-facial-recognition-south-wales-football-derby-cardiff-swansea">Football matches</a>, <a href="https://www.theguardian.com/technology/2019/jul/03/police-face-calls-to-end-use-of-facial-recognition-software">airshows</a>, <a href="https://www.rollingstone.com/music/music-news/taylor-swift-facial-recognition-concerts-768741">concerts</a>, <a href="https://www.theregister.co.uk/2016/08/26/notting_hill_carnival_police_surveillance_cameras_automated_face_recognition/">Notting Hill Carnival</a> and even the <a href="https://www.theguardian.com/technology/2017/nov/12/metropolitan-police-to-use-facial-recognition-technology-remembrance-sunday-cenotaph">Remembrance Sunday</a> service now fall under the invasive eye of facial recognition.</p>
<p>It is not always clear whether facial recognition is used to catch known criminals from a watchlist or simply to add an extra layer of security to public spaces and events. But the South Wales and Metropolitan police forces have admitted they are using it to try to catch elusive criminals. They claim to use only specific watchlists of dangerous individuals, but <a href="https://news.sky.com/story/police-facial-recognition-could-target-literally-anybody-senior-mp-says-11770696">leaked documents</a> show that these also include “persons where intelligence is required”, which could be just about anyone.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=260&fit=crop&dpr=1 600w, https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=260&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=260&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=327&fit=crop&dpr=1 754w, https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=327&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/317333/original/file-20200226-24694-1p5c1l8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=327&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">…to shopping.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/iot-machine-learning-human-object-recognition-1369356617">Monopoly919/Shutterstock</a></span>
</figcaption>
</figure>
<p><a href="https://www.adalovelaceinstitute.org/wp-content/uploads/2019/09/Public-attitudes-to-facial-recognition-technology_v.FINAL_.pdf">Research shows</a> the UK public largely supports facial recognition, provided it benefits society and has appropriate limits. Yet there is little proof that facial recognition actually provides significant social benefit given the costs to privacy.</p>
<p>On a practical level, facial recognition technology doesn’t yet work very well. A 2019 independent review by the University of Essex found that only one in five matches by the Metropolitan Police’s system could confidently <a href="https://www.essex.ac.uk/news/2019/07/03/met-police-live-facial-recognition-trial-concerns">be considered accurate</a>. South Wales Police has claimed its use of the technology has enabled 450 arrests. But only 50 were actually made using <a href="https://www.wired.co.uk/article/uk-police-facial-recognition">live facial recognition</a>. The rest were down to conventional CCTV and face-matching or having officers on the street.</p>
<p>Facial recognition systems are often marketed with outrageous claims. The company Clearview AI, which is <a href="https://www.cnet.com/news/clearview-ai-facial-recognition-company-faces-another-lawsuit/">facing legal action</a> for building a database of 3 billion photos of faces taken from social media and other websites, <a href="https://clearview.ai/">says its technology</a> “helps to identify child molesters, murderers, suspected terrorists, and other dangerous people quickly, accurately, and reliably”. Yet it has also <a href="https://www.buzzfeednews.com/article/ryanmac/clearview-ai-nypd-facial-recognition">faced criticism</a> that its technology simply isn’t anywhere near as useful to the police as the firm claims. (Clearview AI did not respond to The Conversation’s request for comment.)</p>
<p>With this in mind, the huge sums of money devoted to these systems could probably be better spent on other measures to tackle crime and improve public safety. But there are also deep problems with the way facial recognition technology works. For example, <a href="http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">research has shown</a> that the accuracy of facial recognition can depend on a subject’s race and gender. This means that if you are black and/or a woman, the technology is more likely to falsely match you with someone on a watchlist.</p>
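<p>The practical impact of such accuracy gaps can be sketched with a toy calculation. The rates below are hypothetical, not figures from the research cited above: when a crowd is scanned against a watchlist, the expected number of innocent people wrongly flagged scales with both the false-match rate and the crowd size, so even small per-group differences compound.</p>

```python
# Toy illustration of how unequal false-match rates translate into
# unequal numbers of people wrongly flagged. All rates are hypothetical,
# not figures from the research cited above.

def expected_false_matches(crowd_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged when a crowd is scanned."""
    return crowd_size * false_match_rate

# Hypothetical per-group error rates for the same system.
rates = {"group_a": 0.001, "group_b": 0.008}
crowd = 10_000  # people from each group passing the camera

for group, rate in rates.items():
    print(f"{group}: ~{expected_false_matches(crowd, rate):.0f} wrongly flagged")
```

<p>With these made-up numbers, one group would see roughly eight times as many false matches as the other from the very same deployment.</p>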
<h2>Slippery slope</h2>
<p>Another issue is the technology’s potential to be used for more than spotting known criminals, becoming a tool of mass surveillance. In one Metropolitan Police trial that led to four arrests, those detained weren’t dangerous criminals but passersby who simply <a href="https://www.ft.com/content/f4779de6-b1e0-11e9-bec9-fdcab53d6959">tried to cover their faces</a> to avoid the non-consensual facial recognition test.</p>
<p>Police harassment and fines are a slippery slope towards further discrimination and abuse of power. We may accept a human officer combing through CCTV footage looking for a specific suspect. But the sheer scale of live facial recognition is more like turning the entire country into one huge police line-up.</p>
<p>On a more fundamental level, biometric data (such as our facial measurements, fingerprints or DNA) are part of our identity. Facial recognition violates not only our right to go about in public without being monitored, but also our bodily rights and our very sense of self.</p>
<p>Facial recognition is creeping across the UK, but its distribution and its effects are likely to be very uneven. So, regardless of the efficacy or value of facial recognition, we need thoroughly <a href="https://theconversation.com/police-use-of-facial-recognition-technology-must-be-governed-by-stronger-legislation-111325">considered national regulation</a> to mitigate the significant risks. Otherwise we risk ending up with an inadequate patchwork of guidelines that is full of gaps and loopholes.</p><img src="https://counter.theconversation.com/content/132047/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Garfield Benjamin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The more we use facial recognition, the more we see its limits and its risks.Garfield Benjamin, Postdoctoral Researcher, School of Media Arts and Technology, Solent UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1290682020-01-15T19:08:18Z2020-01-15T19:08:18ZDon’t die wondering: apps may soon be able to predict your life expectancy, but do you want to know?<figure><img src="https://images.theconversation.com/files/310160/original/file-20200115-151844-1ole8rh.jpg?ixlib=rb-1.1.0&rect=46%2C23%2C3833%2C2681&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Monaco and Japan have some of the highest life expectancies in the world. But calculating an individual's life expectancy will require taking data analysis several steps further.</span> <span class="attribution"><span class="source">SHUTTERSTOCK</span></span></figcaption></figure><p><em>When will I die?</em></p>
<p>This question has endured across cultures and civilisations. It has given rise to a plethora of religions and spiritual paths over thousands of years, and more recently, <a href="https://apps.apple.com/us/app/when-will-i-die/id1236569653">some highly amusing apps</a>. </p>
<p>But this question now prompts a different response, as technology slowly brings us closer to accurately predicting the answer. </p>
<p>Predicting the lifespan of people, or their “Personal Life Expectancy” (PLE), would greatly alter our lives.</p>
<p>On one hand, it may have benefits for policy making, and help optimise an individual’s health, or the services they receive. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/were-not-just-living-for-longer-were-staying-healthier-for-longer-too-118588">We're not just living for longer – we're staying healthier for longer, too</a>
</strong>
</em>
</p>
<hr>
<p>But the potential misuse of this information by the government or private sector poses major risks to our rights and privacy.</p>
<p>Although generating an accurate life expectancy is currently difficult, due to the complexity of factors underpinning lifespan, emerging technologies could make this a reality in the future.</p>
<h2>How do you calculate life expectancy?</h2>
<p>Predicting life expectancy is not a new concept. <a href="http://www.bbc.com/travel/story/20170807-living-in-places-where-people-live-the-longest">Experts do this</a> at a population level by classifying people into groups, often based on region or ethnicity. </p>
<p>Also, tools such as <a href="https://www.nature.com/articles/s41598-018-23534-9">deep learning</a> and <a href="https://mipt.ru/english/news/scientists_use_ai_to_predict_biological_age_based_on_smartphone_and_wearables_data">artificial intelligence</a> can be used to consider complex variables, such as biomedical data, to predict someone’s biological age. </p>
<p>Biological age refers to how “old” a person’s body is, rather than when they were born. A 30-year-old who smokes heavily may have a biological age closer to 40.</p>
<p><a href="https://www.mdpi.com/2227-7080/6/3/74/htm">Calculating a life expectancy reliably</a> would require a sophisticated system that considers a breadth of environmental, geographic, genetic and lifestyle factors – <a href="https://www1.health.gov.au/internet/publications/publishing.nsf/Content/oatsih-hpf-2012-toc%7Etier1%7Elife-exp-wellb%7E119">all of which have influence</a>.</p>
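<p>The “classify into a group, then adjust” approach described above can be caricatured in a few lines of code. Everything here, baselines, factor names and year offsets alike, is invented for illustration; a real system would weigh thousands of interacting genetic, environmental and biomedical variables rather than a handful of flat offsets.</p>

```python
# Crude sketch of a "personal life expectancy" estimate: start from a
# population baseline, then adjust it for lifestyle factors.
# All baselines and offsets below are invented for illustration.

BASELINES = {("female", "region_x"): 85.0, ("male", "region_x"): 81.0}

# Hypothetical years added or subtracted per factor.
ADJUSTMENTS = {"smoker": -8.0, "regular_exercise": +3.0, "obese": -4.0}

def personal_life_expectancy(sex: str, region: str, factors: list[str]) -> float:
    estimate = BASELINES[(sex, region)]
    for factor in factors:
        estimate += ADJUSTMENTS.get(factor, 0.0)
    return estimate

print(personal_life_expectancy("male", "region_x", ["smoker", "regular_exercise"]))  # 76.0
```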
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=389&fit=crop&dpr=1 600w, https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=389&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=389&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=489&fit=crop&dpr=1 754w, https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=489&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/310166/original/file-20200115-151848-pc2cam.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=489&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The use of devices such as fitness trackers will become crucial in predicting personal life expectancy in the future.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/healthy-lady-run-away-angel-death-329261456">Shutterstock</a></span>
</figcaption>
</figure>
<p>With <a href="https://builtin.com/artificial-intelligence/machine-learning-healthcare">machine learning</a> and artificial intelligence, it’s becoming feasible to analyse larger quantities of data. The use of deep learning and cognitive computing, such as with <a href="https://www.ibm.com/watson-health">IBM Watson</a>, helps doctors make more accurate diagnoses than using human judgement alone. </p>
<p>This, coupled with <a href="https://www.cio.com/article/3273114/what-is-predictive-analytics-transforming-data-into-future-insights.html">predictive analytics</a> and increasing computational power, means we may soon have systems, or even apps, that can calculate life expectancy.</p>
<h2>There’s an app for that</h2>
<p>Much like <a href="https://www.mdanderson.org/for-physicians/clinical-tools-resources/clinical-calculators.html">existing tools</a> that predict cancer survival rates, in the coming years we may see apps attempting to analyse data to predict life expectancy.</p>
<p>However, they will not be able to provide a “death date”, or even a year of death.</p>
<p>Human behaviour and activities are so unpredictable that it’s almost impossible to measure, classify and predict lifespan. A personal life expectancy, even a carefully calculated one, would only provide a “natural life expectancy” based on generic data optimised with personal data.</p>
<p>The key to accuracy would be the quality and quantity of data available. Much of this would be taken directly from the user, including gender, age, weight, height and ethnicity.</p>
<p>Access to real-time sensor data through fitness trackers and smart watches could also monitor activity levels, heart rate and blood pressure. This could then be coupled with lifestyle information such as occupation, socioeconomic status, exercise, diet and family medical history. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/your-local-train-station-can-predict-health-and-death-54946">Your local train station can predict health and death</a>
</strong>
</em>
</p>
<hr>
<p>All of the above could be used to classify an individual into a generic group to calculate life expectancy. This result would then be refined over time through the analysis of personal data, updating a user’s life expectancy and letting them monitor it.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=176&fit=crop&dpr=1 600w, https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=176&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=176&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=221&fit=crop&dpr=1 754w, https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=221&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/308303/original/file-20191230-11891-nswi58.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=221&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">This figure shows how an individual’s life expectancy might change between two points in time (F and H) following a lifestyle improvement, such as weight loss.</span>
</figcaption>
</figure>
<h2>Two sides of a coin</h2>
<p>Life expectancy predictions have the potential to be beneficial to individuals, health service providers and governments.</p>
<p>For instance, they would make people more aware of their general health, and its improvement or deterioration over time. This may motivate them to make healthier lifestyle choices.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/faster-more-accurate-diagnoses-healthcare-applications-of-ai-research-114000">Faster, more accurate diagnoses: Healthcare applications of AI research</a>
</strong>
</em>
</p>
<hr>
<p>They could also be used by insurance companies to provide individualised services, such as how some car insurance companies use <a href="https://www.theguardian.com/money/2017/dec/16/motoring-myths-black-boxes-telematics-insurance">black-box technology</a> to reduce premiums for more cautious drivers.</p>
<p>Governments may be able to use predictions to more efficiently allocate limited resources, such as social welfare assistance and health care funding, to individuals and areas of greater need.</p>
<p>That said, there’s a likely downside. </p>
<p>People <a href="https://www.theatlantic.com/health/archive/2017/11/the-existential-slap/544790/">may become distressed</a> if their life expectancy is unexpectedly low, or at the thought of having one at all. This raises concerns about how such predictions could impact those who experience or are at risk of mental health problems. </p>
<p>Having people’s detailed health data could also let insurance companies more accurately profile applicants, <a href="https://www.abc.net.au/news/2019-07-08/fitness-tracker-used-to-set-health-insurance-premiums/11287126">leading to discrimination against groups or individuals</a>. </p>
<p>Also, pharmaceutical companies could coordinate targeted medical campaigns based on people’s life expectancy. And governments could choose to tax individuals differently, or restrict services for certain people.</p>
<h2>When will it happen?</h2>
<p>Scientists have been working on ways to <a href="https://towardsdatascience.com/what-really-drives-higher-life-expectancy-e1c1ec22f6e1">predict human life expectancy</a> for many years. </p>
<p>The solution would require input from specialists including demographers, health scientists, data scientists, IT specialists, programmers, medical professionals and statisticians.</p>
<p>While the collection of enough data will be challenging, we can likely expect to see advances in this area in the coming years.</p>
<p>If so, issues related to data compliance, as well as collaboration with government and state agencies, will need to be carefully managed. Any system predicting life expectancy would handle highly sensitive data, raising ethical and privacy concerns.</p>
<p>It would also attract cybercriminals and pose various other security threats.</p>
<p>Moving forward, the words of Jurassic Park’s Dr Ian Malcolm spring to mind:</p>
<blockquote>
<p>Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.</p>
</blockquote><img src="https://counter.theconversation.com/content/129068/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Predicting life expectancy remains in the realm of science fiction, but it may soon be possible. Are we prepared for such information? And who else would benefit from this knowledge?James Jin Kang, Lecturer, Edith Cowan UniversityPaul Haskell-Dowland, Associate Dean (Computing and Security), Edith Cowan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1274422019-12-04T04:25:45Z2019-12-04T04:25:45ZFingerprint login should be a secure defence for our data, but most of us don’t use it properly<figure><img src="https://images.theconversation.com/files/305096/original/file-20191204-70101-q97e32.jpg?ixlib=rb-1.1.0&rect=79%2C12%2C4010%2C2139&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Even though passcode options include swipe patterns and long passwords, many users still use easy 4-digit PINs. This is because people are often lulled into a false sense of security when they use fingerprint login.</span> <span class="attribution"><span class="source">SHUTTERSTOCK</span></span></figcaption></figure><p>Our electronic devices store a plethora of sensitive information. To protect this information, device operating systems such as <a href="https://www.apple.com/au/ios/ios-13/">Apple’s iOS</a> and <a href="https://www.android.com/phones-tablets/">Android</a> have locking mechanisms. These require user authentication before access is granted. </p>
<p>One of the most common mechanisms is fingerprint login, a form of biometric technology first introduced by Apple in 2013 as Touch ID. </p>
<p>Touch ID was introduced <a href="https://www.apple.com/business/docs/site/iOS_Security_Guide.pdf">with the intention that</a>, if there was an easier and quicker way to log in, users would be encouraged to keep stronger passcodes and passwords without sacrificing ease of access. It was supposed to enhance both the usability and security of the device.</p>
<p>However, in practice this hasn’t been the case. And most users remain unaware of this initial purpose.</p>
<h2>Easy targets</h2>
<p>When first unlocking an iPhone after starting it, <a href="https://support.apple.com/en-gb/HT204060">users are asked</a> to enter a strong six-digit passcode, instead of a simpler four-digit PIN. After that, Touch ID can be used to unlock the phone, to avoid having to re-enter the password multiple times. </p>
<p>The catch is, users can choose to ignore the direction and opt for an easy four-digit PIN, and they usually do. </p>
<p><a href="https://www.usenix.org/conference/soups2015/proceedings/presentation/cherapau">Researchers</a> found that among Touch ID users, the majority still used weak login codes, mainly four-digit PINs (which are easy to guess). This was also true among people who didn’t use Touch ID. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/fingerprinting-to-solve-crimes-not-as-robust-as-you-think-85534">Fingerprinting to solve crimes: not as robust as you think</a>
</strong>
</em>
</p>
<hr>
<p>They also found more than 30% of participants weren’t aware they could use passwords with letters (which are stronger) instead of four-digit PINs.</p>
<p>Some participants indicated they used PINs for quicker access, compared to passwords. And most agreed that Touch ID offered usability benefits including convenience, speed and ease of use.</p>
<p>Interestingly, there was also a disconnect between how secure users thought their passcodes were, and how secure they actually were. </p>
<p>In fact, only 12% of participants correctly estimated their passcode’s strength.</p>
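<p>That mismatch between perceived and actual strength is easy to quantify: a passcode’s worst-case search space is simply its alphabet size raised to its length. A quick sketch:</p>

```python
# How many combinations an attacker must try in the worst case:
# alphabet size raised to the passcode length.

def search_space(alphabet_size: int, length: int) -> int:
    return alphabet_size ** length

print(f"4-digit PIN:         {search_space(10, 4):,}")   # 10,000
print(f"6-digit passcode:    {search_space(10, 6):,}")   # 1,000,000
print(f"8-char alphanumeric: {search_space(62, 8):,}")   # upper + lower + digits
```

<p>An eight-character alphanumeric password has a search space around ten orders of magnitude larger than a four-digit PIN, which is why the option to use letters instead of digits matters.</p>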
<h2>Knowledge is key</h2>
<p>It’s important to understand how fingerprint login and other biometric systems work, before we use them. </p>
<p>A biometric is a unique biological characteristic which can be used to identify and verify a person’s identity. Apart from fingerprints, we see this in facial recognition scans, DNA tests, and less commonly in palm prints, and iris and retina recognition.</p>
<p>Biometrics are marketed as being a very secure solution, because the way biometric data is stored is different to the ways PINs and passwords are stored. </p>
<p>While passwords are stored on <a href="https://home.bt.com/tech-gadgets/computing/cloud-computing/eight-things-you-need-to-know-about-the-cloud-11363891172534">the cloud</a>, data from your fingerprint is stored solely on your device. Servers and apps never have access to your fingerprint data, nor is it saved on the cloud.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/iphone-5s-fingerprint-scanning-thumbs-up-or-down-18112">iPhone 5S fingerprint scanning: thumbs up or down?</a>
</strong>
</em>
</p>
<hr>
<p>However, although it’s incredibly hard for cybercriminals to get access to your actual fingerprint data – since it’s encrypted and stored on the device itself – biometric systems are still not completely secure.</p>
<p>For instance, Apple’s fingerprint technology was compromised <a href="https://www.theguardian.com/technology/2013/sep/22/apple-iphone-fingerprint-scanner-hacked">just two days after the launch of Touch ID</a> (integrated into the iPhone 5S) in 2013. And since then, many people have managed to bypass Touch ID security by <a href="https://www.theverge.com/2016/5/2/11540962/iphone-samsung-fingerprint-duplicate-hack-security">using dental mold or play-dough</a>.</p>
<p>Similarly, it was shown that even the 2017 iPhone X’s <a href="https://support.apple.com/en-au/HT208109">Face ID</a> feature <a href="https://www.wired.com/story/hackers-say-broke-face-id-security/">could be compromised</a>.</p>
<p>Users who use Touch ID with a four-digit PIN backup are also at risk. They’re susceptible to “shoulder surfing” attacks, where attackers simply look over a victim’s shoulder to see them input their PIN.</p>
<p>Other types of attacks include password guessing and even thermal imaging attacks, which use a thermal camera to figure out which areas of a screen were most recently pressed, thereby potentially revealing a passcode combination.</p>
<h2>A permanent mark</h2>
<p>The elephant in the room is that once biometric data such as a fingerprint is stolen, it’s stolen forever. Unlike a password, it can’t be changed.</p>
<p>Stolen biometric data can be used to identify users without their knowledge, especially if users are unaware of how their data is stored and collected. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/fingerprint-and-face-scanners-arent-as-secure-as-we-think-they-are-112414">Fingerprint and face scanners aren’t as secure as we think they are</a>
</strong>
</em>
</p>
<hr>
<p>That said, cybercriminals generally prefer to break into people’s devices through mind games, by luring victims into clicking on links or downloading attachments which eventually disclose their login credentials. </p>
<p>In public, a criminal might ask to borrow your phone for a call. In such situations, it’s often easy for them to steal your PIN simply through observation, rather than having to actually break into your device. </p>
<p>Touch ID technology was designed to enhance security and usability, and it would have, had people heeded its initial purpose and kept stronger passcodes.</p>
<p>But they don’t, because often they don’t understand the basis of the technology. With biometric technology, users experience a false sense of security. They remain unaware of the many ways in which their information could still be stolen.</p>
<p>This is why users should educate themselves on how the technologies they use function, and the purpose for which they were designed. Failing that, they risk leaving the back door wide open for cybercriminals.</p><img src="https://counter.theconversation.com/content/127442/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nalin Asanka Gamagedara Arachchilage works as Senior Research Fellow at La Trobe University.</span></em></p>While the data from a fingerprint is very hard to retrieve, cybercriminals can get around biometric technology in various ways. And having a weak passcode is like giving them a hall pass.Nalin Asanka Gamagedara Arachchilage, Senior Research Fellow in Cyber Security at La Trobe University, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1220012019-08-20T12:06:24Z2019-08-20T12:06:24ZStolen fingerprints could spell the end of biometric security – here’s how to save it<figure><img src="https://images.theconversation.com/files/288704/original/file-20190820-170914-19nxmi4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/woman-employee-scanning-fingerprint-on-machine-697064554?src=f9obFp5OFLxLIR_N6luuCw-1-87">Pormezz/Shutterstock</a></span></figcaption></figure><p>The biggest known biometric data breach to date was <a href="https://www.theguardian.com/technology/2019/aug/14/major-breach-found-in-biometrics-system-used-by-banks-uk-police-and-defence-firms">reported recently</a> when researchers managed to access a 23-gigabyte database of more than 27.8m records including fingerprint and facial recognition data.</p>
<p>The researchers, working with cyber-security firm VPNMentor, said that they had been able to access the Biostar 2 biometrics lock system that manages access to secure facilities like warehouses or office buildings. This control mechanism, run by the firm Suprema, is reportedly part of a system used by 5,700 organisations in 83 countries, including governments, banks and the UK’s Metropolitan Police.</p>
<p>This breach highlights a major problem with biometric security systems that effectively use people’s biological measurements as passwords. Unlike usernames and passwords, biometric data can’t be changed if it is stolen. </p>
<p>Given that data breaches have become an inevitable part of our increasingly digital world, does that mean biometric security doesn’t have a long-term future, as it’s likely that one day almost everyone’s data will be floating around cyberspace? Perhaps in its current format, but there are also ways we could rescue biometrics and make it more secure.</p>
<p>Traditional passwords are something you know. Biometric features are something you are. Fingerprints, iris scans, voice patterns, face and ear photos and facial recognition data <a href="https://insights.samsung.com/2019/03/19/which-biometric-authentication-method-is-the-most-secure/">can all be used</a> as a way to check if someone is who they say they are and are very hard to fake.</p>
<p>Authentication systems securely store a copy of the raw biometric data and, when a user wants to log in, their features are compared with the stored data. Once only a feature of science fiction, biometric systems are now widely used in real-life secure facilities, passports and even the fingerprint authentication in your smartphone.</p>
<p>But the unique nature of biometrics is also its flaw. Biometric data might provide a way to identify people with a high degree of accuracy but once it is stolen there is nothing you can do to make it secure again. Of course, if your fingerprint is stolen you could always use another finger, but you could only do this 10 times.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=316&fit=crop&dpr=1 600w, https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=316&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=316&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=398&fit=crop&dpr=1 754w, https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=398&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/288701/original/file-20190820-170906-rcepmi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=398&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">What happens when you run out of fingers?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/scan-fingerprint-biometric-identity-medical-concept-570187411?src=6ub4ZkyeF261SvQdPBTjZQ-2-25">HQuality/Shutterstock</a></span>
</figcaption>
</figure>
<p>Once someone has your fingerprint data, it is possible to <a href="https://boingboing.net/2016/03/06/hacking-a-phones-fingerprint.html">print a replica</a> using conductive ink that can fool biometric scanners. There are also examples of researchers fooling voice scanners <a href="https://www.siliconrepublic.com/enterprise/voice-recognition-security-easily-hacked">with sound-morphing tools</a>, iris scanners with <a href="https://www.wired.com/2012/07/reverse-engineering-iris-scans/">replica images</a> and face scanners <a href="https://www.wired.com/2016/08/hackers-trick-facial-recognition-logins-photos-facebook-thanks-zuck/">with photos</a> and even <a href="https://www.telegraph.co.uk/technology/2018/12/17/3d-printed-head-can-unlock-facial-recognition-systems-popular/">3D-printed heads</a>.</p>
<p>This means it’s really important to protect your raw biometric data from leaking to unwanted parties. But this will become an increasingly difficult task as we reveal our biometric data to more and more service providers. </p>
<p>Barely a week goes by without news of another company having its customers’ data stolen. You’ve probably had to change your own passwords on at least one occasion because of this. If enough people have their biometric data exposed, eventually some systems could become unusable because so many users won’t be able to securely log in to them. </p>
<p>In the recent biometric data breach, more than 1m people had their fingerprints, facial recognition data, face photos, usernames and passwords revealed. It was also discovered that outsiders could replace biometric records in the database with their own details, exposing another way to overcome the security checks. </p>
<h2>Improving security</h2>
<p>So what can be done to make biometric security stronger? One simple approach borrows from the way passwords are protected. It’s common practice to store passwords only after first encrypting or <a href="https://www.wired.com/2016/06/hacker-lexicon-password-hashing/">“hashing”</a> them. Hashing is essentially a one-way form of encryption that transforms a password into a string of characters known as a message digest, which is almost impossible to reverse.</p>
<p>This means that even if the encrypted passwords are leaked, hackers can’t obtain the passwords. Modern systems would never store passwords in their original plain text format.</p>
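<p>The hashing idea described above can be sketched in a few lines of Python. This is an illustrative example rather than production code: the function names are ours, but <code>hashlib.pbkdf2_hmac</code> is a standard-library routine that applies the hash many thousands of times to slow down brute-force guessing.</p>

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a one-way message digest from a password.

    PBKDF2 repeats SHA-256 100,000 times, which slows down
    brute-force guessing; the random salt defeats precomputed
    lookup tables.
    """
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    # Re-derive the digest from the attempt and compare in constant time.
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)
```

<p>A system built this way stores only the salt and digest; even a full database leak does not hand attackers the original passwords.</p>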
<p>This method can also be applied to biometric data so that only encrypted or message-digest versions of the biometric features are stored. In the recent biometric database breach, all the data was stored in raw format without encryption. This means hackers could access the raw biometric features of the users directly and replicate them to get into critical services.</p>
<p>Another way to make biometric systems more secure would be to use blockchain, the system behind cryptocurrencies such as Bitcoin. With <a href="https://theconversation.com/how-are-bitcoin-cryptowallets-and-blockchain-related-some-jargon-busted-88906">blockchain technology</a>, you can store customer data in a distributed ledger protected by cryptography in multiple computers across the world. This means only authorised parties can access the data (or data blocks), and any attempt to modify the data will be detected by any other user subscribed to the blockchain. It’s also possible to create private distributed ledgers that only certain people can access.</p>
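<p>The tamper-evidence property described here can be illustrated with a toy hash chain in Python. This is a deliberately minimal sketch of our own (no consensus protocol, no distribution across computers); it shows only how chaining each block’s hash to the previous block’s hash makes any later modification detectable.</p>

```python
import hashlib
import json

def block_hash(record, prev_hash):
    # Each block's hash covers both its record and the previous hash,
    # so changing any earlier record invalidates every later block.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # all-zero "genesis" hash
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False  # tampering detected
        prev = block["hash"]
    return True
```

<p>In a real distributed ledger, many independent parties hold copies of the chain, so an attacker would have to rewrite all of them at once to hide a change.</p>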
<p>Even this might not be enough to keep biometric systems secure forever. Researchers recently demonstrated that it’s possible to fool fingerprint scanners <a href="https://www.engadget.com/2018/11/16/ai-fingerprints-biometric-scanners/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAABfUbl-wurWVakrOo7fgoMi7KtK7BwWIucxCjUDKvXdTXs7xDfZeeNNFqih_JTzLMCxq-egw98GGjZMnTgPySvoF__p0FUnJ7Vs3tQluIhVL4a4E926cBK7J0Wl4Vn7Lr-ei9Lp6K00ewme-689xgpMLfhb0zUOkBzxXEvWjPwb9">using artificial intelligence</a> to generate replica prints that can beat the system. So one day, advanced computers might be able to recreate any features in order to fool biometric security systems into letting an impostor through. But for now, if biometric service providers took some simple steps to make their data more secure, we could avoid the kind of breaches that would otherwise make these systems obsolete.</p>
<p class="fine-print"><em><span>Chaminda Hewage receives funding from European Social Funding under KESSII Programme.</span></em></p>
<p class="fine-print"><em>You can’t change your fingerprint if it’s stolen like you’d change your password. – Chaminda Hewage, Reader in Data Security, Cardiff Metropolitan University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>As privacy is lost a fingerprint at a time, a biometric rebel asserts our rights</h2>
<p class="fine-print">Published 2019-06-02.</p>
<figure><img src="https://images.theconversation.com/files/277300/original/file-20190531-69083-1u30f0l.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Jeremy Lee, a sawmill worker in Imbil, Queensland, refused to have his fingerprints scanned for a new security system introduced by his employer to replace swipe cards.
</span> <span class="attribution"><span class="source">www.shutterstock.com</span></span></figcaption></figure><p>In Back to the Future II (1989), fingerprints are used to lock and unlock doors. It’s a benign technology, apart from the rise of “<a href="https://backtothefuture.fandom.com/wiki/Thumb_bandit">thumb bandits</a>” who amputate thumbs. Gattaca (1997) envisages a bleaker future, where corporations collect DNA samples and genetic discrimination reigns.</p>
<p>Three decades on, “biometric recognition” technology is no longer science fiction. Should we embrace it or fear it?</p>
<p>That question faced Jeremy Lee, a sawmill worker in the town of Imbil, Queensland, when his employer, Superior Wood Ltd, introduced fingerprint scanning to verify clock-on and clock-off times.</p>
<p>Lee refused to comply. He was sacked as a result.</p>
<p>Lee then lodged an unfair dismissal claim in the Fair Work Commission. His claim <a href="https://www.theguardian.com/world/2018/nov/27/companies-can-sack-workers-for-refusing-to-use-fingerprint-scanners">was rejected</a> last November. </p>
<p>But last month Lee won his case on appeal before a full bench of commissioners. </p>
<p>Their ruling was particularly critical of the employer’s lack of process and failure to understand its employees’ right to privacy.</p>
<p>It’s concerning that management appeared not to understand the sensitivity of such data, and believed it had the right to demand it for something so mundane. </p>
<p>But what is most disturbing about this case, the first of its kind in Australia, is that just one employee out of about 400 resisted having their biodata taken. Every other employee acquiesced, despite management failing to provide any information about how it planned to store and protect such sensitive data. </p>
<h2>Boundaries of consent</h2>
<p>Biometrics refers to any technology that measures and analyses unique physical and distinctive behavioural characteristics considered innate, immutable and unique to the individual. </p>
<p>Physiological markers include fingerprints, hand geometry, eyes and facial features. Behavioural markers include gait or voice patterns. </p>
<p>You don’t have to look far to see these technologies in use. Fingerprint and facial scanning is now common as a security measure on phones and computers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/big-brother-is-watching-how-new-technologies-are-changing-police-surveillance-115841">Big brother is watching: how new technologies are changing police surveillance</a>
</strong>
</em>
</p>
<hr>
<p>The advantages are obvious. The drawbacks less so. </p>
<p>The problem is when they are used by others to collect information about us. </p>
<p>In Australia, our political system may protect us from the prospect of biometric surveillance becoming omnipresent, as in the case of China, but we do face the potentially coercive power of employers wanting to use it.</p>
<p>Their reasons may be benign, possibly even quite compelling, but demanding that information might still cross a line that infringes privacy rights.</p>
<p>Once we agree to give up those rights, what guarantees do we have the information won’t end up being used for other ends, legal or illegal?</p>
<h2>Biodata is forever</h2>
<p>This is why you, like Jeremy Lee, should be concerned.</p>
<p>Biometric information can reveal a huge amount about you. It may even reveal information you don’t know yourself. Fingerprint data, for instance, could potentially <a href="https://www.sciencedaily.com/releases/2019/04/190415105048.htm">detect genetic disorders</a>.</p>
<p>There need to be clear boundaries, so information can only be used for the purpose to which an employee has actively consented. Otherwise there is potential for systematic discrimination in recruitment, promotions and conditions of employment.</p>
<p>Perhaps an even greater risk is the security of this data. </p>
<p>Biometric data is as vulnerable as any other digital data in an era of sophisticated hacking. It could prove just as valuable to criminals as credit-card details.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/fingerprint-and-face-scanners-arent-as-secure-as-we-think-they-are-112414">Fingerprint and face scanners aren’t as secure as we think they are</a>
</strong>
</em>
</p>
<hr>
<p>Cards can be replaced and passwords changed. Biodata cannot. The level of security protecting biodata should be much greater.</p>
<p>In the case of Jeremy Lee vs Superior Wood, the company admitted the data was stored at multiple sites with access by multiple sources. </p>
<p>Lee ultimately won his case because the commissioners decided the company had not abided by the <a href="https://www.oaic.gov.au/privacy-law/">Privacy Act (1988)</a>. That law says collecting sensitive information should be “reasonably necessary” – <a href="https://www.minterellison.com/articles/the-impact-of-lee-v-superior-wood">in this case</a> there were other ways to verify when employees clocked on and off. It also prohibits collecting sensitive information without an individual’s consent. </p>
<p>Thanks to Jeremy Lee, we now know any employer seeking to collect biometric data has the same obligations. And any employee has the right to object.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Biometric data is forever. Any employer seeking to collect it has big obligations to meet. And employees have the right to object. – Peter Holland, Professor in Human Resource Management and Employee Relations, Swinburne University of Technology; Tse Leng Tham, Lecturer in Human Resource Management and Management, RMIT University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>Congress is considering privacy legislation – be afraid</h2>
<p class="fine-print">Published 2019-05-16.</p>
<p>Supreme Court Justice Louis Brandeis called privacy the “<a href="http://faculty.uml.edu/sgallagher/harvard__law_review.htm">right to be let alone</a>.” Perhaps Congress should give states trying to protect consumer data the same right.</p>
<p>For years, a gridlocked Congress ignored privacy, apart from occasionally scolding companies such as <a href="https://www.nytimes.com/2017/10/03/business/equifax-congress-data-breach.html">Equifax</a> and <a href="https://www.washingtonpost.com/technology/2019/03/07/senators-slam-equifax-marriott-executives-massive-data-breaches/?utm_term=.5eeec0bfdb0a">Marriott</a> after their major data breaches. In its absence, states have taken the lead in experimenting with privacy-related laws. </p>
<p>California, for example, <a href="https://www.nbcnews.com/tech/tech-news/california-bringing-law-order-big-data-it-could-change-internet-n1005061">recently passed legislation</a> giving citizens the right to know what data businesses have on them – and to block the information’s sale to third parties. It’s the <a href="https://store.law.com/Registration/Login.aspx?mode=silent&source=https%3A%2F%2Fwww.law.com%2Fnjlawjournal%2F2018%2F12%2F01%2Fthe-california-consumer-privacy-act-what-you-need-to-know%2F%3Fslreturn%3D20190416111321">first of its kind</a> in the U.S. and <a href="https://www.natlawreview.com/article/state-law-developments-consumer-privacy">has prompted lawmakers</a> in other states to try to follow suit. </p>
<p>That’s gotten the attention of businesses, especially in tech, which <a href="https://www.reuters.com/article/us-usa-tech-congress-idUSKCN1M62TE">have been lobbying Congress</a> to preempt a possible patchwork of state laws with what could amount to a weaker federal one. Some observers <a href="https://www.brookings.edu/blog/techtank/2019/01/07/will-this-new-congress-be-the-one-to-pass-data-privacy-legislation/">predict</a> this could be that rare issue that inspires bipartisan compromise in Congress this year. </p>
<p>Sounds like great news, right? </p>
<p>Wrong. </p>
<p>As <a href="https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=79181">someone who has studied privacy for nearly two decades</a>, I believe consumers are better off if Congress doesn’t intrude and lets states keep experimenting on how to best protect Americans’ personal data. </p>
<h2>Following California’s lead</h2>
<p>It may be hard to remember, but there was a time when companies were able to keep data breaches secret, so that consumers didn’t even know hackers had their information and that they needed to take steps to protect themselves. </p>
<p>Then <a href="https://leginfo.legislature.ca.gov/faces/codes_displaySection.xhtml?lawCode=CIV&sectionNum=1798.82">California’s data breach law</a> took effect in 2003. California requires companies that suffer data breaches to notify affected consumers as well as the state’s attorney general. </p>
<p>As lawmakers elsewhere learned from these notifications just how common data breaches had become, the <a href="http://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx">other 49 states followed suit</a>. The result is that <a href="https://www.privacyrights.org/data-breaches">more than 8,000 data breaches affecting more than 11 billion records</a> have been made public – and all without Congress doing a thing. </p>
<p>If states had not acted on their own, Americans might never have learned about the Equifax or Marriott breaches, or about the <a href="https://www.idtheftcenter.org/wp-content/uploads/2019/02/ITRC_2018-End-of-Year-Aftermath_FINAL_V2_combinedWEB.pdf">1,244 breaches affecting 446 million records that occurred just last year</a>.<br>
And just as other states followed California on breaches, some are attempting to do the same on privacy legislation. </p>
<p>The <a href="https://cal-privacy.com/">California Consumer Privacy Act</a>, which will take effect next year, will give Californians the right to learn what companies know about them and the kinds of businesses they sell that information to, as well as the right to block such sales. Consumers will also be able to require companies to delete information on them in some circumstances. </p>
<p><a href="https://iapp.org/news/a/us-state-comprehensive-privacy-law-comparison/">Legislators in states</a> including <a href="https://www.clarip.com/blog/ma-privacy-bill-sd341/">Massachusetts</a>, <a href="https://www.mediapost.com/publications/article/334811/washington-state-privacy-proposal-likely-to-fail.html">Washington</a> and <a href="https://www.bleepingcomputer.com/news/legal/new-york-privacy-bill-forces-businesses-to-disclose-consumer-data-use/">New York</a> have introduced similar privacy bills this year. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=404&fit=crop&dpr=1 600w, https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=404&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=404&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=508&fit=crop&dpr=1 754w, https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=508&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/274991/original/file-20190516-69199-q3lkuq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=508&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">California has taken the lead on protecting consumer data.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/California-Data-Privacy/cd5c30fb2ae54bac8779e1ad7a617bea/12/0">AP Photo/Don Thompson</a></span>
</figcaption>
</figure>
<h2>Congressional intrusion</h2>
<p>But Congress could bring this experimentation to a halt if lawmakers enact a weaker privacy bill that overrides state laws, as <a href="https://www.commerce.senate.gov/public/_cache/files/e3f238aa-522d-4984-9f15-4e9b0e705c70/FE04C752379060C87ECEDEE13DF85940.02-27-2019beckerman-testimony.pdf">industry lobbyists are seeking</a>. </p>
<p>Congress frequently preempts state laws. For example, the <a href="https://www.law.cornell.edu/uscode/text/9/2">federal arbitration law</a> prevents states from regulating arbitration agreements, even <a href="https://caselaw.findlaw.com/us-supreme-court/517/681.html">barring states from merely requiring</a> that contracts require arbitration on the first page. </p>
<p>I don’t mean to say that there’s no room for Congress to get involved. Most Americans still lack important privacy protections, and Congress could help fill that gap. </p>
<p>But rather than circumventing state laws, a federal privacy law should work in partnership with them – just as federal laws regulating auto safety such as <a href="http://legisworks.org/GPO/STATUTE-105-Pg1914.pdf">airbag requirements</a> operate in tandem with state regulations that govern related issues such as how fast motorists can drive. </p>
<p>Industry advocates, however, don’t want federal and state laws to exist side by side because they say companies will have trouble following the rules of different states. Businesses had the same concerns about state data breach laws, and <a href="https://www.hsgac.senate.gov/subcommittees/investigations/hearings/examining-private-sector-data-breaches">testimony from Marriott’s CEO</a> suggests the company didn’t find it too troublesome to comply with them, however different. </p>
<p>It’s more likely, then, that companies realize that it will be easier for their lobbyists to win a victory in one legislature – Congress – than in 50 states. </p>
<p><a href="https://www.commerce.senate.gov/public/_cache/files/ae013907-beb0-4a9f-9cd5-08540229d8a2/62BA76A684B464BFC9358CE014C06AF8.02-27-2019rothenberg-testimony.pdf">Lobbyists have also argued</a> consumers would be bewildered by such a patchwork of state privacy laws. <a href="https://www.commerce.senate.gov/public/index.cfm/hearings?ID=CBA2CD07-4CC7-4474-8B6E-513FED77073D">They claimed</a>, for example, that a consumer driving from Biloxi, Mississippi, to Bellevue, Washington, would be confused by the different privacy regimes she would encounter. </p>
<p>But that same person – during that same drive – copes with a wide variety of traffic laws. Drivers seem to be able to navigate those different laws just fine.</p>
<h2>New tech, new threats to privacy</h2>
<p>Another concern is that technology is continually improving, with each new advance creating a new privacy challenge for consumers that they cannot now foresee. </p>
<p>Biometrics is an example of an issue that only in recent years has become a serious privacy concern. It’s one thing to use facial recognition software to unlock your phone, another if companies are able to buy your image so they can <a href="https://www.goodeintelligence.com/press-releases/biometrics-creepy-or-convenient/">tailor the ads</a> you see to what you look like. </p>
<p>Illinois was at the forefront of innovation when in 2008 it <a href="http://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004">passed a statute</a> that prevents companies from selling information about consumers’ fingerprints, retina scans, voiceprints and similar items and requires companies to notify consumers before capturing biometric information. Other states, like <a href="https://codes.findlaw.com/tx/business-and-commerce-code/bus-com-sect-503-001.html">Texas</a> and <a href="https://app.leg.wa.gov/RCW/default.aspx?cite=19.375.020">Washington state</a>, have since enacted similar laws. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=797&fit=crop&dpr=1 600w, https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=797&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=797&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1002&fit=crop&dpr=1 754w, https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1002&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/274993/original/file-20190516-69213-w06kka.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1002&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Louis Brandeis dubbed privacy the ‘right to be let alone.’</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Associated-Press-Domestic-News-United-States-LO-/eeec64432ee4da11af9f0014c2589dfb/1/0">AP Photo</a></span>
</figcaption>
</figure>
<p>But it’s another reason a federal privacy law preventing states from experimenting may be worse than no federal law at all. Federal preemption would mean states could no longer respond to threats to privacy. And consumers would have only Congress to turn to for a remedy. Given that the <a href="https://www.congress.gov/106/plaws/publ102/PLAW-106publ102.pdf">last major consumer privacy law</a> at the federal level is already two decades old, it’s hard to believe the frequently frozen Congress would keep up with the times.</p>
<p>Worse, consumers would risk losing their only bargaining chip in the fight over their personal data: companies’ fear that states might put a stop to whatever they’re doing.</p>
<p>Brandeis, a <a href="https://www.cs.cornell.edu/%7Eshmat/courses/cs5436/warren-brandeis.pdf">prophet on privacy</a>, <a href="https://supreme.justia.com/cases/federal/us/285/262/">called</a> the states the “laboratories of democracy.” Let’s see what results the labs produce before we stop experimenting – and risk never learning the best solutions.</p>
<p class="fine-print"><em><span>Jeff Sovern is a member of the National Association of Consumer Advocates and a registered Democrat.</span></em></p>
<p class="fine-print"><em>States like California have been at the forefront of privacy innovation in recent decades. A possible federal law could bring their experimentation to a halt, harming consumers. – Jeff Sovern, Professor of Law, St. John's University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>Fingerprint and face scanners aren’t as secure as we think they are</h2>
<p class="fine-print">Published 2019-03-06.</p>
<figure><img src="https://images.theconversation.com/files/261822/original/file-20190304-110110-1tgw1we.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Biometric systems are increasingly used in our civil, commercial and national defence applications.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Despite what every spy movie in the past 30 years would have you think, fingerprint and face scanners used to unlock your smartphone or other devices aren’t nearly as secure as they’re made out to be.</p>
<p>While it’s not great if your password is made public in a data breach, at least you can easily change it. If the scan of your fingerprint or face – known as “biometric template data” – is revealed in the same way, you could be in real trouble. After all, you can’t get a new fingerprint or face.</p>
<p>Your biometric template data are <a href="https://www.gemalto.com/govt/inspired/biometrics">permanently and uniquely linked to you</a>. The exposure of that data to hackers could <a href="https://dl.acm.org/citation.cfm?id=1387883">seriously compromise user privacy and the security of a biometric system</a>. </p>
<p>Current techniques provide effective security from breaches, but advances in artificial intelligence (AI) are rendering these protections obsolete.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/receiving-a-login-code-via-sms-and-email-isnt-secure-heres-what-to-use-instead-112767">Receiving a login code via SMS and email isn't secure. Here's what to use instead</a>
</strong>
</em>
</p>
<hr>
<h2>How biometric data could be breached</h2>
<p>If a hacker wanted to access a system that was protected by a fingerprint or face scanner, there are a number of ways they could do it:</p>
<ol>
<li><p>your fingerprint or face scan (template data) stored in the database could be replaced by a hacker to gain unauthorised access to a system</p></li>
<li><p>a physical copy or spoof of your fingerprint or face could be created from the stored template data (with <a href="http://vkansee.com/this-guy-unlocked-my-iphone-with-play-doh/">play doh</a>, for example) to gain unauthorised access to a system</p></li>
<li><p>stolen template data could be reused to gain unauthorised access to a system</p></li>
<li><p>stolen template data could be used by a hacker to unlawfully track an individual from one system to another.</p></li>
</ol>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/-JwptEh--kU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Biometric data need urgent protection</h2>
<p>Nowadays, biometric systems are increasingly used in our civil, commercial and national defence applications.</p>
<p>Consumer devices equipped with biometric systems are found in everyday electronic devices like <a href="http://www.m2sys.com/blog/biometric-resources/biometrics-on-smartphones/">smartphones</a>. MasterCard and Visa both offer <a href="https://techcrunch.com/2017/04/20/mastercard-trials-biometric-bankcard-with-embedded-fingerprint-reader/">credit cards with embedded fingerprint scanners</a>. And wearable <a href="https://singularityhub.com/2018/01/30/smart-homes-wont-just-automate-your-life-theyll-track-your-health-too/#sm.00001gaw7sovv9frwrel7ol9kfq1j">fitness devices</a> are increasingly using biometrics to unlock smart cars and smart homes.</p>
<p>So how can we protect raw template data? A range of encryption techniques have been proposed. These fall into <a href="https://www.mdpi.com/2073-8994/11/2/141">two categories</a>: cancellable biometrics and biometric cryptosystems.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/when-your-body-becomes-your-password-the-end-of-the-login-is-nigh-39092">When your body becomes your password, the end of the login is nigh</a>
</strong>
</em>
</p>
<hr>
<p>In cancellable biometrics, complex mathematical functions are used to transform the original template data when your fingerprint or face is being scanned. This transformation is non-reversible, meaning there’s no risk of the transformed template data being turned back into your original fingerprint or face scan. </p>
<p>If the database holding the transformed template data is breached, the stored records can simply be deleted and cancelled. A new transformation is then applied when you re-enrol, so scanning the same finger or face produces a fresh template that is unrelated to the stolen one.</p>
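<p>To make the cancel-and-reissue idea concrete, here is a minimal sketch of one published family of cancellable transforms: random projection followed by sign quantisation. All function names and parameters below are illustrative assumptions, not any vendor's actual scheme.</p>

```python
import random

# Illustrative sketch of cancellable biometrics via seeded random
# projection with sign quantisation. Names and parameters are
# hypothetical, chosen only to demonstrate the concept.

def make_transform(seed, in_dim, out_dim):
    """Derive a user-specific projection matrix from a seed.
    Mapping 128 inputs down to 64 sign bits is many-to-one, so the
    original feature vector cannot be uniquely recovered."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(in_dim)] for _ in range(out_dim)]

def protect(features, transform):
    """Store only the sign pattern of the projection (a binary template)."""
    return [1 if sum(w * x for w, x in zip(row, features)) >= 0 else 0
            for row in transform]

rng = random.Random(0)
features = [rng.gauss(0, 1) for _ in range(128)]  # stand-in feature vector

t1 = make_transform(seed=42, in_dim=128, out_dim=64)
stored = protect(features, t1)

# A fresh, slightly noisy scan of the same finger still matches closely.
noisy = [x + 0.05 * rng.gauss(0, 1) for x in features]
match = sum(a == b for a, b in zip(protect(noisy, t1), stored)) / 64

# After a breach, re-enrol with a new seed: the reissued template is
# statistically unrelated to the stolen one (about half the bits agree,
# which is what two random binary strings would share).
t2 = make_transform(seed=99, in_dim=128, out_dim=64)
overlap = sum(a == b for a, b in zip(protect(features, t2), stored)) / 64
```

<p>The key property: the same finger produces matching templates under the same seed, yet changing the seed yields a template as different as chance, which is what lets a breached template be revoked.</p>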
<p>In biometric cryptosystems, the original template data are combined with a cryptographic key <a href="https://dl.acm.org/citation.cfm?id=2905118">to generate a “black box”</a>. The cryptographic key is the “secret”, and a fresh biometric scan (the query data) acts as the “key” that unlocks the “black box”: the cryptographic key is released only upon successful authentication.</p>
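<p>One well-known construction in this family is the fuzzy commitment scheme. The toy sketch below is an assumption-laden illustration, not the construction from the cited work: a simple repetition code stands in for the stronger error-correcting codes real systems use, and the bitstrings are randomly generated stand-ins for biometric features.</p>

```python
import hashlib
import random

# Toy fuzzy-commitment sketch: bind a secret key to a biometric bitstring
# so that only a close-enough fresh scan releases the key.

REP = 5  # each key bit repeated 5 times; majority vote corrects up to 2 flips

def encode(key_bits):
    return [b for b in key_bits for _ in range(REP)]

def decode(code_bits):
    return [1 if sum(code_bits[i:i + REP]) > REP // 2 else 0
            for i in range(0, len(code_bits), REP)]

def commit(key_bits, bio_bits):
    """Lock the key: store the XOR of the codeword and the biometric
    ("helper data") plus a hash of the key. Neither reveals the key alone."""
    helper = [c ^ b for c, b in zip(encode(key_bits), bio_bits)]
    tag = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, tag

def release(helper, tag, query_bits):
    """Unlock: XOR with a fresh scan, majority-decode, verify the hash."""
    key = decode([h ^ q for h, q in zip(helper, query_bits)])
    return key if hashlib.sha256(bytes(key)).hexdigest() == tag else None

rng = random.Random(0)
key = [rng.randint(0, 1) for _ in range(32)]             # the "secret"
enrolled = [rng.randint(0, 1) for _ in range(32 * REP)]  # biometric bitstring
helper, tag = commit(key, enrolled)

# A fresh scan with one bit error per code group still releases the key.
query = list(enrolled)
for i in range(0, len(query), REP):
    query[i] ^= 1
recovered = release(helper, tag, query)

# An impostor's unrelated bitstring fails the hash check: nothing released.
impostor = [rng.randint(0, 1) for _ in range(32 * REP)]
denied = release(helper, tag, impostor)
```

<p>Note the design choice: the stored helper data and hash together never expose the raw biometric or the key, while the error-correcting code absorbs the natural noise between two scans of the same finger.</p>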
<h2>AI is making security harder</h2>
<p>In recent years, new biometric systems that incorporate <a href="https://www.sas.com/en_au/insights/analytics/what-is-artificial-intelligence.html">AI</a> have come to the forefront of consumer electronics. Think: smart cameras with built-in AI capability to recognise and track specific faces.</p>
<p>But AI is a double-edged sword. While new developments, such as <a href="https://www.nature.com/articles/nature14539">deep artificial neural networks</a>, have enhanced the performance of biometric systems, potential threats could arise from the integration of AI. </p>
<p>For example, researchers at New York University created a tool called <a href="https://www.wired.com/story/deepmasterprints-fake-fingerprints-machine-learning/">DeepMasterPrints</a>. It uses deep learning techniques to generate fake fingerprints that can unlock a large number of mobile devices. It’s similar to the way that a master key can unlock every door.</p>
<p>Researchers have also demonstrated how deep artificial neural networks can be trained so that the original biometric inputs (such as the image of a person’s face) <a href="https://arxiv.org/abs/1703.00832">can be obtained from the stored template data</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facial-recognition-is-increasingly-common-but-how-does-it-work-61354">Facial recognition is increasingly common, but how does it work?</a>
</strong>
</em>
</p>
<hr>
<h2>New data protection techniques are needed</h2>
<p>Thwarting these types of threats is one of the most pressing issues facing designers of secure AI-based biometric recognition systems.</p>
<p>Existing encryption techniques designed for non-AI-based biometric systems are incompatible with AI-based biometric systems. So new protection techniques are needed. </p>
<p>Academic researchers and biometric scanner manufacturers should work together to secure users’ sensitive biometric template data, thus minimising the risk to users’ privacy and identity. </p>
<p>In academic research, special focus should be put on the two most important aspects: recognition accuracy and security. As this research falls within <a href="https://www.industry.gov.au/data-and-publications/science-and-research-priorities">Australia’s science and research priority of cybersecurity</a>, both government and private sectors should provide more resources to the development of this emerging technology.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Current techniques to protect biometric details, such as face recognition or fingerprints, from hacking are effective, but advances in AI are rendering these protections obsolete.Wencheng Yang, Post Doctoral Researcher, Security Research Institute, Edith Cowan UniversitySong Wang, Senior Lecturer, Engineering, La Trobe UniversityLicensed as Creative Commons – attribution, no derivatives.