tag:theconversation.com,2011:/au/topics/surveillance-systems-47559/articlesSurveillance systems – The Conversation2022-12-18T19:18:01Ztag:theconversation.com,2011:article/1949172022-12-18T19:18:01Z2022-12-18T19:18:01ZNot Big Brother, but close: a surveillance expert explains some of the ways we’re all being watched, all the time<figure><img src="https://images.theconversation.com/files/499955/original/file-20221209-20279-c0jq3z.jpeg?ixlib=rb-1.1.0&rect=95%2C107%2C7893%2C4383&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A group of <a href="https://www.nature.com/articles/srep01376;">researchers studied</a> 15 months of mobility data from 1.5 million people and concluded that just four points in space and time were enough to identify 95% of them, even when the data were coarse.</p>
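The intuition behind that finding can be sketched with a toy simulation. Everything below is invented for illustration (synthetic traces, a far coarser grid than the study's real mobility data): each person visits a few dozen (cell, hour) points, and we check how often four of their points match them alone.

```python
import random

def reidentify(traces, k=4, trials=200, seed=0):
    """Estimate how often k random (place, hour) points from one
    person's trace single that person out in the whole dataset."""
    rng = random.Random(seed)
    people = list(traces)
    unique = 0
    for _ in range(trials):
        target = rng.choice(people)
        points = set(rng.sample(sorted(traces[target]), k))
        matches = [p for p in people if points <= traces[p]]
        if matches == [target]:
            unique += 1
    return unique / trials

# Toy data: 1,000 people, each visiting ~40 random (cell, hour) points
# drawn from 200 cells x 24 hours -- much coarser than real traces.
rng = random.Random(1)
traces = {
    pid: {(rng.randrange(200), rng.randrange(24)) for _ in range(40)}
    for pid in range(1000)
}
print(reidentify(traces, k=4))  # close to 1.0 even in this crude toy
```

Even with this crude grid, four points are almost always unique to one person, which is the core of the 2013 result.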
<p>That was back in 2013. </p>
<p>Nearly ten years on, surveillance technologies permeate all aspects of our lives. They collect swathes of data from us in various forms, and often without us knowing.</p>
<p>I’m a surveillance researcher with a focus on technology governance. Here’s my round-up of widespread surveillance systems I think everyone should know about.</p>
<h2>CCTV and open-access cameras</h2>
<p>Although China has more than 50% of <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">all surveillance cameras installed</a> in the world (about 34 cameras per 1,000 people), Australian cities are <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">catching up</a>. In 2021, Sydney had 4.67 cameras per 1,000 people and Melbourne had 2.13. </p>
<p>While CCTV cameras can be used for legitimate purposes, such as promoting safety in cities and assisting police with criminal investigations, their use also poses serious concerns.</p>
<p>In 2021, New South Wales police <a href="https://www.innovationaus.com/facial-recognition-and-the-nsw-protest-crowds/">were suspected of</a> having used CCTV footage paired with facial recognition to find people attending anti-lockdown protests. When questioned, they didn’t confirm or deny if they had (or if they would in the future).</p>
<p>In August 2022, the United Nations confirmed CCTV is <a href="https://www.ohchr.org/en/documents/country-reports/ohchr-assessment-human-rights-concerns-xinjiang-uyghur-autonomous-region">being used to</a> carry out “serious human rights violations” against Uyghur and other predominantly Muslim ethnic minorities in the Xinjiang region of Northwest China.</p>
<p>The CCTV cameras in China don’t just record real-time footage. Many are equipped with facial recognition to <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">keep tabs on</a> the movements of minorities. And some have reportedly been trialled to <a href="https://www.bbc.com/news/technology-57101248">detect emotions</a>.</p>
<p>The US also has a long history of using CCTV cameras to support racist policing practices. In 2021, Amnesty International <a href="https://www.amnesty.org/en/latest/news/2021/06/scale-new-york-police-facial-recognition-revealed/">reported</a> areas with a higher proportion of non-white residents had more CCTV cameras.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/after-roe-v-wade-heres-how-women-could-adopt-spycraft-to-avoid-tracking-and-prosecution-186046">After Roe v Wade, here's how women could adopt 'spycraft' to avoid tracking and prosecution</a>
</strong>
</em>
</p>
<hr>
<p>Another issue with CCTV is security. Many of these cameras are open-access, which means they don’t have password protection and can often be easily accessed online. So I could spend all day watching a livestream of someone’s porch, as long as there was an open camera nearby.</p>
<p>Surveillance artist Dries Depoorter’s recent project <a href="https://driesdepoorter.be/thefollower/">The Follower</a> aptly showcases the vulnerabilities of open cameras. By coupling open camera footage with AI and Instagram photos, Depoorter was able to match people’s photos with the footage of where and when they were taken. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1569285965323145216&quot;}"></div></p>
<p>There was pushback, with one of the <a href="https://www.inverse.com/input/culture/dries-depoorters-ai-surveillance-art-the-follower-instagram-influencers-photos">identified people saying</a>:</p>
<blockquote>
<p>It’s a crime to use the image of a person without permission. </p>
</blockquote>
<p>Whether or not it is illegal will depend on the specific circumstances and where you live. Either way, the issue here is that Depoorter was able to do this in the first place.</p>
<h2>IoT devices</h2>
<p>An IoT (“Internet of Things”) device is any device that connects to a wireless network to function – so think smart home devices such as an Amazon Echo or Google Nest speaker, a baby monitor, or even smart traffic lights.</p>
<p>It’s estimated global spending on IoT devices will <a href="https://acola.org/hs5-internet-of-things-australia/">reach</a> US$1.2 trillion this year. Around 18 billion connected devices make up the IoT network. Like unsecured CCTV cameras, IoT devices are easy to hack if they use default passwords or passwords that have <a href="https://haveibeenpwned.com/">been leaked</a>. </p>
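The leaked-password check linked above (Have I Been Pwned’s Pwned Passwords service) uses a k-anonymity scheme: only the first five characters of your password’s SHA-1 hash ever leave your machine. A minimal sketch of the client side (the function name is mine; the network call to HIBP’s documented range endpoint is omitted):

```python
import hashlib

def hibp_range_query(password):
    """Split a password's SHA-1 hash for the Have I Been Pwned
    k-anonymity API: only the 5-character prefix is sent to the server."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_query("password")
# You would then fetch https://api.pwnedpasswords.com/range/<prefix>
# and search the response for <suffix>; a hit means the password has
# appeared in known breaches.
print(prefix)
```

The server never sees the full hash, so it learns almost nothing about the password being checked.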
<p>In some examples, hackers have hijacked baby monitor cameras to <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">stalk</a> breastfeeding mums, <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">threaten</a> parents that their baby was being kidnapped, and say creepy things like “<a href="https://www.nbcnews.com/news/us-news/stranger-hacks-baby-monitor-tells-child-i-love-you-n1090046">I love you</a>” to children. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/xbk3OdYBLHA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Beyond hacking, businesses can also use data collected through IoT devices to further target customers with products and services. </p>
<p>Privacy experts raised the alarm in September over Amazon’s merger agreement with robot vacuum company iRobot. <a href="https://www.fightforthefuture.org/news/2022-09-09-letter-to-the-ftc-challenge-amazon-irobot-deal">A letter</a> to the US Federal Trade Commission signed by 26 civil rights and privacy advocacy groups said:</p>
<blockquote>
<p>Linking iRobot devices to the already intrusive Amazon home system incentivizes more data collection from more connected home devices, potentially including private details about our habits and our health that would endanger human rights and safety.</p>
</blockquote>
<p>IoT-collected data can also change hands with third parties through data partnerships, which are very common, often without customers’ explicit consent.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Smart speakers with digital assistants consistently raise data privacy concerns among experts.</span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-the-shady-world-of-the-data-industry-strips-away-our-freedoms-143823">How the shady world of the data industry strips away our freedoms</a>
</strong>
</em>
</p>
<hr>
<h2>Big tech and big data</h2>
<p>In 2017, the <a href="https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data">value of big data exceeded</a> that of oil. Private companies have driven the majority of that growth. </p>
<p>For tech platforms, the expansive collection of users’ personal information is business as usual, literally, because more data mean more precise analytics, more effective targeted ads <a href="https://www.facebook.com/business/help/716180208457684?id=1792465934137726">and more revenue</a>. </p>
<p>This logic of profit-making through targeted advertising has been <a href="https://journals.sagepub.com/doi/full/10.1177/1095796018819461">dubbed</a> “surveillance capitalism”. As <a href="https://quoteinvestigator.com/2017/07/16/product/">the old saying</a> goes, if you’re not paying for it, then you’re the product.</p>
<p>Meta (which owns both Facebook and Instagram) <a href="https://www.forbes.com/sites/bradadgate/2022/11/03/revenue-of-alphabet-and-meta-the-digital-duopoly-have-been-slipping/?sh=2ebf3dad2fed">generated</a> almost US$23 billion in advertising revenue in the third quarter of this year.</p>
<p>The vast machinery behind this is illustrated well in the 2020 documentary The Social Dilemma, even if in a dramatised way. It <a href="https://theconversation.com/netflixs-the-social-dilemma-highlights-the-problem-with-social-media-but-whats-the-solution-147351">showed us how</a> social media platforms rely on our psychological weaknesses to keep us online for as long as possible, measuring our actions down to the seconds we spend hovering over an ad. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=247&fit=crop&dpr=1 600w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=247&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=247&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=310&fit=crop&dpr=1 754w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=310&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=310&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A graphic excerpt from The Social Dilemma.</span>
</figcaption>
</figure>
<h2>Loyalty programs</h2>
<p>Although many people don’t realise it, loyalty programs are one of the biggest personal data collection gimmicks out there. </p>
<p>In a particularly intrusive example, in 2012 one <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=706b0cd96668">US retailer</a> sent a teenage girl a catalogue dotted with pictures of smiling infants and nursery furniture. The girl’s angered father went to confront managers at the local store, and learned that predictive analytics knew more about his daughter than he did. </p>
<p>It’s estimated 88% of Australian consumers <a href="https://www.oaic.gov.au/privacy/privacy-assessments/loyalty-program-assessment-woolworths-rewards-woolworths-limited">over age 16 are members</a> of a loyalty program. These schemes build your consumer profile to sell you more stuff. Some might even charge you <a href="https://www.abc.net.au/everyday/making-loyalty-cards-worth-your-time-and-money/10998806">sneaky fees</a>, luring you in with future perks while selling goods at steep prices. </p>
<p>As technology journalist <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">Ros Page notes</a>: </p>
<blockquote>
<p>[T]he data you hand over at the checkout can be shared and sold to businesses you’ve never dealt with.</p>
</blockquote>
<p>As a cheeky sidestep, you could find a buddy to swap your loyalty cards with. Predictive analytics is only strong when it can recognise behavioural patterns. When the patterns are disrupted, the data turn into noise. </p>
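The card-swapping point can be illustrated with a toy profile model. This is entirely invented for illustration — made-up categories and a trivial "predict the most frequent category" rule, not any retailer's actual analytics:

```python
from collections import Counter

def top_category(history):
    """A toy 'profile': predict a shopper's most likely next purchase
    category as the most frequent one in their card's history."""
    return Counter(history).most_common(1)[0][0]

# Two shoppers with distinct habits, each tracked via a loyalty card.
alice = ["baby"] * 6 + ["groceries"] * 4
bob   = ["tools"] * 6 + ["groceries"] * 4

print(top_category(alice))        # baby
print(top_category(bob))          # tools

# After swapping cards, both histories land on one profile and the
# signal that made each shopper individually predictable washes out.
print(top_category(alice + bob))  # groceries
```

The mixed history still contains real purchases, but the pattern that identified either shopper is gone — it has become noise.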
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/dont-be-phish-food-tips-to-avoid-sharing-your-personal-information-online-138613">Don't be phish food! Tips to avoid sharing your personal information online</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Ausma Bernot does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The vast majority of people alive today are subject to tracking through a number of overlapping and entrenched surveillance systems.Ausma Bernot, PhD Candidate, School of Criminology and Criminal Justice, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1744802022-02-03T14:25:26Z2022-02-03T14:25:26ZHow a neighbourhood watch WhatsApp group shaped fears in a Cape Town suburb<figure><img src="https://images.theconversation.com/files/443996/original/file-20220202-25-b3uyyu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Smith Collection/Gado/Getty Images</span></span></figcaption></figure><p>Trust in state institutions to protect citizens is a prerequisite for sharing social spaces. This trust is corrupted in South Africa, where there are <a href="https://www.degruyter.com/document/doi/10.7208/9780226425078/html">persistent anxieties</a> that crime and violence are out of control.</p>
<p>Despite considerable <a href="http://www.statssa.gov.za/?p=14484">public</a> spending on the police and security sector, the country has an enormous <a href="https://www.africanews.com/2021/06/08/south-africa-insecurity-sees-rapid-growth-of-private-security-sector/">private</a> security economy, as well as volunteer-based organisations for social protection like <a href="https://www.capetown.gov.za/Family%20and%20home/safety-in-the-home/community-policing/neighbourhood-watch">neighbourhood watch groups</a>.</p>
<p>Decisions over access to public spaces – who is welcome, valued and protected, whose lives matter – are, of course, a global question. And it’s <a href="https://www.ruhabenjamin.com/race-after-technology">become apparent</a> in recent years that social media, with its ubiquitous hashtags, opaque algorithms and content moderation practices, can equally entrench social divides. </p>
<p>My research on a neighbourhood watch group in <a href="https://theculturetrip.com/africa/south-africa/articles/the-top-things-to-see-and-do-in-observatory-cape-town/">Observatory</a>, a relatively affluent suburb of Cape Town, investigated what relationships of trust and distrust look like in this context.</p>
<p>The neighbourhood watch group had recently been revived. It had been established to decrease a feeling of vulnerability to crime through monthly strategy meetings and neighbourhood patrols.</p>
<p>It turned out, though, that patrols often took place in the form of ‘couch patrolling’ (making observations from the living room window and following social media communication). It also quickly became evident that commonplace technologies like <a href="https://www.whatsapp.com">WhatsApp</a>, integrated into surveillance routines, played a notable part in shaping encounters in the suburb. </p>
<p>The main question I posed in my study, which is now the subject of a <a href="https://www.langaa-rpcig.net/cultivating-suspicion-an-ethnography/">book</a>, <em>Cultivating Suspicion: An Ethnography</em>, was how suspicion transpires in the neighbourhood watch. I describe how desires to feel safe as a group, recountings of the same crime stories, and internalised fears become entangled in everyday surveillance practices.</p>
<p>Social fears are important to consider not just in South African cities, which were the scene of forced removals of the unwanted and supposedly dangerous during the segregationist <a href="https://www.sahistory.org.za/article/history-apartheid-south-africa">apartheid</a> regime, but also in other urban spaces around the globe where gaping social divisions leap to the eye.</p>
<h2>The study</h2>
<p>Nestled beneath the iconic Table Mountain, Observatory is a residential area close to the city centre and the freeway, with a busy strip of cafes, bars and restaurants. Here one meets longtime residents, national and international students of the nearby university, tourists, workers and others coming in and out of the suburb daily.</p>
<p>Affectionately shortened to ‘Obs’, Observatory is usually described as bohemian and somehow different. Its liberal image is also owed to the ‘grey area’ status it had during apartheid, with different race groups <a href="https://journals.sagepub.com/doi/pdf/10.1177/1474474009105052">mixing in public spaces</a> despite the country’s laws. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A view of a street with shops and quaint, old-fashioned houses, a mountain towering in the background." src="https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=440&fit=crop&dpr=1 600w, https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=440&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=440&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=553&fit=crop&dpr=1 754w, https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=553&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/444008/original/file-20220202-15-g6l06p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=553&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Observatory, Cape Town.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Despite private security in the area, a supplement to the police force and financed by property owners, the neighbourhood watch group was considered necessary for reducing crime. Already a resident, I announced my interest as a researcher and joined the group in the hopes of learning what motivates its members. As an anthropologist, I wanted to know more about the dynamics of this suburb and how crime anxieties manifest. I began with the question of what people were looking for on patrols – who was considered suspicious?</p>
<p>I spent a year researching the fear of crime in Observatory. Attending meetings, joining patrols, listening to stories, interviewing key persons such as the manager of the private security staff, and occasionally accompanying the police on their drives, I could observe how fear became part of everyday practice.</p>
<h2>The findings</h2>
<p>The membership form explained that foot patrols were to be the main purpose of the neighbourhood watch. But few members of the quickly growing group actually attended these. Most simply attended monthly meetings to voice their concerns.</p>
<p>For the most part, these members settled into ‘couch patrolling’. This involved watching the neighbourhood less formally, for instance from their house windows or on their way to the shop. They also followed the content of the active neighbourhood watch social media channels. In addition to WhatsApp groups (linked to police and private security), information was also shared via <a href="https://about.facebook.com/company-info/">Facebook</a> and email. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-cctv-surveillance-poses-a-threat-to-privacy-in-south-africa-97418">How CCTV surveillance poses a threat to privacy in South Africa</a>
</strong>
</em>
</p>
<hr>
<p>But what was the information shared in these groups? What were people looking for when patrolling physically or making sense of what they found on these social media channels? Who was the suspect that may be picked up by the police as a result of local surveillance practices?</p>
<p>My findings show that the neighbourhood watch group as a collective would spread suspicion in certain directions – not least of all through the use of social media apps. Images of ‘suspects’ that were circulated and recycled via different avenues (WhatsApp, Facebook, email and anecdotes) typically fell within the racial categories ‘Black’ and ‘Coloured’ and were marked by poverty.</p>
<p>Snapshots would capture people caught in the act of doing something that was judged as suspicious, although suspicious behaviour did not always amount to an actual crime. I would regularly wake up to rough descriptions of people on the WhatsApp groups, which included substitutes for racial terms such as Charlie (for Coloured), Bravo (for Black), and Whiskey (for White), with Charlie by far the most commonly used.</p>
<p>As some neighbourhood watch members criticised, these descriptions of ‘suspects’ were often not accompanied by an explanation as to why the people pictured were being flagged as ‘up to no good’. </p>
<p>The answer I received to my question of what people were looking for on patrol was usually a gut feeling. Lively intellectual debates in meetings and digital chats did not change the fact that the descriptions of suspects resurfacing across the different platforms were hardening into something concrete. </p>
<p>They coalesced into the image of a suspect who was commonly male, dark-skinned, marked by meagre resources and, understandably, avoiding exposure to the public eye.</p>
<h2>Conclusion</h2>
<p>Surveillance has rightfully become associated with innovative digital tools and is considered an issue of larger power structures. Yet, there are all kinds of less obvious and yet very problematic technologies at play that should not be overlooked – such as everyday surveillance using just our own bodies and cellphones.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A book cover showing the title and author - Cultivating Suspicion: An Ethnography by Leah Davina Junck - and an illustration of an aerial view of a suburban area, colourised." src="https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=827&fit=crop&dpr=1 600w, https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=827&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=827&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1039&fit=crop&dpr=1 754w, https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1039&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/444031/original/file-20220202-27-lz2fgj.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1039&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Langaa RPCIG</span></span>
</figcaption>
</figure>
<p>Strategies developed by the neighbourhood watch group in Observatory to feel more in charge of what they felt was an out-of-control crime situation also meant maintaining a firm distinction between oneself and ‘the other’. Consequently, monitoring the suburb had a deep impact on what kinds of relationships became possible and whose humanity (namely the residents’) was prioritised over the needs of others (the declared suspects).</p>
<p class="fine-print"><em><span>Leah Davina Junck does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The study showed couch patrolling was more common than foot patrols - with social media influencing fears and suspicions.Leah Davina Junck, Postdoctoral Research Fellow, University of Cape TownLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1554742021-02-23T19:08:59Z2021-02-23T19:08:59ZAI facial analysis is scientifically questionable. Should we be using it for border control?<figure><img src="https://images.theconversation.com/files/385731/original/file-20210223-18-1x0hqsp.jpg?ixlib=rb-1.1.0&rect=113%2C29%2C3880%2C2215&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Developments in global border control technologies are providing innovative ways to address issues relating to migration, asylum-seeking and the introduction of illegal goods into countries.</p>
<p>But while governments and national security can benefit from this, advanced surveillance technology creates risks for the misuse of personal data and the violation of human rights.</p>
<h2>Technology at the border</h2>
<p>One of US President Joe Biden’s first actions was to introduce a <a href="https://www.whitehouse.gov/briefing-room/statements-releases/2021/01/20/fact-sheet-president-biden-sends-immigration-bill-to-congress-as-part-of-his-commitment-to-modernize-our-immigration-system/">bill</a> that prioritises “smart border controls”, as part of a commitment to “restore humanity and American values to our immigration system”. </p>
<p>These controls will supplement existing resources at the border with Mexico. They will include technology and infrastructure developed to enhance the screening of incoming asylum seekers and prevent the arrival of narcotics.</p>
<p>According to Biden, “<a href="https://joebiden.com/immigration/">cameras, sensors, large-scale x-ray machines and fixed towers</a>” will all be used. This likely entails the use of infrared cameras, motion sensors, facial recognition, biometric data, aerial drones and radar.</p>
<p>Under the Trump administration, the Immigration and Customs Enforcement agency (ICE) partnered with <a href="https://www.bbc.com/news/business-54348456">controversial</a> data analytics firm <a href="https://www.palantir.com/">Palantir</a> to <a href="https://www.dhs.gov/sites/default/files/publications/ice-pia-033-falcon-tipline-2012.pdf">link tip-offs</a> from police and citizens with other databases, in a bid to arrest undocumented people.</p>
<p>Similarly, from 2016 to 2019, Hungary, Latvia and Greece piloted an automated lie-detection test funded by the European Union’s <a href="https://ec.europa.eu/programmes/horizon2020/en">research and innovation funding program</a>, Horizon 2020. </p>
<p>The <a href="https://www.iborderctrl.eu/The-project">iBorderCtrl</a> test analysed the facial micro-gestures of travellers crossing international borders at three undisclosed airports, with the aim of determining whether travellers were lying about the purpose of their trip. </p>
<p><a href="https://theintercept.com/2019/07/26/europe-border-control-ai-lie-detector/">Avatars</a> questioned travellers about themselves and their trip while webcams scanned face and eye movements. </p>
<p>Europe’s border and coastguard agency <a href="https://frontex.europa.eu">Frontex</a> has also been <a href="https://frontex.europa.eu/media-centre/news/news-release/frontex-helping-to-bring-innovation-to-future-border-control-VetIX5">investing in</a> border control technology for several years. Since last year, Frontex has <a href="https://www.statewatch.org/media/documents/analyses/no-354-frontex-drones.pdf">operated unmanned drones</a> to detect asylum-seekers attempting to enter various European states.</p>
<p>While Australia has been slower to implement enhanced surveillance at maritime borders, in 2018 the federal government announced it would <a href="https://www.sbs.com.au/news/australia-s-new-fleet-of-surveillance-drones-to-scan-for-people-smugglers-sea-threats">spend A$7 billion on six long-range unmanned drones</a> to monitor Australian waters. These aren’t expected to be operational until at least 2023.</p>
<p>Automated border control systems, however, have been used since 2007. <a href="https://www.abf.gov.au/entering-and-leaving-australia/smartgates">SmartGates</a> at many international airports use facial recognition to verify travellers’ identities against data stored in biometric passports. </p>
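SmartGate’s internals aren’t public, but 1:1 face verification systems of this kind generally compare a live capture’s embedding vector against the one stored on the passport chip, accepting a match above some similarity threshold. A sketch of that general idea only — toy four-dimensional vectors and an arbitrary threshold, not the actual SmartGate algorithm:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def verify(live_embedding, passport_embedding, threshold=0.6):
    """1:1 verification: does the live capture match the passport chip?
    The threshold trades false accepts against false rejects."""
    return cosine_similarity(live_embedding, passport_embedding) >= threshold

# Toy embeddings (real systems use hundreds of dimensions produced by
# a neural network; these numbers are made up for illustration).
print(verify([0.9, 0.1, 0.3, 0.2], [0.8, 0.2, 0.3, 0.1]))  # True
print(verify([0.9, 0.1, 0.3, 0.2], [0.1, 0.9, 0.2, 0.8]))  # False
```

Where the threshold sits determines how often the gate wrongly accepts an impostor versus wrongly rejects the passport holder — a trade-off every deployment of such systems must make.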
<p>Last year, the Department of Home Affairs implemented the <a href="https://www.govtechreview.com.au/content/gov-security/article/home-affairs-lights-up-new-biometric-system-1411277473">Enterprise Biometric Identification Services</a>. The system was reportedly rolled out to meet an expected surge in demand for visa applications and citizenship. </p>
<p>It combines <a href="https://www.unisys.com.au/">authentication</a> technology with <a href="https://www.idemia.com/">biometrics</a> to match the faces and fingerprints of people who wish to travel to Australia.</p>
<h2>Misuse of data</h2>
<p>Governments may promise, as the Biden administration does, that technology will only serve “legitimate agency purposes”. But data misuse by governments is well documented.</p>
<p>Between 2014 and 2017 in the US, Immigration and Customs Enforcement (ICE) used <a href="https://www.bbc.com/news/world-us-canada-48907026">facial recognition to mine state driver’s licence databases</a> to detect “illegal immigrants”. </p>
<p>Refugees in various countries, including Kenya and Ethiopia, <a href="https://www.unhcr.org/550c304c9.pdf">have had their biometric data collected</a> for years. </p>
<p>In 2017, Bangladeshi Industry Minister Amir Hossain Amu <a href="http://www.abc.net.au/news/2017-09-26/rights-of-rohingya-in-question-bangladesh-myanmar/8987158">said</a> the government was collecting biometric data from Rohingya people in the country to “keep record” of them and send them “back to their own place”.</p>
<p>Data misuse can also happen when questionable “science” is involved. For instance, emotion recognition algorithms used in unproven lie-detection tests are highly problematic. </p>
<p>The way people communicate varies widely across cultures and situations. Someone’s ability to answer a question at a border could be affected by trauma, their personality, the way the question is framed or the <a href="https://www.researchgate.net/publication/30961048_Questions_of_Credibility_Omissions_Discrepancies_and_Errors_of_Recall_in_the_Testimony_of_Asylum_Seekers">perceived intentions of the interviewer</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Girl makes different faces" src="https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=254&fit=crop&dpr=1 600w, https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=254&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=254&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=319&fit=crop&dpr=1 754w, https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=319&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/385734/original/file-20210223-24-182uhaj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=319&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The way different people express emotions is highly nuanced and contextual; it’s not something AI can be relied upon to gauge accurately.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Technologies such as iBorderCtrl undermine the rights of migrants, asylum-seekers and all international travellers. They could be used to refuse entry or detain travellers based on race or ethnicity.</p>
<p>Racial profiling at borders isn’t uncommon. It came to light again when New South Wales MP Mehreen Faruqi <a href="https://www.theguardian.com/australia-news/2016/jan/15/greens-politician-mehreen-faruqi-subject-to-racial-profiling-in-us-airport">experienced it</a> at a US airport in 2016. </p>
<p>The Pakistani-born Greens member told The Guardian she was detained at an airport for more than an hour, after immigration staff took her fingerprint, asked her where she was “originally from” and how she got an Australian passport. </p>
<p>Facial recognition technology has already been <a href="https://ssrn.com/abstract=3281765">found to be capable of bias against people of colour</a>. Enlisting this at airports and maritime borders — where human rights have <a href="https://www.thenewhumanitarian.org/opinion/2019/07/17/head-head-biometrics-and-aid">historically</a> <a href="https://www.theguardian.com/australia-news/2017/oct/30/australias-asylum-boat-turnbacks-are-illegal-and-risk-lives-un-told">been undermined</a> on the basis of race — could be disastrous.</p>
<h2>Fighting back</h2>
<p>The good news is that many people are now speaking out about how border control technologies can harm migrants, refugees and other travellers.</p>
<p>In February, the European Court of Justice heard <a href="http://curia.europa.eu/juris/liste.jsf?oqp=&for=&mat=or&lgrec=en&jge=&td=%3BALL&jur=C%2CT%2CF&num=T-158%252F19&page=1&dates=&pcs=Oor&lg=&pro=&nat=or&cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&language=en&avg=&cid=5821378">a case</a> brought by digital rights activist and German politician Patrick Breyer. </p>
<p>Breyer is seeking the release of documents on the ethical evaluation, legal admissibility, marketing and test results of iBorderCtrl. He is concerned the EU is being secretive about a “<a href="https://www.patrick-breyer.de/?p=589231&lang=en">scientifically highly controversial project</a>” funded by taxpayer money. </p>
<p>In Australia, <a href="https://digitalrightswatch.org.au/">Digital Rights Watch</a> is the main organisation that scrutinises surveillance practices. </p>
<p>Of particular <a href="https://digitalrightswatch.org.au/wp-content/uploads/2018/09/Submission-Assistance-and-Access-Bill-2018.pdf">concern</a> is the <a href="https://www.legislation.gov.au/Details/C2018A00148">Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018</a>. This gives the Australian Border Force extensive powers to search devices carried by people travelling internationally. </p>
<p>Last year, the <a href="https://www.inslm.gov.au/sites/default/files/2020-07/INSLM_Review_TOLA_related_matters.pdf">Independent National Security Legislation Monitor</a> recommended the legislation be amended so agencies can’t authorise the detention of travellers whose devices are searched by the border force. </p>
<p>However, without an Australian bill of rights, which would prevent laws that infringe privacy rights, the potential for data misuse will persist.</p>
<p class="fine-print"><em><span>Niamh Kinchin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Innovative border control technologies may be great for governments cracking down on migration — but they could further disadvantage groups that are already vulnerable.Niamh Kinchin, Senior Lecturer, School of Law, University of WollongongLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1430782020-07-28T12:17:31Z2020-07-28T12:17:31ZHow to hide from a drone – the subtle art of ‘ghosting’ in the age of surveillance<figure><img src="https://images.theconversation.com/files/349270/original/file-20200723-19-1selkhc.jpg?ixlib=rb-1.1.0&rect=0%2C204%2C4252%2C2346&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The federal government has used military-grade border patrol drones like this one to monitor protests in US cities.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/joncutrer/43252568250/">_ Jonathan Cutrer/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>Drones of all sizes are being used by environmental advocates to monitor deforestation, by conservationists to track poachers, and by journalists and activists to document large protests. As a <a href="https://scholar.google.com/citations?user=MEUtCZYAAAAJ&hl=en">political sociologist</a> who studies social movements and drones, I document a wide range of nonviolent and pro-social drone uses in my new book, “<a href="https://mitpress.mit.edu/books/good-drone">The Good Drone</a>.” I show that these efforts have the potential to democratize surveillance. </p>
<p>But when the Department of Homeland Security redirects large, fixed-wing drones from the U.S.-Mexico border to <a href="https://www.nytimes.com/2020/06/19/us/politics/george-floyd-protests-surveillance.html">monitor protests</a>, and when towns experiment with using drones to <a href="https://www.nbcnews.com/news/us-news/connecticut-town-tests-pandemic-drone-detect-fevers-experts-question-if-n1189546">test people for fevers</a>, it’s time to think about how many eyes are in the sky and how to avoid unwanted aerial surveillance. One way that’s within reach of nearly everyone is learning how to simply disappear from view.</p>
<h2>Crowded skies</h2>
<p>Over the past decade there’s been an explosion in the public’s use of drones – everyday people with everyday tech doing <a href="https://digital.sandiego.edu/gdl2016report/1/">interesting things</a>. As drones enter already-crowded airspace, the Federal Aviation Administration is <a href="https://doi.org/10.15394/ijaaa.2020.1453">struggling to respond</a>. The near future is likely to see even more of these devices in the sky, flown by an ever-growing cast of social, political and economic actors. </p>
<figure class="align-center ">
<img alt="small drone over a city street" src="https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349265/original/file-20200723-37-1iy93ky.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A law enforcement drone flew over demonstrators, Friday, June 5, 2020, in Atlanta.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/America-Protest-Atlanta/db14ae07df09454398c3fb94439453a4/16/0">AP Photo/Mike Stewart</a></span>
</figcaption>
</figure>
<p>Public opinion about the use and spread of drones is still <a href="https://theconversation.com/dont-shoot-that-drone-overhead-probably-isnt-invading-your-privacy-114701">up in the air</a>, but burgeoning drone use has sparked numerous efforts to curtail drones. These responses range from public policies exerting community control over local airspace, to the development of sophisticated jamming equipment and tactics for knocking drones out of the sky. </p>
<p>From startups to major defense contractors, there is a scramble to deny airspace to drones, to hijack drones digitally, to control drones physically and to shoot drones down. Anti-drone measures range from simple blunt force, <a href="https://www.popularmechanics.com/flight/drones/how-to/a16756/how-to-shoot-down-a-drone/">10-gauge shotguns</a>, to the poetic: <a href="https://www.washingtonpost.com/news/worldviews/wp/2016/02/01/trained-eagle-destroys-drone-in-dutch-police-video/">well-trained eagles</a>. </p>
<p>Many of these anti-drone measures are expensive and complicated. Some are illegal. The most affordable – and legal – way to avoid drone technology is <a href="http://www.dronesurvivalguide.org/">hiding</a>.</p>
<h2>How to disappear</h2>
<p>The first thing you can do to hide from a drone is to take advantage of the natural and built environment. It’s possible to wait for bad weather, since smaller devices like those used by local police have a hard time flying in high winds, dense fogs and heavy rains. </p>
<p>Trees, walls, alcoves and tunnels are more reliable than the weather, and they offer shelter from the high-flying drones used by the Department of Homeland Security.</p>
<figure class="align-center ">
<img alt="Silhouettes of drones" src="https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=418&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=418&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=418&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=525&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=525&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349245/original/file-20200723-33-16zsn27.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=525&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">In some parts of the world, hiding from drones is a matter of life and death.</span>
<span class="attribution"><a class="source" href="http://www.dronesurvivalguide.org/">Drone Survival Guide</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>The second thing you can do is minimize your digital footprint. It’s smart to avoid using wireless devices like mobile phones or GPS systems, since they have digital signatures that can reveal your location. This is useful for evading drones, but is also important for avoiding other privacy-invading technologies.</p>
<p>The third thing you can do is confuse a drone. Placing mirrors on the ground, standing over broken glass, and wearing elaborate headgear, <a href="https://www.theguardian.com/technology/2017/jan/04/anti-surveillance-clothing-facial-recognition-hyperface">machine-readable blankets</a> or <a href="https://projectkovr.com/">sensor-jamming jackets</a> can break up and distort the image a drone sees. </p>
<p>Mannequins and other forms of mimicry can confuse both on-board sensors and the analysts charged with monitoring the drone’s video and sensor feeds. </p>
<p>Drones equipped with infrared sensors will see right through the mannequin trick, but are confused by tactics that mask the body’s temperature. For example, a space blanket will mask significant amounts of the body’s heat, as will simply hiding in an area that matches the body’s temperature, like a building or sidewalk exhaust vent.</p>
<p>The fourth, and most practical, thing you can do to protect yourself from drone surveillance is to get a disguise. The growth of mass surveillance has led to an explosion in creative experiments meant to mask one’s identity. But some of the smartest ideas are decidedly old-school and low-tech. Clothing is the first choice, because hats, glasses, masks and scarves go a long way toward scrambling drone-based facial-recognition software. </p>
<figure class="align-center ">
<img alt="Facial makeup chart" src="https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=518&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=518&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=518&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=651&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=651&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349271/original/file-20200723-33-2y6x6e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=651&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Clever use of makeup can thwart facial recognition systems.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/johnbullas/4591293468/">John C Bullas BSc MSc PhD MCIHT MIAT/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<p>Your gait is as unique as your fingerprint. As gait-recognition software evolves, it will be important to also mask the key pivot points used in identifying the walker. It may be that the best response is affecting a limp, using a minor leg brace or wearing extremely loose clothing.</p>
<p>Artists and scientists have taken these approaches a step further, developing a <a href="https://weburbanist.com/2013/04/01/stealth-wear-counter-surveillance-fashion-protects-privacy/">hoodie wrap</a> that’s intended to shield the owner’s heat signature and to scramble facial recognition software, and <a href="https://www.chicagotribune.com/business/ct-biz-facial-recognition-blocking-glasses-privacy-20200417-isy77jwrsncoholhndmyifadr4-story.html">glasses</a> intended to foil facial recognition systems. </p>
<h2>Keep an umbrella handy</h2>
<p>These innovations are alluring, but umbrellas may prove to be the most ubiquitous and robust tactic in this list. They’re affordable, easy to carry, hard to see around and can be disposed of in a hurry. Plus you can build a <a href="http://survival.sentientcity.net/umbrella.html">high-tech one</a>, if you want.</p>
<p>It would be nice to live in a world with fewer impositions on privacy, one in which law enforcement did not use small quadcopters and the Department of Homeland Security did not redeploy large Predator drones to surveil protesters. And, for people in some parts of the world, it would be nice not to associate the sound of a drone with impending missile fire. But given that those eyes are in the sky, it’s good to know how to hide. </p>
<p>
<section class="inline-content">
<img src="https://images.theconversation.com/files/248895/original/file-20181204-133100-t34yqm.png?w=128&h=128">
<div>
<header>Austin Choi-Fitzpatrick is the author of:</header>
<p><a href="https://mitpress.mit.edu/books/good-drone">The Good Drone: How Social Movements Democratize Surveillance</a></p>
<footer>MIT Press provides funding as a member of The Conversation US.</footer>
</div>
</section>
</p>
<p class="fine-print"><em><span>Austin Choi-Fitzpatrick has previously won an industry award from drone manufacturer DJI, and his work has been supported through the National Science Foundation. MIT Press provides funding as a member of The Conversation US.</span></em></p>Avoiding drones’ prying eyes can be as complicated as donning a high-tech hoodie and as simple as ducking under a tree.Austin Choi-Fitzpatrick, Associate Professor of Political Sociology, University of San DiegoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1382962020-05-17T19:57:16Z2020-05-17T19:57:16ZThe trade-offs ‘smart city’ apps like COVIDSafe ask us to make go well beyond privacy<p>The Commonwealth government says if enough of us download its COVIDSafe app, restrictions on our movements and activities can be lifted more quickly and life can return to normal. As important as it is to contain the spread of coronavirus, no government decision about how to do that is beyond question. For those of us concerned about the social and political life of our increasingly “smart” cities, the thinking behind the COVIDSafe app and other “smart city” technology must be open to challenge.</p>
<p>The public focus has been on the app’s privacy implications, but other important issues warrant critical scrutiny too. Indeed, the app could help to entrench problematic forms of social and corporate power over our lives. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/darwins-smart-city-project-is-about-surveillance-and-control-127118">Darwin's 'smart city' project is about surveillance and control</a>
</strong>
</em>
</p>
<hr>
<h2>Social control</h2>
<p>As research on the <a href="https://doi.org/10.1177%2F0263775818812084">politics of smart technologies in our cities insists</a>, while personal privacy is important, it’s not the only issue here. Apps like this have implications for the forms of social control that operate in dense urban environments – where use of a digital technology is technically “voluntary”, but ends up being required if people want access to urban spaces and infrastructures. </p>
<p>Some protections are being promised in the case of the COVIDSafe app. These include a prohibition on employers, government authorities and others requiring any individual to install the app. The law still might not stop this in practice. Some business groups have <a href="http://www.theaustralian.com.au/nation/politics/coronavirus-employers-want-power-over-covidsafe-app/news-story/8d1cc1decd2df8a875fd48bf1bd4d949&usg=AOvVaw1CMMU-5bHCGy-_rOxOFAOj">lobbied government</a> to enable employers to require employees to use the app.</p>
<p>Even if this legal prohibition holds, Prime Minister Scott Morrison has been making thinly veiled threats about more people needing to download the app before he lifts restrictions. App uptake is being demanded in the name of a public interest (in this case, public health). </p>
<p>There’s also significant risk of mission creep here. What other “public interests” might be used to justify contact tracing based on this precedent? It’s easy to imagine government agencies and authorities desiring contact tracing in the service of a range of interests that could be discriminatory and oppressive – the policing of immigrants, welfare recipients and activists, for example. </p>
<p>We must guard against such surveillance creep.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-can-help-in-crime-prevention-but-we-still-need-a-human-in-charge-95516">AI can help in crime prevention, but we still need a human in charge</a>
</strong>
</em>
</p>
<hr>
<h2>Privacy protections</h2>
<p>Compared to other government and corporate apps, the COVIDSafe app now has relatively strong privacy protections. It keeps information about who you share space or associate with, but not where you go. It does this by storing encrypted data on the user’s phone about any other phones in range of a Bluetooth “handshake” that are also running the app.</p>
<p>Data will be automatically deleted after 21 days. Data will only be shared after a user has tested positive for COVID-19 and agreed to share the data. Only state health authorities may request and access data for contact tracing.</p>
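<p>Those two properties – proximity-only records and automatic expiry – can be sketched in a few lines. This is a generic illustration of the design, not the app’s actual code; the class and field names are invented.</p>

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=21)

class ContactLog:
    """On-device log of Bluetooth 'handshakes': records who was nearby
    and when, but never where."""

    def __init__(self):
        self.records = []  # list of (timestamp, encrypted_peer_id)

    def record_handshake(self, encrypted_peer_id, now=None):
        """Store an encrypted identifier broadcast by a nearby phone."""
        now = now or datetime.utcnow()
        self.records.append((now, encrypted_peer_id))

    def purge_expired(self, now=None):
        """Drop any record older than the retention window."""
        now = now or datetime.utcnow()
        self.records = [(t, p) for t, p in self.records
                        if now - t <= RETENTION]

log = ContactLog()
log.record_handshake("encrypted-id-123")
log.purge_expired()
```

<p>The key design choice is that location never enters the data structure: even if the log leaks, it reveals associations, not movements.</p>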
<p>The <a href="https://www.itnews.com.au/news/covidsafe-privacy-protections-now-locked-in-law-548119">legislated protections</a> represent a big advance on some other government apps. For instance, <a href="https://www.smh.com.au/technology/no-warrants-needed-to-access-opal-card-records-20140708-zt02j.html">over 100 government authorities have access</a> to the data the New South Wales government collects from its public transport Opal smartcard. </p>
<p>It may be that neither governments nor corporations can assume people will continue to uncritically accept “<a href="https://reallifemag.com/the-authoritarian-trade-off/">trade-offs</a>” of public goods like personal privacy and autonomy for the convenience and benefits of digital technology.</p>
<p>However, <a href="https://www.iispartners.com/blog">some important privacy issues</a> remain unresolved, including: </p>
<ul>
<li><p>the amount of data stored, which is <a href="https://www.health.gov.au/resources/publications/covidsafe-application-privacy-impact-assessment">about all devices in range</a>, not just those in range for more than 15 minutes</p></li>
<li><p>whether data stored on Amazon servers will <a href="https://www.theguardian.com/law/2020/may/14/questions-remain-over-whether-data-collected-by-covidsafe-app-could-be-accessed-by-us-law-enforcement">potentially be accessible to US law enforcement agencies</a></p></li>
<li><p>when and how the data and app will finally be deleted.</p></li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-covidsafe-bill-doesnt-go-far-enough-to-protect-our-privacy-heres-what-needs-to-change-137880">The COVIDSafe bill doesn't go far enough to protect our privacy. Here's what needs to change</a>
</strong>
</em>
</p>
<hr>
<h2>Questions of power and profit</h2>
<p>It’s also important to ask who benefits from the mass uptake of this app.</p>
<p>A government agency developed the app, drawing in part on an open-source app made available by the Singapore government. But even when an app is “free” and no one profits from its sale, remember that smartphones and data are not free. </p>
<p>Data storage has been contracted out to Amazon Web Services. It was the only company asked to tender for this lucrative government contract. That has <a href="https://www.innovationaus.com/sovereign-capability-and-that-shocking-aws-deal/">raised both security concerns and questions</a> about why locally owned, security-accredited providers were not invited.</p>
<p>Like so many instances of “smart” technology being offered as the solution to pressing problems, the profits of big tech and big telcos who sell us devices, connectivity and data storage are being presented as natural and aligned with public good. It is clear tech corporations see the coronavirus crisis as an <a href="https://theintercept.com/2020/05/08/andrew-cuomo-eric-schmidt-coronavirus-tech-shock-doctrine/">opportunity to consolidate and expand their profits and their power</a>. Every problem <a href="https://www.lawfareblog.com/location-surveillance-counter-covid-19-efficacy-what-matters">looks like a nail to the folks who have hammers to sell</a>.</p>
<h2>Will it work?</h2>
<p>Given these concerns, will the COVIDSafe app even perform as promised? Here, the jury is still out. </p>
<p>Much discussion has focused on the minimum number of app users required for its coverage to be effective. But the app has other limitations too. It <a href="https://theconversation.com/contact-tracing-apps-are-vital-tools-in-the-fight-against-coronavirus-but-who-decides-how-they-work-138206">doesn’t yet work properly on iPhones</a>, for a start.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/in-some-places-40-of-us-may-have-downloaded-covidsafe-heres-why-the-government-should-share-what-it-knows-138323">In some places 40% of us may have downloaded COVIDSafe. Here's why the government should share what it knows</a>
</strong>
</em>
</p>
<hr>
<p>Most importantly, the app treats Bluetooth handshakes as a proxy for spatial proximity of devices, spatial proximity as a proxy for contact between people, and prolonged contact as a proxy for viral transmission. Each step in this chain is <a href="http://progcity.maynoothuniversity.ie/wp-content/uploads/2020/04/Digital-tech-spread-of-coronavirus-Rob-Kitchin-PC-WP44.pdf">prone to significant failures and error</a>. </p>
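<p>The first link in that chain is already fragile. Bluetooth apps typically infer distance from received signal strength (RSSI), and a generic path-loss sketch – not the app’s actual model; the constants here are illustrative – shows how a few decibels of noise, from a wall, a pocket or a body, can roughly double the estimated distance.</p>

```python
def estimated_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model: estimate metres of separation from
    received signal strength. tx_power is the expected RSSI at 1 metre
    and n the path-loss exponent; both vary with phone model and
    environment, which is one source of error."""
    return 10 ** ((tx_power - rssi) / (10 * n))

# The same true separation can yield very different readings once
# obstacles attenuate the signal.
print(round(estimated_distance(-69), 2))  # 3.16 metres
print(round(estimated_distance(-75), 2))  # 6.31 metres: 6 dB of
                                          # attenuation roughly doubles it
```
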
<p>Fortunately, then, the government is not proposing to replace contact tracing performed by human health professionals. Data from the app will be used to support that process.</p>
<p>It’s vital we expand the scope of public discussion about this app and others in our increasingly “smart” cities and societies. Otherwise, we risk embracing “smart” solutions that create new surveillance infrastructures that further concentrate state and corporate power at the expense of our autonomy and alternative solutions to pressing societal problems.</p>
<p class="fine-print"><em><span>Kurt Iveson has received funding from the Australian Research Council, the Henry Halloran Trust, and the City of Sydney.</span></em></p>The COVIDSafe app hasn’t come out of nowhere. The promises of ‘smart city’ data collection may be seductive, but we must always weigh up what we’re being asked to give up in return.Kurt Iveson, Associate Professor of Urban Geography and Research Lead, Sydney Policy Lab, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1221372019-08-21T13:47:19Z2019-08-21T13:47:19ZFacial recognition: ten reasons you should be worried about the technology<figure><img src="https://images.theconversation.com/files/288912/original/file-20190821-170922-16dr2bh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/close-portrait-attractive-african-woman-facial-1160287945?src=j2YHKqXCPj1yiGwR-nCHbw-1-44">Karelnoppe/Shutterstock</a></span></figcaption></figure><p>Facial recognition technology is spreading fast. Already <a href="https://fortune.com/2018/10/28/in-china-facial-recognition-tech-is-watching-you/">widespread in China</a>, software that identifies people by comparing images of their faces against a database of records is now being adopted across much of the rest of the world. It’s common among <a href="https://www.nbcnews.com/news/us-news/how-facial-recognition-became-routine-policing-tool-america-n1004251">police forces</a> but has also been used at <a href="https://www.irishtimes.com/opinion/when-your-face-is-your-boarding-pass-you-are-holidaying-with-big-brother-1.3949353">airports</a>, <a href="https://www.bbc.com/news/technology-49357759">railway stations</a> and <a href="https://www.bbc.co.uk/news/uk-england-south-yorkshire-49369772">shopping centres</a>.</p>
<p>The rapid growth of this technology has triggered a much-needed debate. <a href="https://www.theguardian.com/technology/2019/may/21/office-worker-launches-uks-first-police-facial-recognition-legal-action">Activists</a>, <a href="https://www.theverge.com/2019/8/19/20812032/bernie-sanders-facial-recognition-police-ban-surveillance-reform">politicians</a>, <a href="https://www.essex.ac.uk/news/2019/07/03/met-police-live-facial-recognition-trial-concerns">academics</a> and even <a href="https://www.theguardian.com/world/2019/aug/17/police-halt-trials-face-recognition-systems-surveillance-technology">police forces</a> are expressing serious concerns over the impact facial recognition could have on a political culture based on rights and democracy. </p>
<h2>Human rights concerns</h2>
<p>As someone who researches the future of human rights, I share these concerns. Here are ten reasons why we should worry about the use of facial recognition technology in public spaces.</p>
<p><strong>1) It puts us on a path towards automated blanket surveillance</strong></p>
<p>CCTV is already widespread around the world, but for governments to use footage against you they have to find specific clips of you doing something they can claim as evidence. Facial recognition technology brings monitoring to new levels. It enables the automated and indiscriminate live surveillance of people as they go about their daily business, giving authorities the chance to track your every move.</p>
<p><strong>2) It operates without a clear legal or regulatory framework</strong></p>
<p>Most countries have no specific legislation that regulates the use of facial recognition technology, although <a href="https://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/news-parliament-2017/biometrics-strategy-report-publication-17-19">some lawmakers</a> are <a href="https://www.vox.com/recode/2019/8/9/20799022/facial-recognition-law">trying to change</a> this. This legal limbo opens the door to abuse, such as obtaining our images without our <a href="https://www.libertyhumanrights.org.uk/news/press-releases-and-statements/liberty-client-takes-police-ground-breaking-facial-recognition">knowledge or consent</a> and using them in ways we would not approve of.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/288915/original/file-20190821-170906-fc2pe5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Authorities don’t need to capture everyone’s image to ensure law and order.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/young-woman-picked-out-by-face-1437798524?src=6fixjEhTOV1qTepLhvHttA-1-11">Axel Buerckert/Shutterstock</a></span>
</figcaption>
</figure>
<p><strong>3) It violates the principles of necessity and proportionality</strong></p>
<p>A commonly stated human rights principle, recognised by organisations <a href="https://www.article19.org/resources/un-resolution-affirms-surveillance-that-is-not-necessary-or-proportionate-is-against-the-right-to-privacy/">from the UN</a> to the <a href="http://www.policingethicspanel.london/uploads/4/4/0/7/44076193/lfr_final_report_-_may_2019.pdf">London Policing Ethics Panel</a>, is that surveillance should be necessary and proportionate. This means surveillance should be restricted to the pursuit of serious crime instead of enabling unjustified interference with our liberty and fundamental rights. Facial recognition technology is at odds with these principles. It is a technology of control that is symptomatic of the state’s mistrust of its citizens.</p>
<p><strong>4) It violates our right to privacy</strong></p>
<p>The right to privacy matters, even in public spaces. It protects the expression of our identity without uncalled-for intrusion from the state or from private companies. Facial recognition technology’s indiscriminate and large-scale recording, storing and analysing of our images <a href="https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/08/statement-live-facial-recognition-technology-in-kings-cross">undermines this right</a> because it means we can no longer do anything in public without the state knowing about it.</p>
<p><strong>5) It has a chilling effect on our democratic political culture</strong></p>
<p>Blanket surveillance can deter individuals from attending public events. It can stifle participation in political protests and campaigns for change. And it can discourage nonconformist behaviour. This chilling effect is a serious infringement on the right to freedom of assembly, association, and expression.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/surveillance-cameras-will-soon-be-unrecognisable-time-for-an-urgent-public-conversation-118931">Surveillance cameras will soon be unrecognisable – time for an urgent public conversation</a>
</strong>
</em>
</p>
<hr>
<p><strong>6) It denies citizens the opportunity for consent</strong></p>
<p>There is a lack of detailed and specific information as to how facial recognition is actually used. This means that we are not given the opportunity <a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/federal-court-sounds-alarm-privacy-harms-face">to consent</a> to the recording, analysing and storing of our images in databases. By denying us the opportunity to <a href="https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/consent/why-is-consent-important/">consent</a>, these systems strip us of choice and control over the use of our own images.</p>
<p><strong>7) It is often inaccurate</strong></p>
<p>Facial recognition technology promises accurate identification. But <a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28">numerous studies</a> have highlighted how the algorithms trained on racially biased data sets misidentify people of colour, especially women of colour. Such <a href="https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/">algorithmic bias</a> is particularly worrying if it results in unlawful arrests, or if it leads public agencies and private companies to discriminate against women and people from minority ethnic backgrounds.</p>
<p><strong>8) It can lead to automation bias</strong></p>
<p>If the people using facial recognition software mistakenly believe that the technology is infallible, it can lead to <a href="https://theconversation.com/automation-can-leave-us-complacent-and-that-can-have-dangerous-consequences-62429">bad decisions</a>. This “<a href="https://thewire.in/tech/india-is-falling-down-the-facial-recognition-rabbit-hole">automation bias</a>” must be avoided. Machine-generated outcomes should not determine how state agencies or private corporations treat individuals. Trained human operators must exercise meaningful control and take decisions based in law.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/288919/original/file-20190821-170914-kl4m2i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Human operators can rely too much on machines.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/security-control-room-officer-monitors-multiple-771480619?src=-1-0">Gorodenkoff</a></span>
</figcaption>
</figure>
<p><strong>9) It implies there are secret government watchlists</strong></p>
<p>The databases that contain our facial images should ring alarm bells. They imply that private companies and law enforcement agencies are sharing our images to build watchlists of potential suspects without our knowledge or consent. This is a serious threat to our individual rights and civil liberties. The security of these databases, and their vulnerability to the actions of hackers, is also cause for concern.</p>
<p><strong>10) It can be used to target already vulnerable groups</strong></p>
<p>Facial recognition technology can be used for blanket surveillance. But it can also be deployed selectively, for example to identify migrants and refugees. The <a href="https://www.vice.com/en_us/article/7x59z9/the-facial-recognition-system-amazon-sells-to-cops-can-now-detect-fear">sale of facial recognition software</a> to agencies such as the controversial US Immigration and Customs Enforcement (ICE), which has been <a href="https://psmag.com/social-justice/abolish-ice">heavily criticised</a> for its tactics in dealing with migrants, should worry anyone who cares for human rights. And the use of handheld mobile devices with a <a href="https://www.bbc.co.uk/news/uk-wales-49261763">facial recognition app</a> by police forces raises the spectre of enhanced <a href="https://www.washingtonpost.com/business/economy/face-recognition-tech/2016/10/17/986929ea-41f0-44a2-b2b9-90b495230dce_story.html">racial profiling</a> at the street level.</p>
<h2>Debate sorely needed</h2>
<p>With so many concerns about facial recognition technology, we desperately need a more prominent conversation on its impact on our rights and civil liberties. Without proper regulation of these systems, we risk creating dystopian police states in what were once free, democratic countries.</p>
<p class="fine-print"><em><span>Birgit Schippers does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Surveillance software that identifies people from CCTV is eroding human rights and democracy.
Birgit Schippers, Visiting Research Fellow, Senator George J Mitchell Institute for Global Peace, Security and Justice, Queen's University Belfast
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/109698 2019-04-22T10:45:26Z
How artificial intelligence systems could threaten democracy
<p>U.S. technology giant <a href="https://www.ft.com/content/9378e7ee-5ae6-11e9-9dde-7aedca0a081a">Microsoft has teamed up with a Chinese military university</a> to develop <a href="https://www.irishtimes.com/business/technology/microsoft-worked-with-chinese-military-university-on-ai-1.3855553">artificial intelligence systems</a> that could potentially enhance government surveillance and censorship capabilities. Two <a href="https://www.ft.com/content/5f5916fc-5be3-11e9-939a-341f5ada9d40">U.S. senators publicly condemned</a> the partnership, but what the <a href="http://www.nudt.edu.cn/index_eng.htm">National Defense Technology University of China</a> wants from Microsoft isn’t the only concern.</p>
<p>As <a href="https://scholar.google.com/citations?user=OgVZmm4AAAAJ&hl=en">my research shows</a>, the advent of digital repression is profoundly affecting <a href="https://doi.org/10.1353/jod.2019.0003">the relationship between citizen and state</a>. New technologies are arming governments with unprecedented capabilities to monitor, track and surveil individual people. Even governments in democracies with strong traditions of <a href="https://theconversation.com/is-trumps-definition-of-the-rule-of-law-the-same-as-the-us-constitutions-77598">rule of law</a> find themselves tempted to abuse <a href="https://qz.com/813672/half-of-the-united-states-is-registered-in-police-facial-recognition-databases-and-its-completely-unregulated/">these new abilities</a>.</p>
<p>In states with <a href="https://www.foreignaffairs.com/articles/world/2018-07-10/how-artificial-intelligence-will-reshape-global-order">unaccountable institutions and frequent human rights abuses</a>, AI systems will most likely cause greater damage. China is a prominent example. Its leadership has enthusiastically embraced AI technologies, and has set up the world’s <a href="https://www.nytimes.com/interactive/2019/04/04/world/asia/xinjiang-china-surveillance-prison.html">most sophisticated</a> <a href="https://www.engadget.com/2018/02/22/china-xinjiang-surveillance-tech-spread/">surveillance state</a> in <a href="https://www.theguardian.com/world/2019/feb/18/chinese-surveillance-company-tracking-25m-xinjiang-residents">Xinjiang province</a>, tracking citizens’ daily movements and smartphone use.</p>
<p>Its exploitation of these technologies <a href="https://www.georgesoros.com/2019/01/24/remarks-delivered-at-the-world-economic-forum-2/">presents a chilling model</a> for fellow autocrats and poses a direct threat to open democratic societies. Although there’s no evidence that other governments have replicated this level of AI surveillance, Chinese companies are actively exporting the same underlying technologies across the world.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Surveillance in China’s Xinjiang province includes both extensive police patrols and surveillance cameras, like those on the building in the background.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/China-Tracking-Face/cbbeb8deda184d58a0a1f17fab7e2564/9/0">AP Photo/Ng Han Guan</a></span>
</figcaption>
</figure>
<h2>Increasing reliance on AI tools in the US</h2>
<p><a href="https://ai.stanford.edu/%7Enilsson/QAI/qai.pdf">Artificial intelligence systems</a> are everywhere in the modern world, helping run smartphones, internet search engines, digital voice assistants and Netflix movie queues. <a href="https://governanceai.github.io/US-Public-Opinion-Report-Jan-2019/">Many people fail to realize</a> how quickly AI is expanding, thanks to ever-increasing amounts of data to be analyzed, improving algorithms and advanced computer chips. </p>
<p>Any time more information becomes available and analysis gets easier, governments are interested – and not just authoritarian ones. In the U.S., for instance, the 1970s saw revelations that government agencies – such as the FBI, CIA and NSA – had set up <a href="https://www.intelligence.senate.gov/sites/default/files/94755_II.pdf">expansive domestic surveillance networks</a> to monitor and harass civil rights protesters, political activists and Native American groups. These issues haven’t gone away: Digital technology today has deepened the ability of even more agencies to conduct even more intrusive surveillance.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=498&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=498&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=498&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=625&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=625&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=625&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How fairly do algorithms predict where police should be most focused?</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Criminaliteits_Anticipatie_Systeem.png">Arnout de Vries</a></span>
</figcaption>
</figure>
<p>For example, U.S. police have eagerly embraced AI technologies. They have begun using software that is <a href="https://theconversation.com/why-big-data-analysis-of-police-activity-is-inherently-biased-72640">meant to predict where crimes will happen</a> to decide where to send officers on patrol. They’re also using <a href="https://www.nbcnews.com/news/us-news/facial-recognition-gives-police-powerful-new-tracking-tool-it-s-n894936">facial recognition</a> and <a href="https://www.washingtonpost.com/crime-law/2018/12/13/fbi-plans-rapid-dna-network-quick-database-checks-arrestees/">DNA analysis</a> in criminal investigations. But analyses of these systems show the <a href="https://theconversation.com/congress-takes-first-steps-toward-regulating-artificial-intelligence-104373">data on which those systems are trained</a> are often biased, leading to <a href="https://theconversation.com/did-artificial-intelligence-deny-you-credit-73259">unfair outcomes</a>, such as <a href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing">falsely determining that African Americans are more likely to commit crimes</a> than other groups.</p>
<h2>AI surveillance around the world</h2>
<p>In authoritarian countries, AI systems can directly abet domestic control and surveillance, helping <a href="https://www.power3point0.org/2018/01/25/hybrid-repression-online-and-offline-in-china-foretelling-the-human-rights-struggle-to-come/">internal security forces process massive amounts of information</a> – including social media posts, text messages, emails and phone calls – more quickly and efficiently. The police can identify social trends and <a href="https://www.apnews.com/bf75dd1c26c947b7826d270a16e2658a">specific people</a> who might threaten the regime based on the information uncovered by these systems. </p>
<p>For instance, the Chinese government has used AI in wide-scale crackdowns in regions that are home to ethnic minorities within China. Surveillance systems in Xinjiang and Tibet have been described as “<a href="https://foreignpolicy.com/2019/03/19/962492-orwell-china-socialcredit-surveillance/">Orwellian</a>.” These efforts have included <a href="https://www.nytimes.com/2019/02/21/business/china-xinjiang-uighur-dna-thermo-fisher.html">mandatory DNA samples</a>, Wi-Fi network monitoring and widespread facial recognition cameras, all connected to integrated data analysis platforms. With the aid of these systems, Chinese authorities have, according to the U.S. State Department, “arbitrarily detained” between <a href="https://www.state.gov/j/drl/rls/hrrpt/humanrightsreport/index.htm?year=2018&dlid=289037#wrapper">1 and 2 million people</a>.</p>
<p>My <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3374575">research looks at 90 countries around the world</a> with government types ranging from closed authoritarian to flawed democracies, including Thailand, Turkey, Bangladesh and Kenya. I have found that Chinese companies are <a href="https://carnegieendowment.org/2019/01/22/we-need-to-get-smart-about-how-governments-use-ai-pub-78179">exporting AI surveillance technology</a> to at least 54 of these countries. Frequently, this technology is packaged as part of China’s flagship <a href="https://eng.yidaiyilu.gov.cn/">Belt and Road Initiative</a>, which is funding an extensive network of roads, railways, energy pipelines and telecommunications networks <a href="https://www.knightfrank.com/blog/2018/01/30/an-insight-into-the-belt-and-road-initiative">serving 60% of the world’s population</a> and economies that generate 40% of global GDP.</p>
<p>For instance, Chinese companies like <a href="https://e.huawei.com/us/solutions/industries/smart-city">Huawei</a> and ZTE are constructing “smart cities” in <a href="https://www.dawn.com/news/1333101">Pakistan</a>, <a href="https://e.huawei.com/en/case-studies/global/2017/201704261658">the Philippines</a> and <a href="http://www.chinadaily.com.cn/world/2017-05/16/content_29372143.htm">Kenya</a>, featuring extensive built-in surveillance technology. For example, Huawei has outfitted <a href="https://bgc.com.ph/">Bonifacio Global City</a> in the Philippines with high-definition internet-connected cameras that provide “<a href="https://e.huawei.com/en/case-studies/global/2017/201704261658">24/7 intelligent security surveillance</a> with data analytics to detect crime and help manage traffic.”</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=394&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=394&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=394&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=495&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=495&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=495&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Bonifacio Global City in the Philippines has a lot of embedded surveillance equipment.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/File:Bonifacio_Global_City_2.jpg">alveo land/Wikimedia Commons</a></span>
</figcaption>
</figure>
<p><a href="https://foreignpolicy.com/2018/06/13/in-chinas-far-west-companies-cash-in-on-surveillance-program-that-targets-muslims/">Hikvision</a>, <a href="https://www.scmp.com/tech/social-gadgets/article/2142497/malaysian-police-wear-chinese-start-ups-ai-camera-identify">Yitu</a> and <a href="https://qz.com/1248493/sensetime-the-billion-dollar-alibaba-backed-ai-company-thats-quietly-watching-everyone-in-china/">SenseTime</a> are supplying state-of-the-art facial recognition cameras for use in places like <a href="https://www.albawaba.com/news/china%E2%80%99s-newest-global-export-policing-dissidents-1139230">Singapore</a> – which announced the establishment of a surveillance program with <a href="https://www.reuters.com/article/us-singapore-surveillance/singapore-to-test-facial-recognition-on-lampposts-stoking-privacy-fears-idUSKBN1HK0RV">110,000 cameras mounted on lamp posts</a> around the city-state. Zimbabwe is creating a <a href="https://foreignpolicy.com/2018/07/24/beijings-big-brother-tech-needs-african-faces/">national image database</a> that can be used for facial recognition.</p>
<p>However, selling advanced equipment for profit is different than sharing technology with an express geopolitical purpose. These new capabilities may plant the seeds for global surveillance: As governments become increasingly dependent upon Chinese technology to manage their populations and maintain power, they will face greater pressure to align with China’s agenda. But for now it appears that China’s primary motive is to dominate the market for new technologies and make lots of money in the process. </p>
<h2>AI and disinformation</h2>
<p>In addition to providing surveillance capabilities that are both sweeping and fine-grained, AI can help repressive governments manipulate available information and spread disinformation. These campaigns can be automated or automation-assisted, and deploy <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">hyper-personalized messages</a> directed at – or against – <a href="https://www.nytimes.com/2018/10/20/us/politics/saudi-image-campaign-twitter.html">specific people</a> or groups. </p>
<p>AI also underpins the technology commonly called “<a href="https://www.technologyreview.com/s/612501/inside-the-world-of-ai-that-forges-beautiful-art-and-terrifying-deepfakes/">deepfake</a>,” in which algorithms create <a href="https://theconversation.com/detecting-deepfake-videos-in-the-blink-of-an-eye-101072">realistic video and audio forgeries</a>. Muddying the waters between truth and fiction may become useful in a tight election, when one candidate could create fake videos showing an opponent doing and saying things that never actually happened.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cQ54GDm1eL0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">An early deepfake video shows some of the dangers of advanced technology.</span></figcaption>
</figure>
<p>In my view, policymakers in democracies should think carefully about the risks of AI systems to their own societies and to people living under authoritarian regimes around the world. A critical question is how many countries will adopt China’s model of digital surveillance. But it’s not just authoritarian countries feeling the pull. And it’s not just Chinese companies spreading the technology: many U.S. companies, not only Microsoft but also <a href="https://www.axios.com/china-us-technology-surveillance-state-5672b822-fdde-45f9-ac77-e7b5574e9351.html">IBM, Cisco and Thermo Fisher</a>, have provided sophisticated capabilities to nasty governments. The misuse of AI is not limited to autocratic states.</p>
<p class="fine-print"><em><span>Steven Feldstein is a non-resident fellow with the Carnegie Endowment for International Peace.</span></em></p>
Even governments in democracies with strong traditions of rule of law find themselves tempted to abuse these new abilities.
Steven Feldstein, Frank and Bethine Church Chair of Public Affairs & Associate Professor, School of Public Service, Boise State University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/86733 2018-02-13T17:21:11Z
Monitoring populations helps to put the right health services in place
<figure><img src="https://images.theconversation.com/files/199247/original/file-20171214-27583-i52vca.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Flickr/Isabel Sommerfeld</span></span></figcaption></figure><p>Fourteen years ago South African researchers <a href="https://www.sciencedirect.com/science/article/pii/S0140673608613999">first picked up</a> rising rates of high blood pressure in the population that led to people dying earlier than expected. </p>
<p>But it wasn’t in Johannesburg, the bustling metropolis at the centre of South Africa’s economy, that this cardio-metabolic disease epidemic was first found. The trends – that people were increasingly dying from stroke – were picked up in one of the country’s most rural sub-districts.</p>
<p>The <a href="https://www.sciencedirect.com/science/article/pii/S0140673608613999">findings</a> contributed to South Africa’s National Department of Health drawing up a policy to introduce “integrated” primary health care. And through this policy, chronic conditions such as high blood pressure can be tested and treated at the clinics set up primarily to provide antiretrovirals to HIV positive people.</p>
<p>The discovery was not coincidental. It emanated from work done in a health and demographic surveillance system <a href="https://www.sciencedirect.com/science/article/pii/S0140673608613999">set up in 1992</a> in Bushbuckridge, Mpumalanga. The site is run jointly by the South African Medical Research Council and Wits University’s Rural Public Health and Health Transitions Research Unit.</p>
<p>The project collects population and health and socio-economic data on communities in an impoverished and developmentally constrained part of the country over a long period of time.</p>
<p>Health and demographic surveillance systems like these help researchers understand how factors around health, social and economic wellbeing affect people and the societies that they live in.</p>
<p>These systems are an important part of advanced population registration systems. And nations with complete systems are the world’s most developed. A key reason for this is that they can determine if services are meeting the needs of the population. </p>
<p>The site in Bushbuckridge is one of three surveillance systems running in South Africa. The other two sites are in rural Limpopo: <a href="https://academic.oup.com/ije/article/44/5/1565/2594575">Dikgale</a> at the University of Limpopo, and the <a href="https://www.ahri.org/research/">Africa Health Research Institute</a> in rural KwaZulu-Natal. These sites collectively follow a population of about 300 000 people.</p>
<p>The data being collected is expected to provide <a href="https://bmcpublichealth.biomedcentral.com/articles/10.1186/1471-2458-12-741">deep evidence-based insights</a> into major health and socio-economic challenges facing the country which in turn will enable the government to design and evaluate targeted, evidence-informed policy solutions.</p>
<h2>Giving government a heads-up</h2>
<p>When surveillance systems work well, the information that is collected forms part of the national statistics platform of the country. It helps researchers understand detail and dynamics that they are unable to derive from a census. </p>
<p>This is because censuses only capture people at one point in time. Surveillance systems can provide detail on changing patterns and the processes driving those changes. Together, surveillance system data and census data give policymakers a sound basis to evaluate which policies are working and which are not.</p>
<p>Surveillance system data provide deep and <a href="https://bmcpublichealth.biomedcentral.com/articles/10.1186/1471-2458-12-741">granular insights</a> into the health and wellbeing of a community. They help governments understand the changing dynamics of a particular population and, in turn, what sort of interventions are needed. Here are some examples:</p>
<ul>
<li><p>Data will give a better idea of how and why people move between rural and urban areas and insights into what health and socio-economic services they are getting or being excluded from.</p></li>
<li><p>Tracking the number of pregnancies can provide valuable information about whether or not there are adequate maternal health and family planning services in place.</p></li>
<li><p>Looking at why people are dying is important to understanding if health services need to be adapted or preventative services strengthened.</p></li>
<li><p>Data on levels of education and socio-economic status help show how these factors affect people’s wellbeing.</p></li>
</ul>
<h2>Falling through the cracks</h2>
<p>Surveillance systems do have challenges. One is that the data come from specific geographic locations. Researchers can’t easily tell what happens beyond these boundaries.</p>
<p>This is why it’s important to have surveillance systems in both rural and urban settings, so that researchers can understand livelihoods and monitor the bi-directional migration flows linking poor rural communities with urban centres.</p>
<p>With investment from the Department of Science and Technology, data and data systems from the current three centres are being harmonised, and <a href="http://saprin.mrc.ac.za/">four more surveillance systems</a> are being set up. Three will be in urban settings in Gauteng, eThekwini and the Western Cape, and one in a rural setting in the Eastern Cape. This harmonised network is called the South African Population Research Infrastructure Network (SAPRIN), which is hosted by the <a href="http://saprin.mrc.ac.za/">Medical Research Council</a>.</p>
<p>The full SAPRIN platform will include 550,000 people – around 1% of South Africa’s census population. The platform will form <a href="http://saprin.mrc.ac.za/SAPRINfactSHEET.pdf">a network</a> that will be able to generate high-quality evidence to respond to some of South Africa’s biggest issues, which include poverty, inequality, unemployment, education and poor access to effective health care.</p>
<p>It will do this by linking to the public sector’s health system records, public school attendance registers and social grant statistics. This will help researchers understand how people are using the services the government has made available.</p>
<h2>The bigger picture</h2>
<p>Inadequate or even misleading evidence for planning is a complex problem in all countries, but especially in low- and middle-income countries. It arises from limitations in infrastructure, especially in poorer parts of a country, and from the costs people face in registering key events in their lives. </p>
<p>South Africa is not the only country in the developing world to have surveillance systems like this. The three surveillance sites in South Africa are part of a <a href="http://www.indepth-network.org/">network of 37 health and demographic surveillance system sites</a> in sub-Saharan Africa, comprising the <a href="http://www.indepth-network.org/">INDEPTH Network</a>.</p>
<p>A combination of national census, vital registration and localised health and demographic surveillance data can be expected to fill the evidence gap in developing countries. </p>
<p>This will enable planners to have immediate and longer-run feedback on the impact of policies and programmes designed to improve health care and socio-economic status. </p>
<p>For this reason, we can expect to see more investment in surveillance over time and a bigger push to combine datasets to understand what is going on and what is needed.</p>
<p class="fine-print"><em><span>Mark A. Collinson receives funding from the South African Department of Science and Technology and the National Institutes of Health in the US. </span></em></p><p class="fine-print"><em><span>Kobus Herbst receives funding from the South African Department of Science and Technology and the Wellcome Trust.</span></em></p><p class="fine-print"><em><span>Mark A. Collinson, Reader in Population and Public Health, MRC/Wits Rural Public Health and Health Transitions Research Unit, School of Public Health, University of the Witwatersrand; Kobus Herbst, Chief Information Officer at the Africa Health Research Institute, University of KwaZulu-Natal. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>