Data surveillance – The Conversation

The secret sauce of Coles’ and Woolworths’ profits: high-tech surveillance and control (2024-02-22)

<p>Coles and Woolworths, the supermarket chains that together control <a href="https://www.abc.net.au/news/2024-02-20/woolworths-coles-supermarket-tactics-grocery-four-corners/103405054">almost two-thirds</a> of the Australian grocery market, are facing unprecedented scrutiny. </p>
<p>One recent inquiry, commissioned by the Australian Council of Trade Unions and led by former Australian Competition and Consumer Commission (ACCC) chair Allan Fels, found the pair engaged in unfair pricing practices; an ongoing <a href="https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Supermarket_Prices/SupermarketPrices">Senate inquiry into food prices</a> is looking at how these practices are linked to inflation; and the ACCC has just begun <a href="https://www.accc.gov.au/inquiries-and-consultations/supermarkets-inquiry-2024-25">a government-directed inquiry</a> into potentially anti-competitive behaviour in Australia’s supermarkets. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/8-ways-woolworths-and-coles-squeeze-their-suppliers-and-their-customers-223857">8 ways Woolworths and Coles squeeze their suppliers and their customers</a>
</strong>
</em>
</p>
<hr>
<p>Earlier this week, the two companies also came under the gaze of the <a href="https://www.abc.net.au/news/2024-02-19/super-power-the-cost-of-living-with-coles-and-woolworths/103486508">ABC current affairs program Four Corners</a>. Both chief executives gave somewhat prickly interviews, and Woolworths chief Brad Banducci <a href="https://www.abc.net.au/news/2024-02-21/woolworths-ceo-brad-banducci-retirement-four-corners/103493418">announced his retirement</a> two days after the program aired.</p>
<p>A focus on the power of the supermarket duopoly is long overdue. However, one aspect of how Coles and Woolworths exercise their power has received relatively little attention: a growing high-tech infrastructure of surveillance and control that pervades retail stores, warehouses, delivery systems and beyond.</p>
<h2>Every customer a potential thief</h2>
<p>As the largest private-sector employers and providers of essential household goods, the supermarkets play an outsized role in public life. Indeed, they are such familiar places that technological developments there may fly under the radar of public attention.</p>
<p>Coles and Woolworths are both implementing technologies that treat the supermarket as a “problem space” in which workers are controlled, customers are tracked and profits boosted.</p>
<p>For example, in response to a purported spike in shoplifting, a raft of customer surveillance measures has been introduced that treats every customer as a potential thief. These include <a href="https://www.news.com.au/lifestyle/food/eat/coles-introducing-new-technology-which-will-track-shoppers-every-move/news-story/86ea8d330f76df87f2235eeda4d1136e">ceiling cameras</a> that assign a digital ID to individuals and track them through the store, and <a href="https://www.thenewdaily.com.au/finance/consumer/2023/08/16/smart-gate-technology">“smart” exit gates</a> that remain closed until a purchase is made. Some customers have reported being “<a href="https://7news.com.au/lifestyle/coles-supermarketshoppers-dramatic-checkout-experience-goes-viral-i-was-trapped-c-12977760">trapped</a>” by the gate despite paying for their items, causing significant embarrassment.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A grainy security camera image from above a self-checkout area showing areas outlined in yellow." src="https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=360&fit=crop&dpr=1 600w, https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=360&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=360&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=452&fit=crop&dpr=1 754w, https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=452&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/577235/original/file-20240222-22-8d21o0.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=452&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Woolworths surveillance cameras monitor the self-checkout area.</span>
<span class="attribution"><span class="source">Woolworths</span></span>
</figcaption>
</figure>
<p>At least one Woolworths store has <a href="https://www.news.com.au/finance/business/woolies-in-wetherill-park-fitted-with-500-tiny-cameras-to-monitor-stock-levels/news-story/585de8c741ae9f520adcc4005f2a736a">installed 500 mini cameras</a> on product shelves. The cameras monitor real-time stock levels, and Woolworths says customers captured in photos will be silhouetted for privacy.</p>
<p>A Woolworths spokesperson <a href="https://www.smh.com.au/national/nsw/up-to-70-cameras-watch-you-buy-groceries-what-happens-to-that-footage-20230819-p5dxtp.html">explained</a> the shelf cameras were part of “a number of initiatives, both covert and overt, to minimise instances of retail crime”. It is unclear whether the cameras are for inventory management, surveillance, or both.</p>
<p>Workers themselves are being fitted with body-worn cameras and wearable alarms. Such measures may protect against customer aggression, which is a <a href="https://www.abc.net.au/news/2023-11-22/retail-union-staff-abuse-cost-of-living-christmas/103117014">serious problem facing workers</a>. Biometric data collected this way could also be used to discipline staff in what scholars Karen Levy and Solon Barocas refer to as “<a href="https://ijoc.org/index.php/ijoc/article/view/7041">refractive surveillance</a>” – a process whereby surveillance measures intended for one group can also impact another.</p>
<h2>Predicting crime</h2>
<p>Even as the supermarkets ramp up the data they collect on staff and shoppers, they are also investing in data-driven “crime intelligence” software. Both supermarkets have <a href="https://www.smartcompany.com.au/industries/information-technology/grocery-chains-surveillance-tech-auror/">partnered with New Zealand start-up Auror</a>, which shares a name with the magic police from the Harry Potter books and claims it can <a href="https://www.auror.co/retail-crime-intelligence#What-is-Retail-Crime-Intelligence">predict crime before it happens</a>.</p>
<p>Coles also recently began a partnership with Palantir, a global data-driven surveillance company that takes its name from magical crystal balls in The Lord of the Rings.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/solving-the-supermarket-why-coles-just-hired-us-defence-contractor-palantir-222883">Solving the supermarket: why Coles just hired US defence contractor Palantir</a>
</strong>
</em>
</p>
<hr>
<p>These heavy-handed measures seek to make self-service checkouts more secure without increasing staff numbers. This leads to something of a vicious cycle, as under-staffing, self-checkouts, and high prices are often <a href="https://www.aap.com.au/news/retail-workers-facing-increased-violence-and-abuse/">causes of customer aggression</a> to begin with. </p>
<p>Many staff are similarly frustrated by <a href="https://www.theguardian.com/business/2023/jun/05/coles-woolworths-court-accused-of-underpaying-workers">historical wage theft by the supermarkets</a> that totals hundreds of millions of dollars. </p>
<h2>From community employment to gig work</h2>
<p>Both supermarkets have brought the gig economy squarely <a href="https://theconversation.com/coles-uber-eats-deal-brings-the-gig-economy-inside-the-traditional-workplace-204353">inside the traditional workplace</a>. Uber and DoorDash drivers are now part of the infrastructure of home delivery, as the supermarkets push last-mile delivery costs onto gig workers. </p>
<p>The precarious working conditions of the gig economy are well known. Customers may not be aware, however, that Coles recently increased Uber Eats and DoorDash prices by at least 10%, and will <a href="https://7news.com.au/lifestyle/shoppers-slam-coles-over-major-change-to-half-price-buys-that-will-affect-millions-c-12860556">no longer match in-store promotions</a>. Drivers have been instructed to dispose of the shopping receipt rather than place it in the customer’s bag at drop-off. </p>
<p>In addition to higher prices, customers also pay service and delivery fees for the convenience of on-demand delivery. Despite the price increases to customers, drivers I have interviewed in my ongoing research report they are earning less and less through the apps, often well below Australia’s minimum wage.</p>
<p>Viewed as a whole, Coles’ and Woolworths’ high-tech measures paint a picture of surveillance and control that exerts pressures on both customers and workers. While issues of market competition, price gouging, and power asymmetries with suppliers must be scrutinised, issues of worker and customer surveillance are the other side of the same coin – and they too must be reckoned with.</p>
<p class="fine-print"><em><span>Lauren Kate Kelly receives funding from the Australian Research Council (ARC) and the ARC Centre of Excellence for Automated Decision-Making and Society. She works with United Workers Union which has members across the supermarket supply chain.</span></em></p>

The hidden side of the supermarket giants’ quest for profits is an increasingly elaborate system for monitoring and managing shoppers and workers.

Lauren Kate Kelly, PhD Candidate, ARC Centre of Excellence for Automated Decision-Making and Society, RMIT University. Licensed as Creative Commons – attribution, no derivatives.

How to protect your data privacy: A digital media expert provides steps you can take and explains why you can’t go it alone (2024-01-25)

<figure><img src="https://images.theconversation.com/files/570983/original/file-20240123-19-zcfnx.jpg?ixlib=rb-1.1.0&rect=2323%2C526%2C3473%2C4538&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">You probably know you're being tracked online, but what can you do about it?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/large-eyeball-on-smart-phone-watching-woman-royalty-free-illustration/1440469231">Malte Mueller/fStop via Getty Images</a></span></figcaption></figure><p>Perfect safety is no more possible online than it is when driving on a crowded road with strangers or walking alone through a city at night. Like roads and cities, the internet’s dangers arise from choices society has made. Enjoying the freedom of cars comes with the risk of accidents; having the pleasures of a city full of unexpected encounters means some of those encounters can harm you. To have an open internet means people can always find ways to hurt each other. </p>
<p>But some highways and cities are safer than others. Together, people can make their online lives safer, too.</p>
<p>I’m a <a href="https://scholar.google.com/citations?hl=en&user=a3nrKn8AAAAJ&view_op=list_works&sortby=pubdate">media scholar</a> who researches the online world. For decades now, I have experimented on myself and my devices to explore what it might take to live a digital life on my own terms. But in the process, I’ve learned that my privacy cannot come from just my choices and my devices.</p>
<p>This is a guide for getting started, with the people around you, on the way toward a safer and healthier online life.</p>
<h2>The threats</h2>
<p>The dangers you face online take very different forms, and they require different kinds of responses. The kind of threat you hear about most in the news is the straightforwardly criminal sort: hackers and scammers. The perpetrators typically want to steal victims’ identities or money, or both. These attacks take advantage of <a href="https://academic.oup.com/ips/article/16/3/olac013/6649355">varying legal and cultural norms</a> around the world. Businesses and governments often offer to defend people from these kinds of threats, without mentioning that they can pose threats of their own.</p>
<p>A second kind of threat comes from businesses that lurk in the cracks of the online economy. Lax protections allow them to scoop up vast quantities of data about people and sell it to abusive advertisers, police forces and others willing to pay. Private data brokers <a href="https://theconversation.com/data-brokers-know-everything-about-you-what-ftc-case-against-ad-tech-giant-kochava-reveals-218232">most people have never heard of</a> gather data from apps, transactions and more, and they sell what they learn about you without needing your approval.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Hhyzx-rRBps?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">How the data economy works.</span></figcaption>
</figure>
<p>A third kind of threat comes from established institutions themselves, such as the large tech companies and government agencies. These institutions <a href="https://doi.org/10.1080/21624887.2023.2239002">promise a kind of safety</a> if people trust them – protection from everyone but themselves, as they liberally collect your data. Google, for instance, provides tools with high security standards, but its business model is built on <a href="https://www.cnbc.com/2021/05/18/how-does-google-make-money-advertising-business-breakdown-.html">selling ads</a> based on what people do with those tools. Many people feel they have to accept this deal, because everyone around them already has.</p>
<p>The stakes are high. <a href="https://www.genderit.org/editorial/feminist-conversation-cybersecurity">Feminist</a> and <a href="https://www.criticalracedigitalstudies.com/">critical race</a> scholars have demonstrated that surveillance has long been the basis of unjust discrimination and exclusion. As African American studies scholar <a href="https://scholar.google.com/citations?hl=en&user=XlsH9jEAAAAJ&view_op=list_works&sortby=pubdate">Ruha Benjamin</a> puts it, online surveillance has become a “<a href="https://cyber.harvard.edu/events/new-jim-code">new Jim Code</a>,” excluding people from jobs, fair pricing and other opportunities based on how computers are trained to watch and categorize them.</p>
<p>Once again, there is no formula for safety. When you make choices about your technology, individually or collectively, you are really making choices about whom and how you trust – shifting your trust from one place to another. But those choices can make a real difference.</p>
<h2>Phase 1: Basic data privacy hygiene</h2>
<p>To get started with digital privacy, there are a few things you can do fairly easily on your own. First, use a password manager like <a href="https://bitwarden.com/">Bitwarden</a> or <a href="https://proton.me/pass">Proton Pass</a>, and make all your passwords unique and complex. If you can remember a password easily, it’s probably not keeping you safe. Also, enable two-factor authentication, which typically involves receiving a code in a text message, wherever you can.</p>
<p>As you browse the web, use a browser like <a href="https://firefox.com/">Firefox</a> or <a href="https://brave.com/">Brave</a> with a strong commitment to privacy, and add to that a good ad blocker like <a href="https://ublockorigin.com/">uBlock Origin</a>. Get in the habit of using a search engine like <a href="https://duckduckgo.com/">DuckDuckGo</a> or <a href="https://search.brave.com/">Brave Search</a> that doesn’t profile you based on your past queries.</p>
<p>On your phone, download only the apps you need. It can help to <a href="https://www.popsci.com/story/diy/how-to-reset-devices/">wipe and reset</a> everything periodically to make sure you keep only what you really use. Beware especially of apps that track your location and access your files. For Android users, <a href="https://f-droid.org/">F-Droid</a> is an alternative app store with more privacy-preserving tools. The Consumer Reports app <a href="https://www.permissionslipcr.com/">Permission Slip</a> can help you manage how other apps use your data.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/a1i-3xwcSGA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Here are more details on how to reduce your exposure to data collection online.</span></figcaption>
</figure>
<h2>Phase 2: Shifting away</h2>
<p>Next, you can start shifting your trust away from companies that make their money from surveillance. But this works best if you can get your community involved; if they are using Gmail, and you email them, Google gets your email whether you use Gmail yourself or not. Try an email provider like <a href="https://proton.me/mail">Proton Mail</a> that doesn’t rely on targeted ads, and see if your friends will try it, too. For mobile chat, <a href="https://www.signal.org/">Signal</a> makes encrypted messages easy, but only if others are using it with you.</p>
<p>You can also try using privacy-preserving operating systems for your devices. <a href="https://grapheneos.org/">GrapheneOS</a> and <a href="https://e.foundation/e-os/">/e/OS</a> are versions of Android that avoid sending your phone’s data to Google. For your computer, <a href="https://pop.system76.com/">Pop!_OS</a> is a friendly version of Linux. Find more ideas for shifting away at science and technology scholar Janet Vertesi’s <a href="https://www.optoutproject.net/">Opt-Out Project</a> website.</p>
<h2>Phase 3: New foundations</h2>
<p>If you are ready to go even further, rethink how your community or workplace collaborates. In my university lab, we <a href="https://manifold.umn.edu/projects/the-lab-book">run our own servers</a> to manage our tools, including <a href="https://nextcloud.com/">Nextcloud</a> for file sharing and <a href="http://matrix.org/">Matrix</a> for chat. </p>
<p>This kind of shift, however, requires a collective commitment in how organizations spend money on technology, away from big companies and toward investing in the ability to manage your tools. It can take extra work to build what I call “<a href="https://doi.org/10.31269/triplec.v20i1.1281">governable stacks</a>” – tools that people manage and control together – but the result can be a more satisfying, empowering relationship with technology.</p>
<h2>Protecting each other</h2>
<p>Too often, people are told that being safe online is a job for individuals, and it is your fault if you’re not doing it right. But I think this is a kind of victim blaming. In my view, the biggest source of danger online is the lack of public policy and collective power to prevent surveillance from being the basic business model for the internet.</p>
<p>For years, people have organized “<a href="https://doi.org/10.14763/2020.4.1508">cryptoparties</a>” where they can come together and learn how to use privacy tools. You can also support organizations like the <a href="https://eff.org/">Electronic Frontier Foundation</a> that advocate for privacy-protecting public policy. If people assume that privacy is just an individual responsibility, we have already lost.</p>
<p class="fine-print"><em><span>Nathan Schneider receives funding from a range of entities, including the National Science Foundation, the Henry Luce Foundation, and the Ethereum Foundation. He serves on several nonprofit boards, including those of Metagov, Start.coop, Waging Nonviolence, and Zebras Unite.</span></em></p>

Your data privacy is under threat from hackers, data brokers and big tech. Here’s what you can do about it. Step 1 is to get your colleagues, friends and family on board.

Nathan Schneider, Assistant Professor of Media Studies, University of Colorado Boulder. Licensed as Creative Commons – attribution, no derivatives.

US agencies buy vast quantities of personal information on the open market – a legal scholar explains why and what it means for privacy in the age of AI (2023-06-29)

<figure><img src="https://images.theconversation.com/files/534425/original/file-20230627-39049-o6zul0.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C3840%2C2160&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Government agencies can track you, thanks to the vast amounts of personal information available for sale.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/society-and-communication-network-concept-royalty-free-image/1326558181">metamorworks/iStock via Getty Images</a></span></figcaption></figure><p>Numerous government agencies, including the FBI, Department of Defense, National Security Agency, Treasury Department, Defense Intelligence Agency, Navy and Coast Guard, have purchased vast amounts of U.S. citizens’ personal information from commercial data brokers. 
The revelation was published in a partially declassified, internal <a href="https://www.dni.gov/index.php/newsroom/press-releases/press-releases-2023/item/2390-dni-haines-statement-on-declassified-report-on-commercially-available-information">Office of the Director of National Intelligence report</a> released on June 9, 2023.</p>
<p>The report shows the breathtaking scale and invasive nature of the consumer data market and how that market directly enables wholesale surveillance of people. The data includes not only where you’ve been and who you’re connected to, but the nature of your beliefs and predictions about what you might do in the future. The report underscores the grave risks the purchase of this data poses, and urges the intelligence community to adopt internal guidelines to address these problems.</p>
<p>As a privacy, electronic surveillance and technology law <a href="https://www.annetoomeymckenna.com/">attorney, researcher and law professor</a>, I have spent years researching, <a href="https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=2643050">writing</a> and advising about the legal issues the report highlights. </p>
<p>These issues are increasingly urgent. Today’s commercially available information, coupled with the now-ubiquitous decision-making artificial intelligence and generative AI like ChatGPT, significantly increases the threat to privacy and civil liberties by giving the government access to sensitive personal information beyond even what it could collect through court-authorized surveillance.</p>
<h2>What is commercially available information?</h2>
<p>The drafters of the report take the position that commercially available information is a subset of publicly available information. The distinction between the two is significant from a legal perspective. Publicly available information is information that is already in the public domain. You could find it by doing a little online searching. </p>
<p>Commercially available information is different. It is personal information collected from a dizzying array of sources by commercial data brokers that aggregate and analyze it, then make it available for purchase by others, including governments. Some of that information is private, confidential or otherwise legally protected.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A chart with four columns and three rows" src="https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=288&fit=crop&dpr=1 600w, https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=288&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=288&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=362&fit=crop&dpr=1 754w, https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=362&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/534454/original/file-20230627-16-adjj2j.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=362&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The commercial data market collects and packages vast amounts of data and sells it for various commercial, private and government uses.</span>
<span class="attribution"><span class="source">Government Accountability Office</span></span>
</figcaption>
</figure>
<p>The sources and types of data for commercially available information are mind-bogglingly vast. They include public records and other publicly available information. But far more information comes from the nearly ubiquitous internet-connected devices in people’s lives, like cellphones, <a href="https://spectrum.ieee.org/smart-home-devices-can-reveal-the-health-status-of-individuals">smart home systems</a>, cars and fitness trackers. These all harness data from sophisticated, embedded <a href="https://theconversation.com/ftc-lawsuit-spotlights-a-major-privacy-risk-from-call-records-to-sensors-your-phone-reveals-more-about-you-than-you-think-189618">sensors, cameras and microphones</a>. Sources also include data from apps, online activity, texts and emails, and even <a href="https://www.hipaajournal.com/meta-facing-scrutiny-over-use-of-meta-pixel-tracking-code-on-hospital-websites/">health care provider websites</a>. </p>
<p>Types of <a href="https://www.dni.gov/files/ODNI/documents/assessments/ODNI-Declassified-Report-on-CAI-January2022.pdf">data include</a> location, gender and sexual orientation, religious and political views and affiliations, <a href="https://www.ftc.gov/business-guidance/blog/2022/07/location-health-and-other-sensitive-information-ftc-committed-fully-enforcing-law-against-illegal">weight and blood pressure, speech patterns, emotional states, behavioral information about myriad activities, shopping patterns</a> and family and friends. </p>
<p>This data provides companies and governments a window into the “<a href="https://doi.org/10.1109/ISCON52037.2021.9702450">Internet of Behaviors</a>,” a combination of data collection and analysis aimed at understanding and predicting people’s behavior. It pulls together a wide range of data, including location and activities, and uses scientific and technological approaches, including psychology and machine learning, to analyze that data. The Internet of Behaviors provides a map of what each person has done, is doing and is expected to do, and provides a <a href="https://doi.org/10.1016/j.sintl.2021.100122">means to influence a person’s behavior</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/pFg3_bW78Ms?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Smart homes could be good for your wallet and good for the environment, but really bad for your privacy.</span></figcaption>
</figure>
<h2>Better, cheaper and unrestricted</h2>
<p>The rich depths of commercially available information, analyzed with powerful AI, provide unprecedented power, intelligence and investigative insights. The information is a cost-effective way to surveil virtually everyone, plus it provides far more sophisticated data than traditional electronic surveillance tools or methods like wiretapping and location tracking. </p>
<p>Government use of electronic surveillance tools is extensively <a href="https://bja.ojp.gov/program/it/privacy-civil-liberties/authorities/statutes/1285">regulated by federal</a> and state laws. The U.S. Supreme Court has ruled that the Constitution’s <a href="https://www.law.cornell.edu/constitution/fourth_amendment">Fourth Amendment</a>, which prohibits unreasonable searches and seizures, requires a warrant for a wide range of digital searches. These include wiretapping or <a href="https://supreme.justia.com/cases/federal/us/389/347/">intercepting a person’s calls</a>, texts or emails; <a href="https://www.law.cornell.edu/supremecourt/text/10-1259">using GPS</a> or <a href="https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf">cellular location information</a> to track a person; or <a href="https://supreme.justia.com/cases/federal/us/573/373/">searching a person’s cellphone</a>. </p>
<p>Complying with these laws takes time and money, plus electronic surveillance law restricts what, when and how data can be collected. Commercially available information is cheaper to obtain, provides far richer data and analysis, and is subject to little oversight or restriction compared to when the same data is collected directly by the government.</p>
<h2>The threats</h2>
<p>Technology and the burgeoning volume of commercially available information allow various forms of the information to be combined and analyzed in new ways to understand all aspects of your life, including preferences and desires. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vc7_TKN0kfw?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">How the collection, aggregation and sale of your data violates your privacy.</span></figcaption>
</figure>
<p>The Office of the Director of National Intelligence report warns that the increasing volume and widespread availability of commercially available information poses “significant threats to privacy and civil liberties.” It increases the power of the government to surveil its citizens outside the bounds of law, and it opens the door to the government using that data in potentially unlawful ways. This could include <a href="https://www.scientificamerican.com/article/yes-phones-can-reveal-if-someone-gets-an-abortion/">using location data obtained via commercially available information rather than a warrant</a> to investigate and prosecute someone for abortion. </p>
<p>The report also captures both how widespread government purchases of commercially available information are and how haphazard government practices around the use of the information are. The purchases are so pervasive and agencies’ practices so poorly documented that the Office of the Director of National Intelligence cannot even fully determine how much and what types of information agencies are purchasing, and what the various agencies are doing with the data. </p>
<h2>Is it legal?</h2>
<p>The question of whether it’s legal for government agencies to purchase commercially available information is complicated by the array of sources and complex mix of data it contains. </p>
<p>There is no legal prohibition on the government collecting information already disclosed to the public or otherwise publicly available. But the nonpublic information listed in the declassified report includes data that U.S. law typically protects. The nonpublic information’s mix of private, sensitive, confidential or otherwise lawfully protected data makes collection a legal gray area. </p>
<p>Despite decades of increasingly sophisticated and invasive commercial data aggregation, Congress has not passed a federal data privacy law. The lack of federal regulation around data <a href="https://theconversation.com/what-is-fog-reveal-a-legal-scholar-explains-the-app-some-police-forces-are-using-to-track-people-without-a-warrant-189944">creates a loophole</a> for government agencies to evade electronic surveillance law. It also allows agencies to amass enormous databases that AI systems learn from and use in often unrestricted ways. The resulting erosion of privacy has been <a href="https://ssrn.com/abstract=2905131">a concern for more than a decade</a>. </p>
<h2>Throttling the data pipeline</h2>
<p>The Office of the Director of National Intelligence report acknowledges the stunning loophole that commercially available information provides for government surveillance: “The government would never have been permitted to compel billions of people to carry location tracking devices on their persons at all times, to log and track most of their social interactions, or to keep flawless records of all their reading habits. Yet smartphones, connected cars, web tracking technologies, the Internet of Things, and other innovations have had this effect without government participation.”</p>
<p>However, it isn’t entirely correct to say “without government participation.” The legislative branch could have prevented this situation by enacting data privacy laws, more tightly regulating commercial data practices, and providing oversight of AI development. Congress could yet address the problem. Representative Ted Lieu has introduced <a href="https://lieu.house.gov/media-center/press-releases/reps-lieu-buck-eshoo-and-sen-schatz-introduce-bipartisan-bicameral-bill">a bipartisan proposal for a National AI Commission</a>, and Senator Chuck Schumer has proposed <a href="https://www.washingtonpost.com/technology/2023/06/21/ai-regulation-us-senate-chuck-schumer/">an AI regulation framework</a>. </p>
<p>Effective data privacy laws would keep your personal information safer from government agencies and corporations, and responsible AI regulation would block them from manipulating you.</p><img src="https://counter.theconversation.com/content/207707/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Anne Toomey McKenna has received funding from the National Security Agency for the development of legal educational materials about Cyberlaw and funding from The National Police Foundation together with the U.S. Department of Justice-COPS division for legal analysis regarding the use of drones in domestic policing. She is affiliated with IEEE-USA, and she co-chairs IEEE's Artificial Intelligence Policy Committee; this position involves subject matter and education-related interaction with congressional staffers and the Congressional AI Caucus. </span></em></p>The government faces legal restrictions on how much personal information it can gather on citizens, but the law is largely silent on agencies purchasing the data from commercial brokers.Anne Toomey McKenna, Visiting Professor of Law, University of RichmondLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1949172022-12-18T19:18:01Z2022-12-18T19:18:01ZNot Big Brother, but close: a surveillance expert explains some of the ways we’re all being watched, all the time<figure><img src="https://images.theconversation.com/files/499955/original/file-20221209-20279-c0jq3z.jpeg?ixlib=rb-1.1.0&rect=95%2C107%2C7893%2C4383&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A group of <a href="https://www.nature.com/articles/srep01376">researchers studied</a> 15 months of human mobility data from 1.5 million people and concluded that just four points in space and time were sufficient to identify 95% of them, even when the data weren’t of excellent quality.</p>
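<p>The power of those four points is easy to demonstrate. Here is a minimal, hypothetical sketch (invented users and places, not real mobility data) showing how quickly a few (place, hour) observations narrow a crowd down to one person:</p>

```python
# Toy mobility log: each user's trace is a set of (place, hour) points.
# All names and places here are invented for illustration.
logs = {
    "user_a": {("cafe", 9), ("office", 10), ("gym", 18), ("home", 21)},
    "user_b": {("cafe", 9), ("office", 10), ("park", 18), ("home", 22)},
    "user_c": {("station", 8), ("office", 10), ("gym", 18), ("bar", 21)},
}

def matches(observations):
    """Return every user whose trace contains all observed (place, hour) points."""
    return [u for u, trace in logs.items() if observations <= trace]

# One shared point matches everyone; two points already single someone out.
assert matches({("office", 10)}) == ["user_a", "user_b", "user_c"]
assert matches({("cafe", 9), ("gym", 18)}) == ["user_a"]
```

<p>Real datasets contain millions of users rather than three, but because human movement is so regular and distinctive, only a handful of points are needed before a single trace remains.</p>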
<p>That was back in 2013. </p>
<p>Nearly ten years on, surveillance technologies permeate all aspects of our lives. They collect swathes of data from us in various forms, and often without us knowing.</p>
<p>I’m a surveillance researcher with a focus on technology governance. Here’s my round-up of widespread surveillance systems I think everyone should know about.</p>
<h2>CCTV and open-access cameras</h2>
<p>Although China has more than 50% of <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">all surveillance cameras installed</a> in the world (about 34 cameras per 1,000 people), Australian cities are <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">catching up</a>. In 2021, Sydney had 4.67 cameras per 1,000 people and Melbourne had 2.13. </p>
<p>While CCTV cameras can be used for legitimate purposes, such as promoting safety in cities and assisting police with criminal investigations, their use also poses serious concerns.</p>
<p>In 2021, New South Wales police <a href="https://www.innovationaus.com/facial-recognition-and-the-nsw-protest-crowds/">were suspected of</a> having used CCTV footage paired with facial recognition to find people attending anti-lockdown protests. When questioned, they didn’t confirm or deny if they had (or if they would in the future).</p>
<p>In August 2022, the United Nations confirmed CCTV is <a href="https://www.ohchr.org/en/documents/country-reports/ohchr-assessment-human-rights-concerns-xinjiang-uyghur-autonomous-region">being used to</a> carry out “serious human rights violations” against Uyghur and other predominantly Muslim ethnic minorities in the Xinjiang region of Northwest China.</p>
<p>The CCTV cameras in China don’t just record real-time footage. Many are equipped with facial recognition to <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">keep tabs on</a> the movements of minorities. And some have reportedly been trialled to <a href="https://www.bbc.com/news/technology-57101248">detect emotions</a>.</p>
<p>The US also has a long history of using CCTV cameras to support racist policing practices. In 2021, Amnesty International <a href="https://www.amnesty.org/en/latest/news/2021/06/scale-new-york-police-facial-recognition-revealed/">reported</a> areas with a higher proportion of non-white residents had more CCTV cameras.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/after-roe-v-wade-heres-how-women-could-adopt-spycraft-to-avoid-tracking-and-prosecution-186046">After Roe v Wade, here's how women could adopt 'spycraft' to avoid tracking and prosecution</a>
</strong>
</em>
</p>
<hr>
<p>Another issue with CCTV is security. Many of these cameras are open-access, which means they don’t have password protection and can often be easily accessed online. So I could spend all day watching a livestream of someone’s porch, as long as there was an open camera nearby.</p>
<p>Surveillance artist Dries Depoorter’s recent project <a href="https://driesdepoorter.be/thefollower/">The Follower</a> aptly showcases the vulnerabilities of open cameras. By coupling open camera footage with AI and Instagram photos, Depoorter was able to match people’s photos with the footage of where and when they were taken. </p>
<p>There was pushback, with one of the <a href="https://www.inverse.com/input/culture/dries-depoorters-ai-surveillance-art-the-follower-instagram-influencers-photos">identified people saying</a>:</p>
<blockquote>
<p>It’s a crime to use the image of a person without permission. </p>
</blockquote>
<p>Whether or not it is illegal will depend on the specific circumstances and where you live. Either way, the issue here is that Depoorter was able to do this in the first place.</p>
<h2>IoT devices</h2>
<p>An IoT (“Internet of Things”) device is any device that connects to a wireless network to function – so think smart home devices such as the Amazon Echo or Google Nest speakers, a baby monitor, or even smart traffic lights.</p>
<p>It’s estimated global spending on IoT devices will <a href="https://acola.org/hs5-internet-of-things-australia/">reach</a> US$1.2 trillion this year. Around 18 billion connected devices form the IoT network. Like unsecured CCTV cameras, IoT devices are easy to hack into if they use default passwords or passwords that have <a href="https://haveibeenpwned.com/">been leaked</a>. </p>
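<p>Checking whether a password has leaked doesn’t require sending the password anywhere. Here is a minimal sketch of the k-anonymity scheme popularised by Have I Been Pwned, using a tiny local stand-in for the breach database (the real service is queried over HTTPS by hash prefix):</p>

```python
import hashlib

# Toy stand-in for a breach corpus, keyed by the first five hex characters
# of the SHA-1 hash; only that short prefix would ever leave the device.
LEAKED_SUFFIXES = {
    # SHA-1("password") = 5BAA61E4C9B93F3F0682250B6CF8331B7EE68FD8
    "5BAA6": {"1E4C9B93F3F0682250B6CF8331B7EE68FD8"},
}

def is_leaked(password):
    """Hash locally, then check the suffix against entries sharing our prefix."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    return suffix in LEAKED_SUFFIXES.get(prefix, set())

assert is_leaked("password")                    # a notoriously common default
assert not is_leaked("a long unique passphrase")
```

<p>Changing a device’s default password to one that fails a check like this is one of the simplest IoT defences available.</p>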
<p>In some examples, hackers have hijacked baby monitor cameras to <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">stalk</a> breastfeeding mums, <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">threaten</a> parents that their baby was being kidnapped, and say creepy things like “<a href="https://www.nbcnews.com/news/us-news/stranger-hacks-baby-monitor-tells-child-i-love-you-n1090046">I love you</a>” to children. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/xbk3OdYBLHA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Beyond hacking, businesses can also use data collected through IoT devices to further target customers with products and services. </p>
<p>Privacy experts raised the alarm in September over Amazon’s merger agreement with robot vacuum company iRobot. <a href="https://www.fightforthefuture.org/news/2022-09-09-letter-to-the-ftc-challenge-amazon-irobot-deal">A letter</a> to the US Federal Trade Commission signed by 26 civil rights and privacy advocacy groups said:</p>
<blockquote>
<p>Linking iRobot devices to the already intrusive Amazon home system incentivizes more data collection from more connected home devices, potentially including private details about our habits and our health that would endanger human rights and safety.</p>
</blockquote>
<p>IoT-collected data can also change hands with third parties through data partnerships (which are very common), often without customers’ explicit consent.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Smart speakers with digital assistants consistently raise data privacy concerns among experts.</span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-the-shady-world-of-the-data-industry-strips-away-our-freedoms-143823">How the shady world of the data industry strips away our freedoms</a>
</strong>
</em>
</p>
<hr>
<h2>Big tech and big data</h2>
<p>In 2017, the <a href="https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data">value of big data exceeded</a> that of oil. Private companies have driven the majority of that growth. </p>
<p>For tech platforms, the expansive collection of users’ personal information is business as usual, literally, because more data mean more precise analytics, more effective targeted ads <a href="https://www.facebook.com/business/help/716180208457684?id=1792465934137726">and more revenue</a>. </p>
<p>This logic of profit-making through targeted advertising has been <a href="https://journals.sagepub.com/doi/full/10.1177/1095796018819461">dubbed</a> “surveillance capitalism”. As <a href="https://quoteinvestigator.com/2017/07/16/product/">the old saying</a> goes, if you’re not paying for it, then you’re the product.</p>
<p>Meta (which owns both Facebook and Instagram) <a href="https://www.forbes.com/sites/bradadgate/2022/11/03/revenue-of-alphabet-and-meta-the-digital-duopoly-have-been-slipping/?sh=2ebf3dad2fed">generated</a> almost US$23 billion in advertising revenue in the third quarter of this year.</p>
<p>The vast machinery behind this is illustrated well in the 2020 documentary The Social Dilemma, even if in a dramatised way. It <a href="https://theconversation.com/netflixs-the-social-dilemma-highlights-the-problem-with-social-media-but-whats-the-solution-147351">showed us how</a> social media platforms rely on our psychological weaknesses to keep us online for as long as possible, measuring our actions down to the seconds we spend hovering over an ad. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=247&fit=crop&dpr=1 600w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=247&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=247&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=310&fit=crop&dpr=1 754w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=310&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=310&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A graphic excerpt from The Social Dilemma.</span>
</figcaption>
</figure>
<h2>Loyalty programs</h2>
<p>Although many people don’t realise it, loyalty programs are one of the biggest personal data collection gimmicks out there. </p>
<p>In a particularly intrusive example, in 2012 one <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=706b0cd96668">US retailer</a> sent a teenage girl a catalogue dotted with pictures of smiling infants and nursery furniture. The girl’s angered father went to confront managers at the local store, and learned that predictive analytics knew more about his daughter than he did. </p>
<p>It’s estimated 88% of Australian consumers <a href="https://www.oaic.gov.au/privacy/privacy-assessments/loyalty-program-assessment-woolworths-rewards-woolworths-limited">over age 16 are members</a> of a loyalty program. These schemes build your consumer profile to sell you more stuff. Some might even charge you <a href="https://www.abc.net.au/everyday/making-loyalty-cards-worth-your-time-and-money/10998806">sneaky fees</a>, or lure you in with future perks while selling to you at steep prices. </p>
<p>As technology journalist <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">Ros Page notes</a>: </p>
<blockquote>
<p>[T]he data you hand over at the checkout can be shared and sold to businesses you’ve never dealt with.</p>
</blockquote>
<p>As a cheeky sidestep, you could find a buddy to swap your loyalty cards with. Predictive analytics is only strong when it can recognise behavioural patterns. When the patterns are disrupted, the data turn into noise. </p>
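<p>The “noise” claim can be made concrete. In this minimal sketch (with invented shopping histories), the entropy of a purchase-category distribution rises when two people’s habits are blended, which is exactly what makes the blended profile less predictable:</p>

```python
import math
from collections import Counter

def entropy(history):
    """Shannon entropy of the category mix: low = predictable, high = noisy."""
    counts = Counter(history).values()
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts)

# Two shoppers with consistent, distinctive habits (invented data).
alice = ["nappies", "formula", "nappies", "wipes", "nappies"]
bob = ["beer", "crisps", "beer", "beer", "snacks"]

# Swapping cards halfway through blends two unrelated patterns.
mixed = alice[:3] + bob[:2]

assert entropy(mixed) > entropy(alice)  # the blended profile is noisier
assert entropy(mixed) > entropy(bob)
```

<p>Higher entropy means the analytics engine has a weaker dominant pattern to latch onto, so its predictions about either shopper get worse.</p>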
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/dont-be-phish-food-tips-to-avoid-sharing-your-personal-information-online-138613">Don't be phish food! Tips to avoid sharing your personal information online</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/194917/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ausma Bernot does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The vast majority of people alive today are subject to tracking through a number of overlapping and entrenched surveillance systems.Ausma Bernot, PhD Candidate, School of Criminology and Criminal Justice, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1828302022-05-16T12:17:37Z2022-05-16T12:17:37ZOnline data could be used against people seeking abortions now that Roe v. Wade has been overturned<figure><img src="https://images.theconversation.com/files/463063/original/file-20220513-25-vs4r8v.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C7006%2C4676&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Apps for tracking reproductive health are convenient, but the data they collect could be used against you.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/menstruation-cycle-application-on-smart-phone-royalty-free-image/652267166">Tarik Kizilkaya/iStock via Getty Images</a></span></figcaption></figure><p>In overturning <a href="https://www.oyez.org/cases/1971/70-18">Roe v. Wade</a>, the U.S. Supreme Court decision in the <a href="https://www.supremecourt.gov/opinions/21pdf/19-1392_6j37.pdf">Dobbs case</a> does not merely deprive women of reproductive control and physical agency as a matter of constitutional law, but it also changes their relationship with the online world. Anyone in a state where abortion is now illegal who relies on the internet for information, products and services related to reproductive health is subject to online policing.</p>
<p>All women of child-bearing age, regardless of how secure and how privileged they may have imagined themselves to be, are now among <a href="https://doi.org/10.1007/978-3-030-82786-1_15">the marginalized and vulnerable populations whose privacy is at risk</a>.</p>
<p>As a researcher who <a href="https://scholar.google.com/citations?user=u3BoLzgAAAAJ&hl=en">studies online privacy</a>, I’ve known for some time how <a href="https://harvardlawreview.org/2021/05/geofence-warrants-and-the-fourth-amendment/">Google</a>, <a href="https://theintercept.com/2020/10/21/dataminr-twitter-surveillance-racial-profiling/">social media</a> and internet data generally can be used for <a href="https://nyupress.org/9781479892822/the-rise-of-big-data-policing/">surveillance by law enforcement</a> to cast digital dragnets. Women are at risk not just from what they reveal about their reproductive status on social media, but also by data from their <a href="https://www.npr.org/2022/05/10/1097482967/roe-v-wade-supreme-court-abortion-period-apps">health applications</a>, which could incriminate them if it were subpoenaed.</p>
<h2>Who is tracked and how</h2>
<p>People who are most vulnerable to online privacy encroachment and to the use or abuse of their data have traditionally been those society deems less worthy of protection: <a href="https://tcf.org/content/report/disparate-impact-surveillance/">people without means, power or social standing</a>. Surveillance directed at marginalized people reflects not only a lack of interest in protecting them, but also a presumption that, by virtue of their social identity, they are more likely to commit crimes or to transgress in ways that might justify <a href="https://nyupress.org/9780814776384/punished/%5D">preemptive policing</a>.</p>
<p>Many marginalized people happen to be women, including <a href="https://www.sup.org/books/title/?id=25115">low-income mothers</a>, for whom the mere act of applying for public assistance can trigger presumptions of criminal intent. These presumptions are often used to justify <a href="https://www.sup.org/books/title/?id=25115&promo=S17XASA">invasions of their privacy</a>. Now, with anti-abortion legislation sweeping Republican-controlled states and <a href="https://theconversation.com/what-triggers-the-trigger-laws-that-could-ban-abortions-184361">poised to go into effect</a> with the end of Roe v. Wade, all women of reproductive age in those states are likely to be subject to those same presumptions. </p>
<p>Before, women had to worry only that <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=698664166668">Target</a> or Amazon might learn of their pregnancies. Based on what’s already known about <a href="https://americandragnet.org/">privacy incursions by law enforcement against marginalized people</a>, it’s likely that in the post-Roe world women will be more squarely in the crosshairs of <a href="https://www.nist.gov/programs-projects/digital-forensics">digital forensics</a>. For example, law enforcement agencies routinely use <a href="https://www.upturn.org/work/mass-extraction/">forensic tools to search people’s cellphones</a> when investigating a wide range of crimes, sometimes without a search warrant. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a smart phone screen showing a dialog box offering three options for location settings" src="https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1067&fit=crop&dpr=1 600w, https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1067&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1067&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1341&fit=crop&dpr=1 754w, https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1341&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/463069/original/file-20220513-24-bf8f6s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1341&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Many apps track your location, and some of the companies behind those apps sell that data.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/WeatherChannelLawsuit/6fdaf1911dba476a93d28bd31fd2aaa4/photo">AP Photo/Brian Melley</a></span>
</figcaption>
</figure>
<p>Imagine a scenario in which a co-worker or neighbor reports someone to the authorities, which gives law enforcement officials grounds to pursue digital evidence. That evidence could include, for example, internet searches about abortion providers and period app data showing missed periods.</p>
<p>The risk is especially acute in places that foster <a href="https://www.nytimes.com/2021/09/10/us/politics/texas-abortion-law-facts.html">bounty hunting</a>. In a state like Texas where there is a potential for citizens to have standing to sue people who help others access abortion services, everything you say or do in any context becomes relevant because there’s no <a href="https://www.law.cornell.edu/wex/probable_cause">probable cause</a> hurdle to <a href="https://www.law.cornell.edu/wex/discovery">accessing your data</a>. </p>
<p>Outside of that case, it’s difficult to do full justice to all the risks because context matters, and different combinations of circumstances can conspire to elevate harms. Here are risks to keep in mind:</p><ul>
<li>Sharing information about your pregnancy on social media.
</li><li><a href="https://transparencyreport.google.com/user-data/overview?user_requests_report_period=series:requests,accounts;authority:US;time:2021H1&lu=user_requests_report_period">Internet search behavior</a> related directly or indirectly to your pregnancy or reproductive health, regardless of the search engine you use.
</li><li><a href="https://www.vice.com/en/article/m7vzjb/location-data-abortion-clinics-safegraph-planned-parenthood">Location tracking via your phone</a>, for example showing that you visited a place that could be linked to your reproductive health.
</li><li>Using apps that <a href="https://www.consumerreports.org/health-privacy/what-your-period-tracker-app-knows-about-you-a8701683935/">reveal relevant sensitive data</a>, like your menstrual cycle.
</li><li>Being overconfident in using encryption or anonymous tools.
</li></ul>
<h2>Heeding alarms</h2>
<p>Scholars, including my colleagues and me, have been raising alarms for years, arguing that surveillance activities and lack of privacy threatening <a href="https://dl.acm.org/doi/abs/10.1145/3313831.3376167?cid=81100401804">those most vulnerable are ultimately a threat to all</a>. That’s because the number of people at risk can rise when political forces identify a broader population as posing threats justifying surveillance.</p>
<p>The lack of action on privacy vulnerability is due in part to a failure of imagination, which frequently <a href="https://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=1159&context=faculty_publications">blinkers people who see their own position as largely safe</a> <a href="https://doi.org/10.1145/3589960">in a social and political system</a>.</p>
<p>There is, however, another reason for inattention. When considering mainstream privacy obligations and requirements, the privacy and security community has, for decades, been caught up in a debate about whether people really care about their privacy in practice, even if they value it in principle. </p>
<p>I’d argue that the <a href="https://doi.org/10.31235/osf.io%2Fta2z3">privacy paradox</a> – the belief that people are less motivated to protect their privacy than they claim to be – remains conventional wisdom today. This view diverts attention from taking action, including giving people tools to fully evaluate their risks. The privacy paradox is arguably more a commentary on how little people understand the implications of what’s been called <a href="https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/">surveillance capitalism</a> or feel empowered to defend against it. </p>
<p>With the general public cast as indifferent, it is easy to assume that people generally don’t want or need protection, and that all groups are at equal risk. Neither is true.</p>
<h2>All in it together?</h2>
<p>It’s hard to talk about silver linings, but as these online risks spread to a broader population, the importance of online safety will become a mainstream concern. Online safety includes being careful about <a href="https://theconversation.com/your-digital-footprints-are-more-than-a-privacy-risk-they-could-help-hackers-infiltrate-computer-networks-177123">digital footprints</a> and using anonymous browsers.</p>
<p>Maybe the general population, at least in states that are <a href="https://www.nytimes.com/interactive/2022/us/abortion-bans-restrictons-roe-v-wade.html">triggering or validating</a> abortion bans, will come to recognize that <a href="https://www.washingtonpost.com/technology/2022/05/04/abortion-digital-privacy/">Google data</a> can be incriminating.</p>
<p><em>This article was updated on June 24, 2022, to indicate that the U.S. Supreme Court has overturned Roe v. Wade.</em></p><img src="https://counter.theconversation.com/content/182830/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nora McDonald does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Data privacy is an abstract issue for most people, even though virtually everyone is at risk. Now that abortion may become illegal in some states, digital surveillance could take an even darker turn.Nora McDonald, Assistant Professor of Information Technology, University of Cincinnati Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1621722021-06-20T20:18:58Z2021-06-20T20:18:58ZIs your phone really listening to your conversations? Well, turns out it doesn’t have to<figure><img src="https://images.theconversation.com/files/407172/original/file-20210618-27-os1quw.jpeg?ixlib=rb-1.1.0&rect=209%2C7%2C4782%2C2986&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Have you ever chatted with a friend about buying a certain item and been targeted with an ad for that same item the next day? If so, you may have wondered whether your smartphone was “listening” to you. </p>
<p>But is it really? Well, it’s no coincidence the item you’d been interested in was the same one you were targeted with. </p>
<p>But that doesn’t mean your device is actually listening to your conversations — it doesn’t need to. There’s a good chance you’re already giving it all the information it needs. </p>
<h2>Can phones hear?</h2>
<p>Most of us regularly <a href="https://www.emeraldgrouppublishing.com/archived/learning/management_thinking/articles/cookies.htm">disclose our</a> information to a wide range of websites and apps. We do this when we grant them certain permissions, or allow “cookies” to track our online activities.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/94-of-australians-do-not-read-all-privacy-policies-that-apply-to-them-and-thats-rational-behaviour-96353">94% of Australians do not read all privacy policies that apply to them – and that’s rational behaviour</a>
</strong>
</em>
</p>
<hr>
<p>So-called “first-party cookies” allow websites to “remember” certain details about our interaction with the site. For instance, login cookies let you save your login details so you don’t have to re-enter them each time.</p>
<p>Third-party cookies, however, are created by domains that are external to the site you’re visiting. The third party will often be a marketing company in a partnership with the first-party website or app. </p>
<p>The first-party website or app hosts the marketer’s ads and grants it access to the data it collects from you (which you will have given it permission to do — perhaps by clicking on some innocuous-looking popup).</p>
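<p>As a rough illustration of the mechanism above (all domains and values here are invented), the distinction can be sketched in a few lines of Python: a browser keeps cookies per setting domain, so a tracker embedded on many sites is handed the same cookie everywhere its ads load.</p>

```python
# Illustrative sketch of first- vs third-party cookies (all domains invented).
# A browser stores cookies per *setting* domain, so a tracker embedded on
# many different sites is handed the same cookie everywhere its ads load.
cookie_jar = {}  # domain -> cookie value

def visit(page_domain, embedded_ad_domain=None):
    """Simulate loading a page: the site sets a first-party cookie, and any
    embedded ad network sets (or re-reads) its own third-party cookie."""
    cookie_jar.setdefault(page_domain, f"session-for-{page_domain}")
    if embedded_ad_domain:
        # The tracker can now log: "this visitor was on page_domain"
        return cookie_jar.setdefault(embedded_ad_domain, "visitor-42")
    return None

# The same tracker cookie comes back from two unrelated sites,
# letting the ad network link both visits into one profile.
a = visit("news.example", embedded_ad_domain="ads.example")
b = visit("recipes.example", embedded_ad_domain="ads.example")
print(a == b)  # → True
```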
<p>As such, the advertiser can build a picture of your life: your routines, wants and needs. These companies constantly seek to gauge the popularity of their products and how this varies based on factors such as a customer’s age, gender, height, weight, job and hobbies. </p>
<p>By classifying and clustering this information, advertisers improve their recommendation algorithms, using something called <a href="https://link.springer.com/article/10.1007/s40747-020-00212-w">recommender systems</a> <a href="https://arxiv.org/pdf/2009.06861.pdf">to target</a> the right customers with the right ads.</p>
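<p>As a toy illustration of the collaborative-filtering idea behind such recommender systems (all users and products below are invented), a platform can simply recommend whatever the most similar other user liked:</p>

```python
# Toy collaborative-filtering sketch (all users and products invented):
# recommend what the most similar other user liked but you haven't seen.
from math import sqrt

# 1 = clicked/liked the product, 0 = no interaction
interactions = {
    "alice": {"yoga_mat": 1, "candles": 1, "headphones": 0},
    "bob":   {"yoga_mat": 1, "candles": 0, "headphones": 1},
    "carol": {"yoga_mat": 1, "candles": 1, "headphones": 0},
}

def cosine(u, v):
    """Cosine similarity between two interaction vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(target, interactions):
    """Suggest products the most similar other user liked
    that the target hasn't interacted with yet."""
    others = [name for name in interactions if name != target]
    best = max(others, key=lambda n: cosine(interactions[target], interactions[n]))
    return [product for product, liked in interactions[best].items()
            if liked and not interactions[target][product]]

# A new user who liked candles looks most like alice, so they
# start seeing yoga mat ads — no microphone required.
interactions["dave"] = {"yoga_mat": 0, "candles": 1, "headphones": 0}
print(recommend("dave", interactions))  # → ['yoga_mat']
```

Real systems work on vastly larger, sparser matrices and learned embeddings, but the principle is the same: similarity in behaviour, not eavesdropping.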
<h2>Computers work behind the scenes</h2>
<p>There are several machine-learning techniques in artificial intelligence (AI) that help systems filter and analyse your data, such as data clustering, classification, association and <a href="https://bdtechtalks.com/2019/05/28/what-is-reinforcement-learning/">reinforcement learning</a> (RL). </p>
<p>An RL agent can <a href="https://bdtechtalks.com/2021/02/22/reinforcement-learning-ad-optimization/">train itself</a> based on feedback gained from user interactions, akin to how a young child will learn to repeat an action if it leads to a reward.</p>
<p>By viewing or pressing “like” on a social media post, you send a reward signal to an RL agent confirming you’re attracted to the post — or perhaps interested in the person who posted it. Either way, a message is sent to the RL agent about your personal interests and preferences.</p>
<p>If you start actively liking posts about “mindfulness” on a social platform, its system will learn to send you advertisements for companies that can offer related products and content. </p>
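<p>A heavily simplified, bandit-style sketch of that feedback loop (the topics and learning rate are invented for illustration, and no platform's real code looks this simple) might run like this:</p>

```python
# Bandit-style sketch of the feedback loop described above (topics and
# learning rate invented): each "like" is a reward that nudges up the
# estimated value of showing that topic to this user.
topic_values = {"mindfulness": 0.0, "gadgets": 0.0, "travel": 0.0}
LEARNING_RATE = 0.5

def record_feedback(topic, reward):
    """Move the value estimate toward the observed reward
    (1 = liked the post, 0 = scrolled past)."""
    topic_values[topic] += LEARNING_RATE * (reward - topic_values[topic])

def pick_ad_topic():
    """Greedy choice: serve ads from the highest-valued topic."""
    return max(topic_values, key=topic_values.get)

# Liking two mindfulness posts quickly makes it the dominant topic.
record_feedback("mindfulness", 1)  # value: 0.5
record_feedback("mindfulness", 1)  # value: 0.75
record_feedback("gadgets", 0)      # value stays 0.0
print(pick_ad_topic())  # → mindfulness
```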
<p>Ad recommendations may be based on other data, too, including but not limited to:</p>
<ul>
<li><p>other ads you clicked on through the platform</p></li>
<li><p>personal details you provided the platform (such as your age, email address, gender, location and which devices you access the platform on)</p></li>
<li><p>information shared with the platform by other advertisers or marketing partners that already have you as a customer</p></li>
<li><p>specific pages or groups you have joined or “liked” on the platform.</p></li>
</ul>
<p>In fact, AI algorithms can help marketers take huge pools of data and use them to construct your entire social network, ranking people around you based on how much you “care about” (interact with) them. </p>
<p>They can then start to target you with ads based on not only your own data, but on data collected from your friends and family members using the same platforms as you. </p>
<p>For example, Facebook might be able to recommend you something your friend recently bought. It didn’t need to “listen” to a conversation between you and your friend to do this.</p>
<h2>Exercising your right to privacy is a choice</h2>
<p>While app providers are <em>supposed</em> to provide clear terms and conditions to users about how they collect, store and use data, nowadays it’s on users to be careful about which permissions they give to the apps and sites they use. </p>
<p>When in doubt, give permissions on an as-needed basis. It makes sense to give WhatsApp access to your camera and microphone, as it can’t provide some of its services without this. But not all apps and services will ask for only what is necessary. </p>
<p>Perhaps you don’t mind receiving targeted ads based on your data, and may find it appealing. <a href="https://hbr.org/2020/10/when-do-we-trust-ais-recommendations-more-than-peoples">Research</a> has shown people with a more “utilitarian” (or practical) worldview actually prefer recommendations from AI to those from humans. </p>
<p>That said, it’s possible AI recommendations can constrain people’s choices and <a href="https://theconversation.com/ai-is-killing-choice-and-chance-which-means-changing-what-it-means-to-be-human-151826">minimise serendipity</a> in the long term. By presenting consumers with algorithmically curated choices of what to watch, read and stream, companies may be implicitly keeping our tastes and lifestyle within a narrower frame.</p>
<h2>Don’t want to be predicted? Don’t be predictable</h2>
<p>There are some simple tips you can follow to limit the amount of data you share online. First, you should review your phone’s app permissions regularly. </p>
<p>Also, think twice before an app or website asks you for certain permissions, or to allow cookies. Wherever possible, avoid using your social media accounts to connect or log in to other sites and services. In most cases there will be an option to sign up via email, which could even be a <a href="https://helpdeskgeek.com/free-tools-review/5-best-free-disposable-email-accounts/">burner email</a>.</p>
<p>Once you do start the sign-in process, remember you only have to share as much information as is needed. And if you’re sensitive about privacy, perhaps consider installing a virtual private network (VPN) on your device. This will mask your IP address and encrypt your online activities.</p>
<h2>Try it yourself</h2>
<p>If you still think your phone is listening to you, there’s a simple experiment you can try.</p>
<p>Go to your phone’s settings and restrict access to your microphone for all your apps. Pick a product you know you haven’t searched for on any of your devices and talk about it out loud at some length with another person. </p>
<p>Make sure you repeat this process a few times. If you still don’t get any targeted ads within the next few days, this suggests your phone isn’t really “listening” to you. </p>
<p>It has other ways of finding out what’s on your mind.</p>
<p class="fine-print"><em><span>Dana Rezazadegan is affiliated with Swinburne University of Technology. She is a Superstar of STEM at Science and Technology Australia and an Honorary Fellow at Macquarie University.</span></em></p>Have you ever been targeted with ads that are scarily specific to you, and wondered how the app or website could have known? Dana Rezazadegan, Lecturer, Swinburne University of Technology. Licensed as Creative Commons – attribution, no derivatives. tag:theconversation.com,2011:article/159811 2021-04-28T05:44:30Z NSW Police want access to Tinder’s sexual assault data. Cybersafety experts explain why it’s a date with disaster<figure><img src="https://images.theconversation.com/files/397493/original/file-20210428-17-1rdeyi0.jpg?ixlib=rb-1.1.0&rect=110%2C34%2C3771%2C2549&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Dating apps have been under increased scrutiny for their role in facilitating harassment and abuse. </p>
<p>Last year an <a href="https://www.abc.net.au/news/2020-10-12/tinder-dating-app-helps-sexual-predators-hide-four-corners/12722732?nw=0">ABC investigation</a> into Tinder found most users who reported sexual assault offences didn’t receive a response from the platform. Since then, the app has reportedly implemented <a href="https://www.tinderpressroom.com/news?item=122491">new features</a> to mitigate abuse and help users feel safe. </p>
<p>In a recent development, New South Wales Police <a href="https://www.abc.net.au/triplej/programs/hack/tinder-announces-new-safety-measures-artificial-intelligence/13317896">announced</a> they are in conversation with Tinder’s parent company Match Group (which also owns OKCupid, Plenty of Fish and Hinge) regarding a proposal to gain access to a portal of sexual assaults reported on Tinder. The police also suggested using artificial intelligence (AI) to scan users’ conversations for “red flags”.</p>
<p>Tinder <a href="https://www.tinderpressroom.com/2021-02-25-The-Top-10-Safety-Focused-Features-on-Tinder">already uses automation</a> to monitor users’ instant messages to identify harassment and verify personal photographs. However, increasing surveillance and automated systems doesn’t necessarily make dating apps safer to use.</p>
<h2>User safety on dating apps</h2>
<p><a href="https://theconversation.com/right-swipes-and-red-flags-how-young-people-negotiate-sex-and-safety-on-dating-apps-128390">Research</a> has shown people have differing understandings of “safety” on apps. While many users prefer not to negotiate sexual consent on apps, some do. This can involve disclosure of sexual health (including HIV status) and explicit discussions about sexual tastes and preferences. </p>
<p>If the <a href="https://www.theguardian.com/technology/2021/jan/26/grindr-fined-norway-sharing-personal-information">recent Grindr data breach</a> is anything to go by, there are serious privacy risks whenever users’ sensitive information is collated and archived. As such, some may actually feel less safe if they find out police could be monitoring their chats.</p>
<p>Adding to that, automated features in dating apps (which are supposed to enable identity verification and matching) can actually put certain groups at risk. <a href="https://www.tandfonline.com/doi/full/10.1080/14461242.2020.1851610">Trans and non-binary users</a> may be misidentified by automated image and voice recognition systems which are trained to “see” or “hear” gender in binary terms. </p>
<p>Trans people may also be accused of deception if they don’t disclose their trans identity in their profile. And those who do disclose it risk being targeted by transphobic users.</p>
<h2>Increasing police surveillance</h2>
<p>There’s no evidence to suggest that granting police access to sexual assault reports will increase users’ safety on dating apps, or even help them feel safer. <a href="https://eprints.qut.edu.au/131121/2/Rosalie_Gillett_Thesis.pdf">Research</a> has demonstrated users often don’t report harassment and abuse to dating apps or law enforcement. </p>
<p>Consider NSW Police Commissioner Mick Fuller’s misguided “<a href="https://www.abc.net.au/news/2021-03-18/nsw-sexual-consent-app-proposed-by-mick-fuller/100015782">consent app</a>” proposal last month; this is just one of many reasons sexual assault survivors may not want to contact police after an incident. And if police can access personal data, this may deter users from reporting sexual assault.</p>
<p>With high attrition rates, <a href="https://www.theguardian.com/society/2021/mar/20/barriers-to-justice-we-are-still-governed-by-the-idea-that-women-lie-about-sexual-assault">low conviction rates</a> and the prospect of being retraumatised in court, the criminal legal system often fails to deliver justice to sexual assault survivors. Automated referrals to police will only further deny survivors their agency.</p>
<p>Moreover, the proposed partnership with law enforcement sits within a broader project of escalating police surveillance fuelled by <a href="https://journals.uic.edu/ojs/index.php/fm/article/view/5615">platform-verification processes</a>. Tech companies <a href="https://yalebooks.yale.edu/book/9780300209570/atlas-ai">offer police forces a goldmine</a> of data. The needs and experiences of users are rarely the focus of such partnerships.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability-132667">Australian police are using the Clearview AI facial recognition system with no accountability</a>
</strong>
</em>
</p>
<hr>
<p>Match Group and NSW Police have yet to release information about how such a partnership would work and how (or if) users would be notified. Data collected could potentially include usernames, gender, sexuality, identity documents, chat histories, geolocation and sexual health status. </p>
<h2>The limits of AI</h2>
<p>NSW Police also proposed using AI to scan users’ conversations and identify “red flags” that could indicate potential sexual offenders. This would build on Match Group’s current tools that detect sexual violence in users’ private chats. </p>
<p>While an AI-based system may detect overt abuse, everyday and “ordinary” abuse (which is <a href="https://eprints.qut.edu.au/131121/2/Rosalie_Gillett_Thesis.pdf">common in digital dating contexts</a>) may fail to trigger an automated system. Without context, it’s difficult for AI to detect behaviours and language that are harmful to users.</p>
<p>It may detect overt physical threats, but not seemingly innocuous behaviours which are only recognised as abusive by individual users. For instance, repetitive messaging may be welcomed by some, but experienced as harmful by others. </p>
<p>Also, even as automation becomes more sophisticated, users with malicious intent can develop ways to <a href="https://www.theverge.com/interface/2019/5/31/18646525/facebook-white-supremacist-ban-evasion-proud-boys-name-change">circumvent it</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/tinders-new-safety-features-wont-prevent-all-types-of-abuse-131375">Tinder's new safety features won't prevent all types of abuse</a>
</strong>
</em>
</p>
<hr>
<p>If data are shared with police, there’s also the risk that flawed data on “potential” offenders may be used to train other <a href="https://mashable.com/2017/10/26/children-predictive-policing-australia/">predictive policing tools</a>.</p>
<p>We know from past research that automated hate-speech detection systems can harbour inherent <a href="https://arxiv.org/abs/2005.13041">racial</a> and <a href="https://arxiv.org/abs/1707.01477">gender biases</a> (and perpetuate them). At the same time we’ve seen examples of AI trained on <a href="https://www.netflix.com/au/title/81328723">prejudicial data</a> making important <a href="https://nyupress.org/9781479837243/algorithms-of-oppression/">decisions about people’s lives</a>, such as by giving <a href="https://rlc.org.au/publication/media-release-call-end-predictive-policing-targeting-children-young-ten">criminal risk assessment scores</a> that negatively impact marginalised groups.</p>
<p>Dating apps must do a lot more to understand how their users think about safety and harm online. A potential partnership between Tinder and NSW Police takes for granted that the <a href="https://reason.com/2021/03/27/will-feminists-please-stop-calling-the-cops/">solution to sexual violence </a> simply involves <a href="https://jacobinmag.com/2020/08/prison-reform-sex-offenders-feminism/">more law enforcement and technological surveillance</a>. </p>
<p>And even so, tech initiatives must always sit alongside well-funded and comprehensive sex education, consent and relationship skill-building, and well-resourced crisis services. </p>
<hr>
<p><em>The Conversation was contacted after publication by a Match Group spokesperson who shared the following:</em></p>
<p><em>“We recognize we have an important role to play in helping prevent sexual assault and harassment in communities around the world. We are committed to ongoing discussions and collaboration with global partners in law enforcement and with leading sexual assault organizations like RAINN to help make our platforms and communities safer. While members of our safety team are in conversations with police departments and advocacy groups to identify potential collaborative efforts, Match Group and our brands have not agreed to implement the NSW Police proposal.”</em></p>
<p class="fine-print"><em><span>Rosalie Gillett receives funding from the Australian Research Council Centre of Excellence for Automated Decision-Making and Society. She is also the recipient of a Facebook Content Governance grant.</span></em></p><p class="fine-print"><em><span>Kath Albury receives funding from the Australian Research Council Centre of Excellence for Automated Decision-Making and Society. She is also the recipient of an Australian eSafety Commission Online Safety grant.</span></em></p><p class="fine-print"><em><span>Zahra Zsuzsanna Stardust receives funding from the Australian Research Council Centre of Excellence for Automated Decision-Making and Society.</span></em></p>Granting police access to Tinder users’ information is problematic for many reasons (even if the intent is to keep people safe). Rosalie Gillett, Postdoctoral Research Fellow, Centre of Excellence for Automated Decision-Making and Society, Queensland University of Technology. Kath Albury, Professor of Media and Communication and Associate Investigator, ARC Centre of Excellence for Automated Decision-Making + Society, Swinburne University of Technology. Zahra Zsuzsanna Stardust, Postdoctoral Research Fellow, Centre of Excellence in Automated Decision-Making and Society, Queensland University of Technology. Licensed as Creative Commons – attribution, no derivatives. tag:theconversation.com,2011:article/154130 2021-02-05T13:12:29Z Tim Berners-Lee’s plan to save the internet: give us back control of our data<p>Releasing his creation for free 30 years ago, the inventor of the world wide web, Tim Berners-Lee, famously declared: “this is for everyone”. Today, his invention is used by billions – but it also hosts the <a href="https://freedomhouse.org/report/freedom-net">authoritarian crackdowns</a> of antidemocratic governments, and supports the infrastructure of the most wealthy and powerful companies on Earth.</p>
<p>Now, in an effort to return the internet to the golden age that existed before its current incarnation as <a href="https://journals.sagepub.com/doi/10.1177/0267323108098947">Web 2.0</a> – characterised by invasive data harvesting by governments and corporations – Berners-Lee has devised a plan to save his invention. </p>
<p>This involves his brand of “data sovereignty” – which means giving users power over their data – and it means wresting back control of the personal information we surrendered to big tech many years ago.</p>
<p>Berners-Lee’s latest intervention comes as increasing numbers of people regard the online world as a landscape dominated by a few tech giants, thriving on a system of “<a href="https://www.theguardian.com/books/2019/oct/04/shoshana-zuboff-surveillance-capitalism-assault-human-automomy-digital-privacy">surveillance capitalism</a>” – which sees our personal data extracted and harvested by online giants before being used to target advertisements at us as we browse the web. </p>
<p>Regulators in the US and the EU have brought cases against big tech as part of what’s been dubbed the “<a href="https://www.economist.com/briefing/2018/01/20/the-techlash-against-amazon-facebook-and-google-and-what-they-can-do">techlash</a>” against their growing power. But Berners-Lee’s answer to big tech’s overreach is far simpler: <a href="https://www.nytimes.com/2021/01/10/technology/tim-berners-lee-privacy-internet.html">to give individuals the power to control their own data</a>.</p>
<h2>Net gains</h2>
<p>The idea of data sovereignty has its roots in <a href="https://press.anu.edu.au/publications/series/caepr/indigenous-data-sovereignty">the claims of the world’s indigenous people</a>, who have leveraged the concept to protect the intellectual property of their cultural heritage. </p>
<p>Applied to all web users, data sovereignty means giving individuals complete authority over their personal data. This includes the self-determination of which elements of our <a href="https://networkcultures.org/blog/publication/tod-29-good-data/">personal data</a> we permit to be collected, and how we allow it to be analysed, stored, owned and used.</p>
<p>This would be in stark contrast to the current data practices that underpin big tech’s business models. The practice of “<a href="https://journals.sagepub.com/doi/full/10.1177/2053951718820549">data extraction</a>”, for instance, refers to personal information that is taken from people surfing the web without their meaningful consent or fair compensation. This depends on a model in which your data is not regarded as being your property.</p>
<p>Scholars argue that data extraction, combined with “network effects”, has led to <a href="https://www.ippr.org/juncture-item/the-challenges-of-platform-capitalism">tech monopolies</a>. Network effects arise when a platform becomes dominant, encouraging even more users to join and use it. This gives the dominant platform more opportunities to extract data, which it uses to produce better services. In turn, these better services attract even more users. This tends to amplify the power (and database size) of dominant firms at the expense of smaller ones.</p>
<p>This monopolisation tendency explains why the data extraction and ownership landscape is dominated by the so-called <a href="https://www.statista.com/topics/4213/google-apple-facebook-amazon-and-microsoft-gafam/">GAFAM</a> – Google, Apple, Facebook, Amazon and Microsoft – in the US and the so-called <a href="https://www.statista.com/chart/23502/market-shares-baidu-alibaba-tencent/">BAT</a> – Baidu, Alibaba and Tencent – in China. In addition to companies, governments also have monopoly power over their citizens’ data.</p>
<figure class="align-center ">
<img alt="A smartphone screen showing the five 'GAFAM' branded apps" src="https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/382498/original/file-20210204-14-fufs5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The world’s largest tech companies are increasingly regarded as monopolistic.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/kumamoto-japan-may-29-2020-gafam-1783291358">Koshiro K/Shutterstock</a></span>
</figcaption>
</figure>
<p>“<a href="https://journals.sagepub.com/doi/full/10.1177/2053951720982012">Data sovereignty</a>” has been proposed as a promising means of reversing this monopolising tendency. It’s an idea that’s been kicked about on the fringes of internet debates for some time, but its backing by Tim Berners-Lee will mean it garners much greater attention.</p>
<h2>Building data vaults</h2>
<p>Berners-Lee isn’t just backing data sovereignty: he’s building the tech to support it. He recently set up <a href="https://inrupt.com/">Inrupt</a>, a company with the express goal of moving towards the kind of world wide web that its inventor had originally envisioned. Inrupt plans to do that through a new system called “pods” – personal online data stores.</p>
<p>Pods work like personal data safes. By storing their data in a pod, individuals retain ownership and control of their own data, rather than transferring this to digital platforms. Under this system, companies can request access to an individual’s pod, offering certain services in return – but they cannot extract or sell that data onwards.</p>
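<p>As a purely hypothetical sketch of that consent-gating idea (this is not Inrupt’s actual Solid API; the class, fields and domains below are invented), a pod would answer only the requests its owner has approved:</p>

```python
# Hypothetical sketch of the pod idea (not Inrupt's actual Solid API):
# the user's data stays in the pod, and a company only receives the
# fields the owner has explicitly granted it access to.
class Pod:
    def __init__(self, owner, data):
        self.owner = owner
        self._data = data      # held by the user, not the platform
        self._grants = set()   # (company, field) pairs the owner approved

    def grant(self, company, field):
        """Owner approves a company's access to one field."""
        self._grants.add((company, field))

    def request(self, company, field):
        """Companies must ask; without a grant they get nothing."""
        if (company, field) not in self._grants:
            return None
        return self._data.get(field)

pod = Pod("alice", {"email": "alice@example.org", "location": "Bristol"})
pod.grant("newsletter.example", "email")

print(pod.request("newsletter.example", "email"))  # → alice@example.org
print(pod.request("adtech.example", "location"))   # → None
```

A real pod would also need revocation, authentication and auditing, but the consent gate is the core of the idea.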
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/web-3-0-the-decentralised-web-promises-to-make-the-internet-free-again-113139">Web 3.0: the decentralised web promises to make the internet free again</a>
</strong>
</em>
</p>
<hr>
<p>Inrupt has built these pods as part of its <a href="https://inrupt.com/solid">Solid</a> project, which has followed the form of a Silicon Valley startup – though with the express objective of making pods accessible for all. All websites or apps a user with a pod visits will require authentication by Solid before being allowed to request an individual’s personal data. If pods are like safes, Solid acts like the bank in which the safe is stored.</p>
<p>One of the criticisms of the idea of pods is that it approaches data as a commodity. The concept of “<a href="https://techcrunch.com/2012/09/30/data-markets-the-emerging-data-economy/">data markets</a>” has been mooted, for instance, as a system that enables companies to make micro-payments in exchange for our data. The fundamental flaw of such a system is that data is of little value when it is bought and sold on its own: the value of data only emerges from its aggregation and analysis, accrued via network effects.</p>
<h2>Common good</h2>
<p>An alternative to the commodification of data could lie in categorising data as “commons”. The idea of the commons was first popularised by the work of Nobel Prize-winning political economist Elinor Ostrom. </p>
<p>A commons approach to data would regard it as owned not by individuals or by companies, but as something that’s owned by society. <a href="https://decodeproject.eu/blog/towards-data-commons">Data as commons</a> is an emerging idea which could unlock the value of data as a public good, keeping ownership in the hands of the community. </p>
<p>Tim Berners-Lee’s intervention in debates about the destiny of the internet is a welcome development. Governments and communities are coming to realise that big tech’s data-driven digital dominance is unhealthy for society. Pods represent one answer among many to the question of how we should respond.</p>
<p class="fine-print"><em><span>Pieter Verdegem does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The web’s inventor believes the liberation of our data will help redistribute power on the internet. Pieter Verdegem, Senior Lecturer, School of Media and Communication, University of Westminster. Licensed as Creative Commons – attribution, no derivatives. tag:theconversation.com,2011:article/152085 2020-12-17T19:07:43Z Not just complacency: why people are reluctant to use COVID-19 contact-tracing apps<p>This week’s announcement of two new <a href="https://www.beehive.govt.nz/release/two-new-vaccines-secured-enough-every-new-zealander">COVID-19 vaccine pre-purchase deals</a> is encouraging, but doesn’t mean New Zealanders should become complacent about using the <a href="https://www.health.govt.nz/our-work/diseases-and-conditions/covid-19-novel-coronavirus/covid-19-resources-and-tools/nz-covid-tracer-app">NZ COVID Tracer app</a> during the summer holidays. </p>
<p>The immunisation rollout won’t start until the second quarter of 2021, and the government is encouraging New Zealanders to continue using the app, including the recently upgraded <a href="https://www.health.govt.nz/our-work/diseases-and-conditions/covid-19-novel-coronavirus/covid-19-resources-and-tools/nz-covid-tracer-app/how-nz-covid-tracer-works/bluetooth-tracing">bluetooth function</a>, as part of its plan to <a href="https://www.beehive.govt.nz/release/new-zealand%E2%80%99s-planning-manage-covid-19-over-summer">manage the pandemic</a> during the holiday period.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-keep-covid-19-at-bay-during-the-summer-holidays-and-help-make-travel-bubbles-a-reality-in-2021-152014">How to keep COVID-19 at bay during the summer holidays — and help make travel bubbles a reality in 2021</a>
</strong>
</em>
</p>
<hr>
<p>During the past weeks, the number of daily scans has <a href="https://www.health.govt.nz/our-work/diseases-and-conditions/covid-19-novel-coronavirus/covid-19-data-and-statistics/covid-19-nz-covid-tracer-app-data">dropped significantly</a>, down from just over 900,000 scans per day at the end of November to fewer than 400,000 in mid-December.</p>
<p>With no active cases of COVID-19 in the community, complacency might be part of the issue in New Zealand, but as our <a href="https://academic.oup.com/jamia/advance-article-abstract/doi/10.1093/jamia/ocaa240/5961440?redirectedFrom=fulltext">research</a> in the US shows, worries about privacy and trust continue to make people reluctant to use contact-tracing apps. </p>
<h2>Concerns about privacy and surveillance</h2>
<p>We surveyed 853 people from every state in the US to identify the factors promoting or inhibiting their use of contact-tracing applications. Our survey reveals two seemingly contradictory findings.</p>
<p>Individuals are highly motivated to use contact-tracing apps, for the sake of their own health and that of society as a whole. But the study also found people are concerned about privacy, social disapproval and surveillance. </p>
<p>The findings suggest people’s trust in the data collectors is dependent on the technology features of these apps (for example, information sensitivity and anonymity) and the privacy protection initiatives instigated by the authorities.</p>
<p>With the holiday season just around the corner — and even though New Zealand is currently free of community transmission — our findings are pertinent. New Zealanders will travel more during the summer period, and it is more important than ever to use contact-tracing apps to improve our chances of getting on top of any potential outbreaks as quickly as possible. </p>
<p>How, then, to overcome concerns about privacy and trust and make sure New Zealanders <a href="https://covid19.govt.nz/everyday-life/make-summer-unstoppable/">use the upgraded app</a> during summer?</p>
<p>The benefits of adopting contact-tracing apps are mainly in shared public health, and it is important these societal health benefits are emphasised. In order to quell concerns, data collectors (government and businesses) must also offer assurance that people’s real <a href="https://www.health.govt.nz/our-work/diseases-and-conditions/covid-19-novel-coronavirus/covid-19-resources-and-tools/nz-covid-tracer-app/privacy-and-security-nz-covid-tracer#personal">identity will be concealed</a>. </p>
<p>It is the responsibility of the government and the office of the Privacy Commissioner to ensure all personal information is managed appropriately.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/an-australia-nz-travel-bubble-needs-a-unified-covid-contact-tracing-app-were-not-there-151761">An Australia–NZ travel bubble needs a unified COVID contact-tracing app. We're not there</a>
</strong>
</em>
</p>
<hr>
<h2>Transparency and data security</h2>
<p>Our study also found that factors such as peer and social influence, regulatory pressures and previous experiences with privacy loss underlie people’s readiness to adopt contact-tracing apps. </p>
<p>The findings reveal that people expect regulatory protection if they are to use contact-tracing apps. This confirms the need for laws and regulations with strict penalties for those who collect, use, disclose or decrypt collected data for any purpose other than contact tracing. </p>
<p>The New Zealand government is working with third-party developers to complete the <a href="https://www.health.govt.nz/our-work/diseases-and-conditions/covid-19-novel-coronavirus/covid-19-resources-and-tools/nz-covid-tracer-app/integrating-nz-covid-tracer-other-contact-tracing-apps">integration of other apps</a> by the end of December to enable the exchange of digital contact-tracing information from different apps and technologies.</p>
<p>The Privacy Commissioner has already <a href="https://www.privacy.org.nz/publications/statements-media-releases/privacy-commissioner-supports-bluetooth-upgrade-to-covid-tracer/">endorsed</a> the Bluetooth upgrade of the official NZ COVID Tracer app because of its focus on users’ privacy. And the Ministry of Health aims to release the source code for the app so New Zealanders can see how their personal data is managed.</p>
<p>Throughout the summer, the government and ministry should emphasise the importance of using the contact-tracing app and assure New Zealanders about the security and privacy of their personal data.</p>
<p>Adoption of contact-tracing apps is no silver bullet in the battle against COVID-19, but it is a crucial element in New Zealand’s collective public health response to the global pandemic.</p>
<p class="fine-print"><em><span>Farkhondeh Hassandoust does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Farkhondeh Hassandoust, Lecturer, Auckland University of Technology. Licensed as Creative Commons – attribution, no derivatives.
Smart cities can help us manage post-COVID life, but they’ll need trust as well as tech
<p><em>“This virus may become just another endemic virus in our communities and this virus may never go away.”</em> <strong>– <a href="https://www.channelnewsasia.com/news/world/covid19-coronavirus-who-go-away-12729420">WHO executive director Mike Ryan</a>, May 13</strong></p>
<p>Vaccine or not, we have to come to terms with the reality that COVID-19 requires us to rethink how we live. And that includes the idea of smart cities that use advanced technologies to serve citizens. This has become critical in a time of pandemic.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/coronavirus-recovery-public-transport-is-key-to-avoid-repeating-old-and-unsustainable-mistakes-138415">Coronavirus recovery: public transport is key to avoid repeating old and unsustainable mistakes</a>
</strong>
</em>
</p>
<hr>
<p>Smart city solutions have already proved handy for curbing the contagion. Examples include:</p>
<ul>
<li><p><a href="https://www.smartcitiesworld.net/news/news/covid-19-accelerates-the-adoption-of-smart-city-tech-to-build-resilience--5259">remote temperature monitoring systems</a></p></li>
<li><p><a href="https://www.timeout.com/singapore/news/nparks-launches-a-real-time-map-where-you-can-monitor-crowd-levels-at-singapores-parks-040520">real-time heatmaps of crowding</a> in public spaces</p></li>
<li><p><a href="https://www.weforum.org/agenda/2020/03/three-ways-china-is-using-drones-to-fight-coronavirus/">drones spraying disinfectants</a></p></li>
<li><p>robots acting as “<a href="https://www.smartcitiesworld.net/news/news/singapore-pilots-robot-dog-to-assist-safe-distancing-in-parks-5266">safe-distance ambassadors</a>”. </p></li>
</ul>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/2DJmIjKtVkA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The robot dog called SPOT is being trialled in Singapore to remind people to practise physical distancing.</span></figcaption>
</figure>
<p>But as we prepare to move beyond this crisis, cities need to design systems that can handle the next pandemic. Better still, such systems could reduce the chances of another one arising at all.</p>
<h2>Issues of trust are central</h2>
<p>In a world of egalitarian governments and ethical corporations, the solution to a coronavirus-like pandemic would be simple: a complete individual-level track and trace system. It would use geolocation data and CCTV image recognition, complemented by remote biometric sensors. While some such governments and corporations do exist, putting so much information in the hands of a few, without airtight privacy controls, could lay the foundations of an Orwellian world.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/darwins-smart-city-project-is-about-surveillance-and-control-127118">Darwin's 'smart city' project is about surveillance and control</a>
</strong>
</em>
</p>
<hr>
<p><a href="https://pubsonline.informs.org/doi/10.1287/msom.2019.0823">Our research</a> on smart city challenges suggests a robust solution should be a mix of protocols and norms covering technology, processes and people. To avoid the perils of individual-level monitoring systems, we need to focus on how to leverage technology to modify voluntary citizen behaviour. </p>
<p>This is not a trivial challenge. Desired behaviours that maximise societal benefit may not align with individual preferences in the short run. In part, this could be due to misplaced beliefs or misunderstanding of the long-term consequences. </p>
<p>As an example, despite the rapid spread of COVID-19 in the US, many states have had public protests against lockdowns. A <a href="https://docs.cdn.yougov.com/1ghnpqhhpu/econToplines.pdf">significant proportion of polled Americans</a> believe this pandemic is a hoax, or that its threat is being exaggerated for political reasons.</p>
<h2>Design systems that build trust</h2>
<p>The first step in modifying people’s behaviour to align with the greater good is to design a system that builds trust between the citizens and the city. Providing citizens with timely and credible information about important issues and busting falsehoods goes a long way in creating trust. It helps people to understand which behaviours are safe and acceptable, and why this is for the benefit of the society and their own long-term interest.</p>
<p>In <a href="https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2020/4/gov-sg-launches-new-channels-to-keep-the-public-informed-about-covid-19">Singapore</a>, the government has very effectively used social media platforms like WhatsApp, Facebook, Twitter, Instagram and Telegram to regularly share COVID-19 information with citizens. </p>
<p>Densely populated cities in countries like India face extra challenges due to vast disparities in education and the many languages used. Smart city initiatives have emerged there to seamlessly provide citizens with information in their local language via a smartphone app. These include an <a href="https://www.edexlive.com/happening/2020/may/09/why-this-myth-and-fake-news-buster-app-by-iiit-delhi-profs-is-the-coolest-thing-in-this-covid-19-wor-11890.html">AI-based myth-busting chatbot</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-smart-city-technology-can-be-used-to-measure-social-distancing-135139">How smart city technology can be used to measure social distancing</a>
</strong>
</em>
</p>
<hr>
<h2>Guard against misuse of data</h2>
<p>Effective smart city solutions require citizens to volunteer data. For example, keeping citizens updated with real-time information about crowding in a public space depends on collecting individual location data in that space. </p>
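<p>One common way to reduce that tension is to aggregate the data before it is reported, so that individual identifiers never leave the collection layer and only crowd levels are published. A minimal sketch in Python (the function name and data shapes are invented for illustration, not any city’s actual system):</p>

```python
from collections import Counter
from typing import Dict, Iterable, Tuple


def crowd_levels(sightings: Iterable[Tuple[str, str]]) -> Dict[str, int]:
    """Turn raw (device_id, zone) sightings into per-zone headcounts.

    Each device is counted at most once per zone, and the identifiers
    are discarded: only aggregate counts leave this function.
    """
    seen = set()
    counts: Counter = Counter()
    for device_id, zone in sightings:
        if (device_id, zone) not in seen:
            seen.add((device_id, zone))
            counts[zone] += 1
    return dict(counts)


# Duplicate sightings of device "a" in the park are counted once.
sightings = [("a", "park"), ("b", "park"), ("a", "park"), ("c", "mall")]
print(crowd_levels(sightings))  # {'park': 2, 'mall': 1}
```

<p>A real deployment would also need protections against re-identification (for example, suppressing counts below a threshold), but the principle is the same: publish the aggregate, discard the individual.</p>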
<p>Individual-level data is also useful to co-ordinate responses during emergencies. Contact tracing, for instance, has emerged as an essential tool in slowing the contagion. </p>
<p>Technology-based smart city initiatives can enable the collection, analysis and reporting of such data. But misuse of data erodes trust, which dissuades citizens from voluntarily sharing their data.</p>
<p>City planners need to think about how they can balance the effectiveness of tech-based solutions with citizens’ privacy concerns. Independent third-party auditing of solutions can help ease these concerns. The <a href="https://www.technologyreview.com/2020/05/07/1000961/launching-mittr-covid-tracing-tracker/">MIT Technology Review’s</a> audit report on contact-tracing apps is one example during this pandemic. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-trade-offs-smart-city-apps-like-covidsafe-ask-us-to-make-go-well-beyond-privacy-138296">The trade-offs 'smart city' apps like COVIDSafe ask us to make go well beyond privacy</a>
</strong>
</em>
</p>
<hr>
<p>It is also important to create robust data governance policies. These can help foster trust and encourage voluntary sharing of data by citizens. </p>
<p>Using several case studies, the consulting firm PwC has proposed a <a href="https://www.pwc.com/us/en/industries/capital-projects-infrastructure/library/foundation-of-smart-city-success.html">seven-layer framework</a> for data governance. It describes balancing privacy concerns of citizens and efficacy of smart city initiatives as the “key to realising smart city potential”. </p>
<p>As we emerge from this pandemic, we will need to think carefully about the data governance policies we should implement. It’s important for city officials to learn from early adopters.</p>
<p>While these important issues coming out of smart city design involve our behaviour as citizens, modifying behaviour isn’t enough in itself. Civic leaders also need to rethink the design of our city systems to support citizens in areas like public transport, emergency response, recreational facilities and so on. Active collaboration between city planners, tech firms and citizens will be crucial in orchestrating our future cities and hence our lives.</p>
<hr>
<p><em>The author acknowledges suggestions from Aarti Gumaledar, Director of Emergentech Advisors Ltd.</em></p>
<p class="fine-print"><em><span>Sameer Hasija has received research support from <a href="http://www.mytechfrontier.com">www.mytechfrontier.com</a>.</span></em></p>
Sameer Hasija, Associate Professor of Technology and Operations Management, INSEAD. Licensed as Creative Commons – attribution, no derivatives.
The trade-offs ‘smart city’ apps like COVIDSafe ask us to make go well beyond privacy
<p>The Commonwealth government says if enough of us download its COVIDSafe app, restrictions on our movements and activities can be lifted more quickly and life can return to normal. As important as it is to contain the spread of coronavirus, no government decision about how to do that is beyond question. For those of us concerned about the social and political life of our increasingly “smart” cities, the thinking behind the COVIDSafe app and other “smart city” technology must be open to challenge.</p>
<p>The public focus has been on the app’s privacy implications, but other important issues warrant critical scrutiny too. Indeed, the app could help to entrench problematic forms of social and corporate power over our lives. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/darwins-smart-city-project-is-about-surveillance-and-control-127118">Darwin's 'smart city' project is about surveillance and control</a>
</strong>
</em>
</p>
<hr>
<h2>Social control</h2>
<p>As research on the <a href="https://doi.org/10.1177%2F0263775818812084">politics of smart technologies in our cities insists</a>, while personal privacy is important, it’s not the only issue here. Apps like this have implications for the forms of social control that operate in dense urban environments – where use of a digital technology is technically “voluntary”, but ends up being required if people want access to urban spaces and infrastructures. </p>
<p>Some protections are being promised in the case of the COVIDSafe app. These include a prohibition on employers, government authorities and others requiring any individual to install the app. The law still might not stop this in practice. Some business groups have <a href="http://www.theaustralian.com.au/nation/politics/coronavirus-employers-want-power-over-covidsafe-app/news-story/8d1cc1decd2df8a875fd48bf1bd4d949&usg=AOvVaw1CMMU-5bHCGy-_rOxOFAOj">lobbied government</a> to enable employers to require employees to use the app.</p>
<p>Even if this legal prohibition holds, Prime Minister Scott Morrison has been making thinly veiled threats about more people needing to download the app before he lifts restrictions. App uptake is being demanded in the name of a public interest (in this case, public health). </p>
<p>There’s also significant risk of mission creep here. What other “public interests” might be used to justify contact tracing based on this precedent? It’s easy to imagine government agencies and authorities desiring contact tracing in the service of a range of interests that could be discriminatory and oppressive – the policing of immigrants, welfare recipients and activists, for example. </p>
<p>We must guard against such surveillance creep.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-can-help-in-crime-prevention-but-we-still-need-a-human-in-charge-95516">AI can help in crime prevention, but we still need a human in charge</a>
</strong>
</em>
</p>
<hr>
<h2>Privacy protections</h2>
<p>Compared to other government and corporate apps, the COVIDSafe app now has relatively strong privacy protections. It keeps information about who you share space or associate with, but not where you go. It does this by storing encrypted data on the user’s phone about any other phones in range of a Bluetooth “handshake” that are also running the app.</p>
<p>Data will be automatically deleted after 21 days. Data will only be shared after a user has tested positive for COVID-19 and agreed to share the data. Only state health authorities may request and access data for contact tracing.</p>
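<p>The design described above (local storage, automatic expiry after 21 days, consent-gated release to health authorities) can be sketched in a few lines. This is an illustrative model only, not the actual COVIDSafe implementation; the class and method names are invented for the example:</p>

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

HANDSHAKE_TTL = 21 * 24 * 3600  # seconds: records expire after 21 days


@dataclass
class Encounter:
    peer_id: bytes      # the other phone's current anonymous identifier
    timestamp: float    # when the Bluetooth "handshake" happened


@dataclass
class ContactLog:
    """On-device log of Bluetooth encounters: kept locally, expired
    automatically, and released only with the user's consent."""
    encounters: List[Encounter] = field(default_factory=list)

    def record(self, peer_id: bytes, now: Optional[float] = None) -> None:
        self.encounters.append(
            Encounter(peer_id, time.time() if now is None else now))

    def purge_expired(self, now: Optional[float] = None) -> None:
        now = time.time() if now is None else now
        self.encounters = [e for e in self.encounters
                           if now - e.timestamp < HANDSHAKE_TTL]

    def export_for_health_authority(self, user_consented: bool) -> List[Encounter]:
        # Data leaves the device only after a positive test AND explicit consent.
        if not user_consented:
            return []
        return list(self.encounters)
```

<p>The privacy properties follow from the structure: nothing is uploaded by default, old encounters age out, and the export path is gated on consent.</p>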
<p>The <a href="https://www.itnews.com.au/news/covidsafe-privacy-protections-now-locked-in-law-548119">legislated protections</a> represent a big advance on some other government apps. For instance, <a href="https://www.smh.com.au/technology/no-warrants-needed-to-access-opal-card-records-20140708-zt02j.html">over 100 government authorities have access</a> to the data the New South Wales government collects from its public transport Opal smartcard. </p>
<p>It may be that neither governments nor corporations can assume people will continue to uncritically accept “<a href="https://reallifemag.com/the-authoritarian-trade-off/">trade-offs</a>” of public goods like personal privacy and autonomy for the convenience and benefits of digital technology.</p>
<p>However, <a href="https://www.iispartners.com/blog">some important privacy issues</a> remain unresolved, including: </p>
<ul>
<li><p>the amount of data stored, which is <a href="https://www.health.gov.au/resources/publications/covidsafe-application-privacy-impact-assessment">about all devices in range</a>, not just those in range for more than 15 minutes</p></li>
<li><p>whether data stored on Amazon servers will <a href="https://www.theguardian.com/law/2020/may/14/questions-remain-over-whether-data-collected-by-covidsafe-app-could-be-accessed-by-us-law-enforcement">potentially be accessible to US law enforcement agencies</a></p></li>
<li><p>when and how the data and app will finally be deleted.</p></li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-covidsafe-bill-doesnt-go-far-enough-to-protect-our-privacy-heres-what-needs-to-change-137880">The COVIDSafe bill doesn't go far enough to protect our privacy. Here's what needs to change</a>
</strong>
</em>
</p>
<hr>
<h2>Questions of power and profit</h2>
<p>It’s also important to ask who benefits from the mass uptake of this app.</p>
<p>A government agency developed the app, drawing in part on an open-source app made available by the Singapore government. But even when an app is “free” and no one profits from its sale, remember that smartphones and data are not free. </p>
<p>Data storage has been contracted out to Amazon Web Services. It was the only company asked to tender for this lucrative government contract. That has <a href="https://www.innovationaus.com/sovereign-capability-and-that-shocking-aws-deal/">raised both security concerns and questions</a> about why locally owned, security-accredited providers were not invited.</p>
<p>Like so many instances of “smart” technology being offered as the solution to pressing problems, the profits of big tech and big telcos who sell us devices, connectivity and data storage are being presented as natural and aligned with public good. It is clear tech corporations see the coronavirus crisis as an <a href="https://theintercept.com/2020/05/08/andrew-cuomo-eric-schmidt-coronavirus-tech-shock-doctrine/">opportunity to consolidate and expand their profits and their power</a>. Every problem <a href="https://www.lawfareblog.com/location-surveillance-counter-covid-19-efficacy-what-matters">looks like a nail to the folks who have hammers to sell</a>.</p>
<h2>Will it work?</h2>
<p>Given these concerns, will the COVIDSafe app even perform as promised? Here, the jury is still out. </p>
<p>Much discussion has focused on the minimum number of app users required for its coverage to be effective. But the app has other limitations too. It <a href="https://theconversation.com/contact-tracing-apps-are-vital-tools-in-the-fight-against-coronavirus-but-who-decides-how-they-work-138206">doesn’t yet work properly on iPhones</a>, for a start.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/in-some-places-40-of-us-may-have-downloaded-covidsafe-heres-why-the-government-should-share-what-it-knows-138323">In some places 40% of us may have downloaded COVIDSafe. Here's why the government should share what it knows</a>
</strong>
</em>
</p>
<hr>
<p>Most importantly, the app treats Bluetooth handshakes as a proxy for spatial proximity of devices, spatial proximity as a proxy for contact between people, and prolonged contact between people as a proxy for viral transmission. Each step in this chain is <a href="http://progcity.maynoothuniversity.ie/wp-content/uploads/2020/04/Digital-tech-spread-of-coronavirus-Rob-Kitchin-PC-WP44.pdf">prone to significant failures and error</a>. </p>
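<p>Because each step is only a proxy for the next, the errors multiply rather than average out. A toy calculation makes the point; the per-step reliabilities below are invented for illustration, not measured values for COVIDSafe or any real app:</p>

```python
# Each per-step reliability is an illustrative assumption.
steps = {
    "handshake implies the devices were actually close": 0.85,
    "device proximity implies the people were close": 0.80,
    "people being close implies epidemiologically relevant contact": 0.70,
    "relevant contact implies possible transmission": 0.50,
}

# The chain is only as reliable as the product of its links.
end_to_end = 1.0
for description, reliability in steps.items():
    end_to_end *= reliability

print(f"end-to-end reliability: {end_to_end:.3f}")
```

<p>Under these assumptions, four individually plausible steps compound to an end-to-end reliability below one in four, which is why each link in the chain deserves scrutiny on its own.</p>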
<p>Fortunately, then, the government is not proposing to replace contact tracing performed by human health professionals. Data from the app will be used to support that process.</p>
<p>It’s vital we expand the scope of public discussion about this app and others in our increasingly “smart” cities and societies. Otherwise, we risk embracing “smart” solutions that create new surveillance infrastructures that further concentrate state and corporate power at the expense of our autonomy and alternative solutions to pressing societal problems.</p>
<p class="fine-print"><em><span>Kurt Iveson has received funding from the Australian Research Council, the Henry Halloran Trust, and the City of Sydney.</span></em></p>
Kurt Iveson, Associate Professor of Urban Geography and Research Lead, Sydney Policy Lab, University of Sydney. Licensed as Creative Commons – attribution, no derivatives.
Amazon Echo’s privacy issues go way beyond voice recordings<figure><img src="https://images.theconversation.com/files/310498/original/file-20200116-181645-lsn71f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://pixabay.com/photos/echo-dot-amazon-language-assistant-2937627/">HeikoAL/Pixabay</a></span></figcaption></figure><p>Amazon Echo and the Alexa voice assistant have had widely publicised issues with privacy. Whether it is the <a href="https://www.theguardian.com/technology/2019/oct/09/alexa-are-you-invading-my-privacy-the-dark-side-of-our-voice-assistants">amount of data they collect</a> or the fact that they reportedly pay employees and, at times, external contractors from all over the world to <a href="https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio">listen to recordings to improve accuracy</a>, the potential is there for sensitive personal information to be leaked through these devices.</p>
<p>But the risks extend not just to our relationship with Amazon. Major privacy concerns are starting to emerge in the way Alexa devices interact with other services – risking a dystopian spiral of increasing <a href="https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/13204">surveillance and control</a>.</p>
<p>The setup of the Echo turns Amazon into an extra gateway that every online interaction has to pass through, collecting data on each one. Alexa knows what you are searching for, listening to or sending in your messages. Some smartphones do this already, particularly those made by Google and Apple who control the hardware, software and cloud services.</p>
<p>But the difference with an Echo is that it brings together the worst aspects of smartphones and smart homes. It is not a personal device but integrated into the home environment, always waiting to listen in. Alexa even features an art project (not created by Amazon) that tries to make light of this with the creepy “<a href="https://www.cnet.com/g00/how-to/the-weirdest-things-your-amazon-echo-can-say-and-do/?i10c.ua=1&i10c.encReferrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8%3d&i10c.dv=18">Ask the Listeners</a>” function that makes comments about just how much the device is spying on you. Some Echo devices already have cameras, and if facial recognition capabilities were added we could enter a world of pervasive monitoring in our most private spaces, even tracked as we move between locations.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/MnvJ4Bh60L8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>This technology gives Amazon a huge amount of control over your data, which has long been the aim of most of the tech giants. While <a href="https://www.apple.com/uk/privacy/features/">Apple</a> and <a href="https://slate.com/technology/2019/10/google-pixel-4-recorder-voice-recognition-transcription.html">Google</a> – who face their <a href="https://wsj.com/articles/privacy-problems-mount-for-tech-giants-11548070201">own privacy issues</a> – have similar voice assistants, they have at least made progress running the software directly on their devices so they won’t need to transfer recordings of your voice commands to their servers. Amazon doesn’t appear to be trying to do the same.</p>
<p>This is, in part, because of the firm’s aggressive business model. Amazon’s systems appear not just designed to collect as much data as they can but also to create ways of sharing it. So the potential issues run much deeper than Alexa listening in on private moments.</p>
<h2>Sharing with law enforcement</h2>
<p>One area of concern is the potential for putting the ears of law enforcement in our homes, schools and workplaces. Apple has a <a href="https://theintercept.com/2016/02/26/fbi-vs-apple-post-crypto-wars/">history of resisting FBI requests</a> for user data, and Twitter is relatively transparent about reporting on <a href="https://transparency.twitter.com/en/information-requests.html">how it responds to requests from governments</a>.</p>
<p>But Ring, the internet-connected home-security camera company owned by Amazon, has a high-profile relationship with police that involves <a href="https://www.theverge.com/2019/8/6/20756555/amazon-ring-police-security-camera-footage-warrant-privacy-surveillance">handing over user data</a>. Even <a href="https://www.theguardian.com/technology/2019/aug/29/ring-amazon-police-partnership-social-media-neighbor">the way citizens and police communicate</a> is increasingly monitored and controlled by Amazon.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/310514/original/file-20200116-181625-1etcp9f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Always listening.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/man-using-virtual-assistant-smart-speaker-1595220568">Tomasso79/Shutterstock</a></span>
</figcaption>
</figure>
<p>This risks embedding a culture of state surveillance in Amazon’s operations, which could have worrying consequences. We’ve seen numerous examples of law enforcement and other government bodies in democratic countries using personal data to spy on people, both in <a href="https://www.independent.co.uk/news/uk/home-news/mi5-data-breach-safeguards-investigatory-powers-act-javid-a8913506.html">breach of the law</a> and within it but for reasons that go <a href="https://www.theguardian.com/world/2016/dec/25/british-councils-used-investigatory-powers-ripa-to-secretly-spy-on-public">far beyond</a> the prevention of terrorism. This kind of mass surveillance also creates severe potential for discrimination, as it has been shown repeatedly to have a worse impact on <a href="https://blogs.lse.ac.uk/gender/2016/06/02/5-reasons-why-surveillance-is-a-feminist-issue/">women</a> and <a href="https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/12946">minority</a> <a href="https://techcrunch.com/2016/04/25/surveillance-as-a-tool-for-racism/">groups</a>.</p>
<p>If Amazon isn’t willing to push back, it’s not hard to imagine <a href="https://www.vox.com/the-goods/2018/11/12/18089090/amazon-echo-alexa-smart-speaker-privacy-data">Alexa recordings being handed over</a> to the requests of government employees and law enforcement officers who might be willing to violate the spirit or letter of the law. And given international intelligence-sharing agreements, even if you trust your own government, do you trust others?</p>
<p>In response to this issue, an Amazon spokesperson said: “Amazon does not disclose customer information in response to government demands unless we’re required to do so to comply with a legally valid and binding order. Amazon objects to overbroad or otherwise inappropriate demands as a matter of course.</p>
<p>"Ring customers decide whether to share footage in response to asks from local police investigating cases. Local police are not able to see any information related to which Ring users received a request and whether they declined to share or opt out of future requests.” They added that although local police can access Ring’s Neighbors app for reporting criminal and suspicious activity, they cannot see or access user account information.</p>
<h2>Tracking health issues</h2>
<p>Health is another area where Amazon appears to be attempting a takeover. The UK’s National Health Service (NHS) has signed a deal for medical advice to be <a href="https://www.theguardian.com/society/2019/dec/08/nhs-gives-amazon-free-use-of-health-data-under-alexa-advice-deal">provided via the Echo</a>. At face value, this simply extends ways of accessing publicly available information like the NHS website or phone line 111 – no official patient data is being shared.</p>
<p>But it creates the possibility that Amazon could start tracking what health information we ask for through Alexa, effectively building profiles of users’ medical histories. This could be linked to online shopping suggestions, third-party ads for costly therapies, or even ads that are potentially traumatic (think women who’ve suffered miscarriages <a href="https://www.huffingtonpost.co.uk/entry/women-affected-by-miscarriage-and-infertility-are-being-targeted-with-baby-ads-on-facebook_uk_5d7f7c42e4b00d69059bd88a">being shown baby products</a>).</p>
<p>An Amazon spokesperson said: “Amazon does not build customer health profiles based on interactions with nhs.uk content or use such requests for marketing purposes. Alexa does not have access to any personal or private information from the NHS.”</p>
<p>The crudeness and glitches of algorithmic advertising would violate the professional and moral standards that health services strive to maintain. Plus it would be highly invasive to treat the data in the same way many Echo recordings are. Would you want a random external contractor to know you were asking for sexual health advice?</p>
<h2>Transparency</h2>
<p>Underlying these issues is a lack of real transparency. Amazon is disturbingly <a href="https://www.wired.co.uk/article/echo-show-5-alexa-privacy-settings">quiet, evasive and reluctant</a> to act when it comes to tackling the privacy implications of their practices, many of which are buried deep within their terms and conditions or hard-to-find settings. Even tech-savvy users don’t necessarily know the <a href="https://dl.acm.org/doi/10.1145/3278721.3278773">full extent of the privacy risks</a>, and when privacy features are added, they often only <a href="https://techcrunch.com/2019/08/03/amazon-quietly-adds-no-human-review-option-to-alexa-as-voice-ais-face-privacy-scrutiny/">make users aware after researchers or the press raise the issue</a>. It is entirely unfair to place such a burden on users to find out and mitigate what these risks are.</p>
<p>So if you have an Echo in your home, what should you do? There are many tips available on <a href="https://www.wired.com/story/alexa-google-assistant-echo-smart-speaker-privacy-controls/">how to make the device more private</a>, such as setting voice recordings to automatically delete or limiting what data is shared with third parties. But smart tech is almost always surveillance tech, and the best piece of advice is not to bring one into your home.</p>
<p><em>In response to the main points of this article, an Amazon spokesperson told The Conversation</em>:</p>
<blockquote>
<p>At Amazon, customer trust is at the centre of everything we do and we take privacy and security very seriously. We have always believed that privacy has to be foundational and built in to every piece of hardware, software, and service that we create. From the beginning, we’ve put customers in control and always look for ways to make it even easier for customers to have transparency and control over their Alexa experience. We’ve introduced several privacy improvements including the option to have voice recordings automatically deleted after three or 18 months on an ongoing basis, the ability to ask Alexa to “delete what I just said” and “delete what I said today,” and the Alexa Privacy Hub, a resource available globally that is dedicated to helping customers learn more about our approach to privacy and the controls they have. We’ll continue to invent more privacy features on behalf of customers.</p>
</blockquote>
<p><em>This article has been amended to make clear the “Ask the Listeners” function is an art project created by a third party.</em></p><img src="https://counter.theconversation.com/content/130016/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Garfield Benjamin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Hey Alexa, who are you sharing my data with?Garfield Benjamin, Postdoctoral Researcher, School of Media Arts and Technology, Solent UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1274672019-12-17T10:41:07Z2019-12-17T10:41:07ZHundreds of Chinese citizens told me what they thought about the controversial social credit system<figure><img src="https://images.theconversation.com/files/306108/original/file-20191210-95111-ag1v5h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Commuters on the Shanghai Metro all on their smartphones in March, 2019.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/shanghai-china-12-mar-2019-asian-1417042709">Shutterstock/HengLim</a></span></figcaption></figure><p>The Chinese social credit system has been given an unequivocally <a href="https://www.theatlantic.com/technology/archive/2018/02/chinas-dangerous-dream-of-urban-control/553097/">negative reception by the media in the west</a>. Set to be rolled out nationwide in 2020, the system has even been <a href="https://www.wired.co.uk/article/china-social-credit-system-explained">described by one journalist</a> as China’s “most ambitious project in social engineering since the Cultural Revolution”.</p>
<p>On the surface, this reaction is understandable. Once the system is fully implemented, Chinese citizens will be given a social credit score based on their deeds. For example, failure to pay a court bill or playing loud music in public may cause a low score. This score can dictate what rights people have. Those on the “blacklist” are prevented from buying plane or train tickets, for instance, as well as working as <a href="https://www.scmp.com/economy/china-economy/article/2186606/chinas-social-credit-system-shows-its-teeth-banning-millions">civil servants or in certain industries</a>.</p>
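The deed-based scoring described above can be pictured as a very simple points ledger. The following sketch is purely illustrative: the deed names, point values and blacklist threshold are invented for this article and do not reflect the real system's rules or data.

```python
# Toy illustration of a deed-based scoring scheme. All deed names,
# point values and the threshold are hypothetical, invented for
# illustration only.
DEED_POINTS = {
    "paid_court_bill_on_time": 10,
    "failed_to_pay_court_bill": -30,
    "loud_music_in_public": -10,
}

BLACKLIST_THRESHOLD = 0  # hypothetical cut-off


def score(deeds):
    """Sum the point values of a citizen's recorded deeds."""
    return sum(DEED_POINTS.get(d, 0) for d in deeds)


def is_blacklisted(deeds):
    """Below the threshold, purchases such as plane or train
    tickets would be blocked."""
    return score(deeds) < BLACKLIST_THRESHOLD


print(is_blacklisted(["failed_to_pay_court_bill"]))  # True
print(is_blacklisted(["paid_court_bill_on_time"]))   # False
```

The point of the sketch is only the mechanism: deeds accumulate into a single score, and crossing a threshold automatically curtails rights.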
<p>The fact that Big Data and facial recognition technology will be applied for the purpose of monitoring citizens raises various human rights concerns. Not surprisingly, the scheme <a href="https://www.sciencealert.com/china-s-dystopian-social-credit-system-science-fiction-black-mirror-mass-surveillance-digital-dictatorship">has been described as</a> a “digital dictatorship” <a href="https://nypost.com/2018/09/19/chinas-social-credit-system-is-a-real-life-black-mirror-nightmare/">and a</a> “dystopian nightmare straight out of Black Mirror”.</p>
<p>But what these accounts lack is a sense of how the system is perceived from within China, which turns out to be rather complicated. <a href="https://www.ucl.ac.uk/anthropology/assa/">My 16-month ethnographic study found</a> that ordinary Chinese people perceive and accept the system differently – and most of them seem to welcome it.</p>
<p>The study, which I carried out in 2018-2019, examined the use of digital devices, such as smartphones, in Shanghai. Ethnography tries to minimise “artificial” encounters, such as surveys and interviews, in favour of being present with people in their everyday lives. My study was designed to gain a holistic understanding of ordinary Chinese people’s daily lives, with a particular focus on digital engagement, which at times included dealing with big issues such as state digital surveillance. I let people talk freely about their feelings and ideas. I spoke to around 500 people and spent at least 15 hours with around a third of them. Conversations about the social credit system came up naturally rather than through direct questioning.</p>
<p>Contrary to what many people in the west believe, ordinary Chinese people are not shy about expressing their opinions on politics in private and during informal talks among friends. </p>
<h2>Fear of fraud</h2>
<blockquote>
<p>Living in China … you have to be always on guard against others as pits of fraud are everywhere.</p>
</blockquote>
<p>These are the words of Mr Zhu, a man in his 40s. He was explaining his reluctance to let his mother use a smartphone as she may fall prey to online scammers. He was not alone in worrying about what is seen as an intensifying crisis of public morality. Another research participant (the mother of a newborn baby searching for a nanny) ended up installing secret cameras at home to help her choose a trustworthy one.</p>
<p>The people I spoke to seemed less concerned about giving up some privacy if it meant a significantly higher degree of security and certainty. And a lot of the people I spoke to perceived the new social credit system as a national project to boost public morality through fighting fraud and crime and combating what is currently seen as a nationwide <a href="https://www.latimes.com/archives/la-xpm-2006-sep-24-fg-trust24-story.html">crisis of trust</a>. </p>
<p>China has experienced a rising number of fraud cases and scams, as well as major scandals in the <a href="https://www.bbc.co.uk/news/business-41105589">food safety</a> and <a href="https://www.ft.com/content/89ca8824-81e2-11e9-9935-ad75bb96c849">pharmaceutical industries</a>. There is a widely held consensus that the punishment for these offences is not enough to deter re-offending, with people committing crimes in one province and setting up a business in another the next day with few consequences. Some believe the social credit system will remedy this through the blacklisting system. </p>
<p>There is also another narrative which says that western society is “civilised” because of a long-existing credit system. But this narrative is largely based on an imagined version of western society. And many assume that the idea of a social credit system in China was actually imported from the west.</p>
<p>Penyue, a retired teacher, complained about “uncivilised” deeds, such as spitting or littering in public and said: “Things in the west are better because they have a mature credit system, right?”</p>
<p>Some see it as the equivalent of the more established concept of “credit-worthiness” or getting a good “credit” score (but in the moral, as opposed to the financial sense). There are many apocryphal stories linked to this myth, including one about a Chinese graduate who finds herself outside China in a western city and – despite being qualified – cannot secure a job, because of her past record of fare dodging on trains (an offence which stayed on her credit record). </p>
<p>The point of the story is that in western societies people who break even minor rules won’t be accepted (no matter how qualified), as there are consequences. Stories like this use “the west” as a moral showcase of what a “civilised” society should be. </p>
<p>These stories may be false, but they are true reflections of a commonly held belief that the problem was created by individualism and modernity in China and that the west dealt with the transition to modernity more effectively. China’s own transition from an agricultural collective society (where people always knew who they were dealing with) to a modern one characterised by reliance on strangers is ongoing, and people believe that navigating this requires guidance. </p>
<h2>The sky is watching</h2>
<p>The erosion of mutual trust is also attributed to Mao Zedong’s Cultural Revolution, a turbulent period characterised by <a href="https://voxeu.org/article/impact-within-group-conflict-trust-and-trustworthiness">everyone denouncing everyone else</a>, including friends and family. So citizens see there is a need for mechanisms that enable people to take full responsibility for, and be judged by, their deeds.</p>
<p>Chinese citizens have also tended to view <a href="https://www.cambridge.org/core/books/money-as-god/fates-gift-economy-the-chinese-case-of-coping-with-the-asymmetry-between-man-and-fate/1917C9E2CA1E3359AF0F6F69976B8CE9">life itself as credit</a> and often refer to an old saying: “People are doing things, and the sky (<em>tian</em>) is watching.” This means that whatever one does, there is always a record of their deeds in the sky. This karma-like system standardises the relationship between human beings and supernatural powers: one can earn points by doing good deeds, but these can also easily be squandered through bad ones. </p>
<p>I am not trying to adjudicate whether it is appropriate for modern China to play the role of <em>Tian</em>, but it is important when writing about these developments to appreciate the way they are understood within Chinese society and why attitudes there might be quite different from what people in the west might assume.</p><img src="https://counter.theconversation.com/content/127467/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dr. Xinyuan Wang receives funding from the European Research Council. You can read a more detailed version of this article at: <a href="https://blogs.ucl.ac.uk/assa/2019/12/09/chinas-social-credit-system-the-chinese-citizens-perspective/">https://blogs.ucl.ac.uk/assa/2019/12/09/chinas-social-credit-system-the-chinese-citizens-perspective/</a> </span></em></p>China’s social credit system has been described as a ‘dystopian nightmare straight out of Black Mirror’ but many citizens think it will help fight fraud and bring about a better society.Xinyuan Wang, Research Associate in the Department of Anthropology, UCLLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1170952019-05-28T19:46:24Z2019-05-28T19:46:24ZIs China’s social credit system coming to Australia?<p>Privacy was not a hot topic in the recent Australian election, but it should have been. This is because the City of Darwin is adapting elements of the <a href="https://theconversation.com/chinas-social-credit-system-puts-its-people-under-pressure-to-be-model-citizens-89963">Chinese social credit system</a> for use in Australia. The Chinese system’s monitoring of citizens’ behaviour has been widely condemned as “<a href="https://www.forbes.com/sites/rhockett/2019/01/03/when-is-social-credit-orwellian/#479be9f33674">Orwellian</a>”, with frequent <a href="https://www.wired.co.uk/article/china-social-credit-system-explained">comparisons to the dystopian near-future sci-fi of Black Mirror</a>. But for Australians it’s pitched as progress towards a digitally integrated future, embedded innocuously in the “<a href="https://www.darwin.nt.gov.au/council/transforming-darwin/key-projects/switchingondarwin">Switching on Darwin</a>” plans for a smarter city. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/chinas-social-credit-system-puts-its-people-under-pressure-to-be-model-citizens-89963">China’s Social Credit System puts its people under pressure to be model citizens</a>
</strong>
</em>
</p>
<hr>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=683&fit=crop&dpr=1 600w, https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=683&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=683&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=858&fit=crop&dpr=1 754w, https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=858&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/276674/original/file-20190528-193544-hdjzmc.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=858&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://www.facebook.com/search/top/?q=The%20NT%20News%20darwin%20smart%20city%20Chinese-inspired%20surveillance&epa=SEARCH_BOX">Facebook/NT News</a></span>
</figcaption>
</figure>
<p>To see why this is a worrying development for Australian democracy one must first play a patient game of join the dots. </p>
<p><strong>Dot 1.</strong> One of Darwin’s <a href="https://www.darwin.nt.gov.au/community/programs/sister-cities-program/overview">six “sister cities”</a> is Haikou, capital of the Chinese island province of Hainan. Links established through sister-city relationships are commonly understood to be a springboard to wider networks of co-operative arrangements. Such connections may provide opportunities for cultural exchange, but also for technological exchange. </p>
<p>Recently there have been reports on how smart city plans in Darwin <a href="https://t.co/iMm1iXhUxZ">draw inspiration</a> from the Chinese social credit surveillance system.</p>
<p>The system’s potential for gathering data on citizens’ use of public services, such as Wi-Fi, has been noted. Selling that user data to the private sector could significantly boost council revenue. More significant still is the system’s potential to track citizen movements in real time.</p>
<p><strong>Dot 2.</strong> The 2019 Northern Territory government budget earmarks A$1.4 million for expanding the local CCTV network as part of “<a href="http://newsroom.nt.gov.au/mediaRelease/28990">Investing in a Safer Territory</a>”. This figure might yet be supplemented by “proceeds of crime” funds, making the investment much larger. </p>
<p>That’s enough money to roll out biometric facial recognition software, which can link your face from a live CCTV image to your driver licence or passport, as well as “<a href="https://en.wikipedia.org/wiki/Triggerfish_(surveillance)">triggerfish</a>” apps that can access, for example, identifying data on your smartphone remotely without your knowledge. All of these systems can be automated.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/big-brother-is-watching-how-new-technologies-are-changing-police-surveillance-115841">Big brother is watching: how new technologies are changing police surveillance</a>
</strong>
</em>
</p>
<hr>
<p><strong>Dot 3.</strong> The <a href="https://www.legislation.gov.au/Details/C2018B00180">Encryption Act</a>, rushed through federal parliament in December 2018, gave law enforcement and intelligence agencies unprecedented access to communications technology. Telecommunications providers must now provide potentially unlimited back doors into private data. They must also, by law, conceal that they have done so from customers/citizens. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-governments-encryption-laws-finally-passed-despite-concerns-over-security-108409">The government's encryption laws finally passed despite concerns over security</a>
</strong>
</em>
</p>
<hr>
<h2>Foundations of a surveillance state are in place</h2>
<p>Each dot offers a point of triangulation for very real fears about the form and nature of Australian democracy in years to come. Combine these points of technology and law and we see the foundation of a surveillance state. </p>
<p>The ability of agencies to track citizen activity extends from which websites you browse on your mobile to what you write in your private messages to where you are right now. Given how grey these laws are, and the absence of a constitutionally protected <a href="https://theconversation.com/australia-should-strengthen-its-privacy-laws-and-remove-exemptions-for-politicians-93717">right to privacy in Australia</a>, this could extend to criminal records, medical files, payslips, spending patterns and browsing histories.</p>
<p>The Northern Territory News <a href="http://ntnews.newspaperdirect.com/epaper/viewer.aspx?issue=66962019041500000000001001&page=5&article=df65383d-2ea2-4913-8107-1e06398d0183&key=sxlJVDjS3fpwukDv5sbNVQ%3D%3D&feed=rss">reported</a>:</p>
<blockquote>
<p>Darwin council will use Chinese-inspired surveillance technology to gather data on what people are doing on their phones and to put up ‘virtual fences’ that will instantly trigger an alert if crossed.</p>
</blockquote>
<p>That is correct. This technology can track a smartphone. It can also, potentially, identify the user. Darwin City’s general manager for innovation, growth and development services, Josh Sattler, <a href="http://ntnews.newspaperdirect.com/epaper/viewer.aspx?issue=66962019041500000000001001&page=5&article=df65383d-2ea2-4913-8107-1e06398d0183&key=sxlJVDjS3fpwukDv5sbNVQ%3D%3D&feed=rss">told the NT News</a>:</p>
<blockquote>
<p>We’ll be getting sent an alarm saying, ‘There’s a person in this area that you’ve put a virtual fence around.’ […] Boom, an alert goes out to whatever authority, whether it’s us or police, to say ‘look at camera 5’. </p>
</blockquote>
<p>That equates to real-time tracking of a private citizen by law enforcement and local council. And this in a free and democratic country.</p>
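Mechanically, a "virtual fence" of this kind is just an automated check of a tracked position against a defined zone, firing an alert on entry. The sketch below is an invented illustration of that logic only; the coordinates, zone and camera name are hypothetical and do not describe Darwin's actual system.

```python
# Illustrative sketch of a "virtual fence" alert check. The geometry,
# coordinates and camera name are invented for illustration; real
# deployments pair phone tracking with CCTV networks.
from dataclasses import dataclass


@dataclass
class Fence:
    # Axis-aligned bounding box in latitude/longitude degrees.
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float
    camera: str  # e.g. "camera 5"

    def contains(self, lat, lon):
        """True if the tracked position falls inside the fence."""
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)


def check_position(fences, lat, lon):
    """Return an alert string for every fence the position is inside."""
    return [f"look at {f.camera}" for f in fences if f.contains(lat, lon)]


# Hypothetical fenced zone near central Darwin.
zone = Fence(-12.47, -12.46, 130.84, 130.85, "camera 5")
print(check_position([zone], -12.465, 130.845))  # ['look at camera 5']
print(check_position([zone], -12.50, 130.90))    # []
```

Once a position stream from phone tracking is fed into such a check, the alerting is fully automated; no human decides whether a particular citizen should be followed.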
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=343&fit=crop&dpr=1 600w, https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=343&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=343&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=431&fit=crop&dpr=1 754w, https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=431&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/275813/original/file-20190522-187157-sma8os.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=431&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">This NT Police image shows CCTV locations in central Darwin. Between camera and mobile phone surveillance, authorities are now capable of real-time tracking of a private citizen.</span>
<span class="attribution"><a class="source" href="https://www.pfes.nt.gov.au/Police/Community-safety/CCTV.aspx">NT Police</a></span>
</figcaption>
</figure>
<h2>Is it smart for the public to be so trusting?</h2>
<p>The Encryption Act takes on a different tint when looked at through this lens. Law enforcement and intelligence organisations have been empowered by law to invade your privacy and protected by law from you knowing that they have done so. </p>
<p>Such data can be used to place restrictions on free movement, a hard limit placed on a universal human right. Such data may also be sold to third parties, either in exchange for deals with government or to boost city coffers. The potential for abuse and the lack of safeguards for Australian citizens are staggering. </p>
<p>The public are told to place angelic trust in the honesty of government agencies, agencies that by and large regulate themselves. There is <a href="https://theconversation.com/new-data-access-bill-shows-we-need-to-get-serious-about-privacy-with-independent-oversight-of-the-law-101378">toothless public oversight </a>by groups like the NSW Independent Commission Against Corruption, which are all too often hamstrung by a culture of silence. </p>
<p>But, remember, if you’ve got nothing to hide you don’t need to be afraid! After all, it’s only a smart city Wi-Fi program for better street lights. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australians-accept-government-surveillance-for-now-110789">Australians accept government surveillance, for now</a>
</strong>
</em>
</p>
<hr>
<p>The City of Darwin and City of Palmerston have also <a href="https://www.darwin.nt.gov.au/sites/default/files/publications/attachments/cod_annualreport_2017-18_interactive.pdf">bought five new high-definition mobile CCTV units</a> with A$635,000 in funding from the Australian government’s Safer Communities Fund. Northern Territory Police will deploy these across both municipalities. The camera systems will be used to police “crime and anti-social behaviour” and to “protect organisations that may face security risks”. </p>
<p>Remember the city of the future is a safer and more vibrant space. And, if you want to be in it, you will be watched both online and offline, wherever you go. All the time.</p><img src="https://counter.theconversation.com/content/117095/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Peter Rogers is affiliated with the volunteer group Civil Liberties Australia, and the professional associations the Australian Sociological Association (TASA) and the International Sociological Association (TASA).</span></em></p>Darwin is one of the aspiring ‘smart cities’ that is adopting Chinese technology that can identify and track individuals. Add changes in Australian law, and we have the makings of a surveillance state.Peter Rogers, Senior Lecturer in Sociology of Law, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1096982019-04-22T10:45:26Z2019-04-22T10:45:26ZHow artificial intelligence systems could threaten democracy<p>U.S. technology giant <a href="https://www.ft.com/content/9378e7ee-5ae6-11e9-9dde-7aedca0a081a">Microsoft has teamed up with a Chinese military university</a> to develop <a href="https://www.irishtimes.com/business/technology/microsoft-worked-with-chinese-military-university-on-ai-1.3855553">artificial intelligence systems</a> that could potentially enhance government surveillance and censorship capabilities. Two <a href="https://www.ft.com/content/5f5916fc-5be3-11e9-939a-341f5ada9d40">U.S. senators publicly condemned</a> the partnership, but what the <a href="http://www.nudt.edu.cn/index_eng.htm">National Defense Technology University of China</a> wants from Microsoft isn’t the only concern.</p>
<p>As <a href="https://scholar.google.com/citations?user=OgVZmm4AAAAJ&hl=en">my research shows</a>, the advent of digital repression is profoundly affecting <a href="https://doi.org/10.1353/jod.2019.0003">the relationship between citizen and state</a>. New technologies are arming governments with unprecedented capabilities to monitor, track and surveil individual people. Even governments in democracies with strong traditions of <a href="https://theconversation.com/is-trumps-definition-of-the-rule-of-law-the-same-as-the-us-constitutions-77598">rule of law</a> find themselves tempted to abuse <a href="https://qz.com/813672/half-of-the-united-states-is-registered-in-police-facial-recognition-databases-and-its-completely-unregulated/">these new abilities</a>.</p>
<p>In states with <a href="https://www.foreignaffairs.com/articles/world/2018-07-10/how-artificial-intelligence-will-reshape-global-order">unaccountable institutions and frequent human rights abuses</a>, AI systems will most likely cause greater damage. China is a prominent example. Its leadership has enthusiastically embraced AI technologies, and has set up the world’s <a href="https://www.nytimes.com/interactive/2019/04/04/world/asia/xinjiang-china-surveillance-prison.html">most sophisticated</a> <a href="https://www.engadget.com/2018/02/22/china-xinjiang-surveillance-tech-spread/">surveillance state</a> in <a href="https://www.theguardian.com/world/2019/feb/18/chinese-surveillance-company-tracking-25m-xinjiang-residents">Xinjiang province</a>, tracking citizens’ daily movements and smartphone use.</p>
<p>Its exploitation of these technologies <a href="https://www.georgesoros.com/2019/01/24/remarks-delivered-at-the-world-economic-forum-2/">presents a chilling model</a> for fellow autocrats and poses a direct threat to open democratic societies. Although there’s no evidence that other governments have replicated this level of AI surveillance, Chinese companies are actively exporting the same underlying technologies across the world.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270016/original/file-20190418-28097-1i209s9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Surveillance in China’s Xinjiang province includes both extensive police patrols and surveillance cameras, like those on the building in the background.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/China-Tracking-Face/cbbeb8deda184d58a0a1f17fab7e2564/9/0">AP Photo/Ng Han Guan</a></span>
</figcaption>
</figure>
<h2>Increasing reliance on AI tools in the US</h2>
<p><a href="https://ai.stanford.edu/%7Enilsson/QAI/qai.pdf">Artificial intelligence systems</a> are everywhere in the modern world, helping run smartphones, internet search engines, digital voice assistants and Netflix movie queues. <a href="https://governanceai.github.io/US-Public-Opinion-Report-Jan-2019/">Many people fail to realize</a> how quickly AI is expanding, thanks to ever-increasing amounts of data to be analyzed, improving algorithms and advanced computer chips. </p>
<p>Any time more information becomes available and analysis gets easier, governments are interested – and not just authoritarian ones. In the U.S., for instance, the 1970s saw revelations that government agencies – such as the FBI, CIA and NSA – had set up <a href="https://www.intelligence.senate.gov/sites/default/files/94755_II.pdf">expansive domestic surveillance networks</a> to monitor and harass civil rights protesters, political activists and Native American groups. These issues haven’t gone away: Digital technology today has deepened the ability of even more agencies to conduct even more intrusive surveillance.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=498&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=498&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=498&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=625&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=625&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270024/original/file-20190418-28090-1lpg1vm.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=625&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How fairly do algorithms predict where police should be most focused?</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Criminaliteits_Anticipatie_Systeem.png">Arnout de Vries</a></span>
</figcaption>
</figure>
<p>For example, U.S. police have eagerly embraced AI technologies. They have begun using software that is <a href="https://theconversation.com/why-big-data-analysis-of-police-activity-is-inherently-biased-72640">meant to predict where crimes will happen</a> to decide where to send officers on patrol. They’re also using <a href="https://www.nbcnews.com/news/us-news/facial-recognition-gives-police-powerful-new-tracking-tool-it-s-n894936">facial recognition</a> and <a href="https://www.washingtonpost.com/crime-law/2018/12/13/fbi-plans-rapid-dna-network-quick-database-checks-arrestees/">DNA analysis</a> in criminal investigations. But analyses of these systems show the <a href="https://theconversation.com/congress-takes-first-steps-toward-regulating-artificial-intelligence-104373">data on which those systems are trained</a> are often biased, leading to <a href="https://theconversation.com/did-artificial-intelligence-deny-you-credit-73259">unfair outcomes</a>, such as <a href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing">falsely determining that African Americans are more likely to commit crimes</a> than other groups.</p>
<h2>AI surveillance around the world</h2>
<p>In authoritarian countries, AI systems can directly abet domestic control and surveillance, helping <a href="https://www.power3point0.org/2018/01/25/hybrid-repression-online-and-offline-in-china-foretelling-the-human-rights-struggle-to-come/">internal security forces process massive amounts of information</a> – including social media posts, text messages, emails and phone calls – more quickly and efficiently. The police can identify social trends and <a href="https://www.apnews.com/bf75dd1c26c947b7826d270a16e2658a">specific people</a> who might threaten the regime based on the information uncovered by these systems. </p>
<p>For instance, the Chinese government has used AI in wide-scale crackdowns in regions that are home to ethnic minorities within China. Surveillance systems in Xinjiang and Tibet have been described as “<a href="https://foreignpolicy.com/2019/03/19/962492-orwell-china-socialcredit-surveillance/">Orwellian</a>.” These efforts have included <a href="https://www.nytimes.com/2019/02/21/business/china-xinjiang-uighur-dna-thermo-fisher.html">mandatory DNA samples</a>, Wi-Fi network monitoring and widespread facial recognition cameras, all connected to integrated data analysis platforms. With the aid of these systems, Chinese authorities have, according to the U.S. State Department, “arbitrarily detained” between <a href="https://www.state.gov/j/drl/rls/hrrpt/humanrightsreport/index.htm?year=2018&dlid=289037#wrapper">1 and 2 million people</a>.</p>
<p>My <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3374575">research looks at 90 countries around the world</a> with government types ranging from closed authoritarian to flawed democracies, including Thailand, Turkey, Bangladesh and Kenya. I have found that Chinese companies are <a href="https://carnegieendowment.org/2019/01/22/we-need-to-get-smart-about-how-governments-use-ai-pub-78179">exporting AI surveillance technology</a> to at least 54 of these countries. Frequently, this technology is packaged as part of China’s flagship <a href="https://eng.yidaiyilu.gov.cn/">Belt and Road Initiative</a>, which is funding an extensive network of roads, railways, energy pipelines and telecommunications networks <a href="https://www.knightfrank.com/blog/2018/01/30/an-insight-into-the-belt-and-road-initiative">serving 60% of the world’s population</a> and economies that generate 40% of global GDP.</p>
<p>For instance, Chinese companies like <a href="https://e.huawei.com/us/solutions/industries/smart-city">Huawei</a> and ZTE are constructing “smart cities” in <a href="https://www.dawn.com/news/1333101">Pakistan</a>, <a href="https://e.huawei.com/en/case-studies/global/2017/201704261658">the Philippines</a> and <a href="http://www.chinadaily.com.cn/world/2017-05/16/content_29372143.htm">Kenya</a>, featuring extensive built-in surveillance technology. Huawei has outfitted <a href="https://bgc.com.ph/">Bonifacio Global City</a> in the Philippines with high-definition internet-connected cameras that provide “<a href="https://e.huawei.com/en/case-studies/global/2017/201704261658">24/7 intelligent security surveillance</a> with data analytics to detect crime and help manage traffic.”</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=394&fit=crop&dpr=1 600w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=394&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=394&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=495&fit=crop&dpr=1 754w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=495&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/270029/original/file-20190418-28094-xukhtb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=495&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Bonifacio Global City in the Philippines has a lot of embedded surveillance equipment.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/File:Bonifacio_Global_City_2.jpg">alveo land/Wikimedia Commons</a></span>
</figcaption>
</figure>
<p><a href="https://foreignpolicy.com/2018/06/13/in-chinas-far-west-companies-cash-in-on-surveillance-program-that-targets-muslims/">Hikvision</a>, <a href="https://www.scmp.com/tech/social-gadgets/article/2142497/malaysian-police-wear-chinese-start-ups-ai-camera-identify">Yitu</a> and <a href="https://qz.com/1248493/sensetime-the-billion-dollar-alibaba-backed-ai-company-thats-quietly-watching-everyone-in-china/">SenseTime</a> are supplying state-of-the-art facial recognition cameras for use in places like <a href="https://www.albawaba.com/news/china%E2%80%99s-newest-global-export-policing-dissidents-1139230">Singapore</a> – which announced the establishment of a surveillance program with <a href="https://www.reuters.com/article/us-singapore-surveillance/singapore-to-test-facial-recognition-on-lampposts-stoking-privacy-fears-idUSKBN1HK0RV">110,000 cameras mounted on lamp posts</a> around the city-state. Zimbabwe is creating a <a href="https://foreignpolicy.com/2018/07/24/beijings-big-brother-tech-needs-african-faces/">national image database</a> that can be used for facial recognition.</p>
<p>However, selling advanced equipment for profit is different from sharing technology with an express geopolitical purpose. These new capabilities may plant the seeds for global surveillance: As governments become increasingly dependent upon Chinese technology to manage their populations and maintain power, they will face greater pressure to align with China’s agenda. But for now it appears that China’s primary motive is to dominate the market for new technologies and make lots of money in the process. </p>
<h2>AI and disinformation</h2>
<p>In addition to providing surveillance capabilities that are both sweeping and fine-grained, AI can help repressive governments manipulate available information and spread disinformation. These campaigns can be automated or automation-assisted, and deploy <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">hyper-personalized messages</a> directed at – or against – <a href="https://www.nytimes.com/2018/10/20/us/politics/saudi-image-campaign-twitter.html">specific people</a> or groups. </p>
<p>AI also underpins the technology commonly called “<a href="https://www.technologyreview.com/s/612501/inside-the-world-of-ai-that-forges-beautiful-art-and-terrifying-deepfakes/">deepfake</a>,” in which algorithms create <a href="https://theconversation.com/detecting-deepfake-videos-in-the-blink-of-an-eye-101072">realistic video and audio forgeries</a>. Muddying the waters between truth and fiction may become useful in a tight election, when one candidate could create fake videos showing an opponent doing and saying things that never actually happened.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cQ54GDm1eL0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">An early deepfake video shows some of the dangers of advanced technology.</span></figcaption>
</figure>
<p>In my view, policymakers in democracies should think carefully about the risks of AI systems to their own societies and to people living under authoritarian regimes around the world. A critical question is how many countries will adopt China’s model of digital surveillance. But it’s not just authoritarian countries feeling the pull. And it’s also not just Chinese companies spreading the technology: many U.S. companies, including Microsoft as well as <a href="https://www.axios.com/china-us-technology-surveillance-state-5672b822-fdde-45f9-ac77-e7b5574e9351.html">IBM, Cisco and Thermo Fisher</a>, have provided sophisticated capabilities to nasty governments. The misuse of AI is not limited to autocratic states.</p><img src="https://counter.theconversation.com/content/109698/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Steven Feldstein is a non-resident fellow with the Carnegie Endowment for International Peace. </span></em></p>Even governments in democracies with strong traditions of rule of law find themselves tempted to abuse these new abilities.Steven Feldstein, Frank and Bethine Church Chair of Public Affairs & Associate Professor, School of Public Service, Boise State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1087922019-01-16T12:44:44Z2019-01-16T12:44:44ZAmazon, Facebook and Google don’t need to spy on your conversations to know what you’re talking about<figure><img src="https://images.theconversation.com/files/253861/original/file-20190115-152983-1hite2f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/personal-data-protection-privacy-concept-cabinets-631813253?src=XOOEmj7KhFGQOLeaT0vW3A-1-13">vchal/Shutterstock</a></span></figcaption></figure><p>If you’ve ever wondered if your phone is spying on you, you’re not alone. One of the most <a href="https://edition.cnn.com/2018/03/26/opinions/data-company-spying-opinion-schneier/index.html">hotly debated</a> topics in technology today is the amount of data that firms surreptitiously gather about us online. You may well have shared the increasingly common experience of feeling creeped out by ads for something you recently discussed in a real-life conversation or an online interaction.</p>
<p>This kind of experience has <a href="https://www.bbc.co.uk/news/technology-41802282">led to suggestions</a> that tech firms are secretly <a href="https://www.vice.com/en_uk/article/wjbzzy/your-phone-is-listening-and-its-not-paranoia">recording our private conversations</a> via smartphones or other internet-connected devices such as smart TVs, Amazon Echo or Google Home. Or that they are reading our private messages even when they are supposedly encrypted, <a href="https://medium.com/@gzanon/no-end-to-end-encryption-does-not-prevent-facebook-from-accessing-whatsapp-chats-d7c6508731b2">as with Facebook’s WhatsApp</a>. If this were proven to be true, it would reveal a huge conspiracy that could do untold damage to the tech industry – which makes it seem somewhat far-fetched. But recent revelations about the degree to which Facebook users’ data <a href="https://theconversation.com/uk/topics/cambridge-analytica-51337">has been shared</a> certainly won’t help to convince people that the big firms aren’t spying on them. </p>
<p>Yet, there is another, more compelling reason for the incredibly relevant ads you see. Simply put, tech firms routinely gather <a href="https://www.bbc.co.uk/news/business-44702483">so much data</a> about you in other ways, they already have an excellent idea what your interests, desires and habits might be. With this information they can build a <a href="https://theconversation.com/big-data-security-problems-threaten-consumers-privacy-54798">detailed profile of you</a> and <a href="https://dataethics.eu/en/facebooks-data-collection-sharelab/">use algorithms</a> based on behavioural science and trends found elsewhere in their data, to predict what ads might be relevant to you. In this way they can show you products or services that you’ve been thinking about recently, even if you’ve never directly searched for or otherwise indicated online that you’d be interested in them. </p>
<p>Firms invest heavily in gathering user data and do so in a number of clever ways. Social networks and other apps offer to store and share our uploaded data for “free” while using it, and the content we access and “like”, to learn about our <a href="https://www.recode.net/2017/5/17/15655854/twitter-privacy-update-targeted-ads">interests, desires and relationships</a>. And, of course, there is our search history, which can reveal so much about our current circumstances that Google data has even been used to <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3510767">spot the start</a> of flu epidemics.</p>
<p>But it gets far creepier. Your personal email inbox is also fair game for tech firms. In 2017, <a href="https://www.theguardian.com/technology/2017/jun/26/google-will-stop-scanning-content-of-personal-emails">Google said</a> it would no longer analyse email content for the purposes of advertising, but <a href="https://techcrunch.com/2018/08/28/yahoo-still-scans-your-emails-for-ads-even-if-its-rivals-wont/">recent reports</a> suggest that other large firms still do this. New tech also provides <a href="https://kar.kent.ac.uk/67485/1/ARES2016-author-final.pdf">another data source</a>, be it <a href="https://kar.kent.ac.uk/67472/1/2017-pst-wnc-preprint.pdf">wearables</a>, <a href="https://bgr.com/2018/07/05/smart-tv-spying-yep/">smart TVs</a>, <a href="https://www.recode.net/2018/10/16/17966102/facebook-portal-ad-targeting-data-collection">other in-home smart devices</a> or the <a href="https://kar.kent.ac.uk/67495/1/jowua2016_enh.pdf">smartphone</a> <a href="https://theconversation.com/7-in-10-smartphone-apps-share-your-data-with-third-party-services-72404">apps</a> that we have come to love. These can gather data on how you use your smart devices, who you contact, what you watch and for how long, other devices on your home network, or where you go.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/253866/original/file-20190115-152980-13szaav.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Tracking your every move can reveal what you’re thinking about.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/closeup-view-young-woman-restaurant-finder-344288111?src=VdOm9qFxel_1nUozisDsrw-1-71">Georgejmclittle/Shutterstock</a></span>
</figcaption>
</figure>
<p>It’s not just individual sites or devices that monitor your online behaviour. A <a href="https://link.springer.com/article/10.1186/s13673-017-0121-6">massive ecosystem</a> of advertisers and supporting companies is dedicated to tracking your activity across the internet. Sites commonly record what pages you look at by saving a small file called a “cookie” to your browser. And your activity across different sites can be matched by looking at your <a href="https://panopticlick.eff.org/static/browser-uniqueness.pdf">browser’s “fingerprint”</a>, a profile made up of details such as your screen size, the version of the browser you’re using and what plug-in tools you have downloaded to use with it. Then, when you visit another website, an ad firm that has built a profile of you based on your cookies and browser fingerprint can load a <a href="https://link.springer.com/article/10.1186/s13673-017-0121-6">“third-party script”</a> to display ads relevant to your profile.</p>
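To make the fingerprinting idea concrete, here is a minimal Python sketch of the principle described above. The attribute names and values are invented for illustration; real fingerprinting scripts read dozens of such properties (screen size, user agent, plug-ins, fonts and more) directly from the browser.

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine browser attributes into one stable identifier.

    Serialises the attributes in a fixed order so the same
    browser always produces the same fingerprint, then hashes
    the result. All keys/values here are hypothetical.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits from the same browser yield the same identifier, so
# activity on different sites can be linked even without cookies.
visit_a = {"screen": "1920x1080", "browser": "Firefox 64.0", "plugins": "pdf,flash"}
visit_b = {"screen": "1920x1080", "browser": "Firefox 64.0", "plugins": "pdf,flash"}
other   = {"screen": "1366x768",  "browser": "Chrome 71.0",  "plugins": "pdf"}

print(browser_fingerprint(visit_a) == browser_fingerprint(visit_b))  # True
print(browser_fingerprint(visit_a) == browser_fingerprint(other))    # False
```

The point of the sketch is that no single attribute identifies you; it is the hashed combination that is often unique enough to follow a browser across sites.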
<p>Perhaps even more alarmingly, this tracking does not stop at online data. Tech firms are known <a href="https://www.theverge.com/2018/8/30/17801880/google-mastercard-data-online-ads-offline-purchase-history-privacy">to purchase data from financial organisations</a> about user purchases in the real world to supplement their ad offerings. According to <a href="https://www.propublica.org/article/facebook-doesnt-tell-users-everything-it-really-knows-about-them">some reports</a>, this includes information on income, types of places and restaurants frequented and even how many credit cards are present in their wallets. Opting out of this tracking and onward data sharing is incredibly difficult.</p>
<p>Even where you ask to opt out of this data gathering, your request might not be respected. An example is the uproar caused when <a href="https://www.theverge.com/2017/11/21/16684818/google-location-tracking-cell-tower-data-android-os-firebase-privacy">it was discovered</a> that Google tracks the location of Android users even when the location setting is turned off. Location data is one of the most useful for advertising and many firms, including Apple, Google and Facebook, <a href="https://www.fastcompany.com/40477441/facebook-google-apple-know-where-you-are">track the location of individuals</a> to use as input into their bespoke algorithms.</p>
<h2>Putting the data together</h2>
<p>To sum up with a simple example, imagine you have just started to think about where to go for your next holiday. You spend the morning visiting travel agents to discuss the latest deals and then visit your favourite restaurant, a popular Caribbean food chain, in the city. Excited about your potential trip, later that night you mostly watch TV shows about the tropics. The next day, your social media feed contains flight, hotel and tour ads with deals to Barbados. </p>
<p>This is a very real illustration of how data on your location, financial purchases, interests, and TV viewing history can be correlated and used to create personalised ads. While some might welcome holiday deals, it becomes much more worrying when we consider data gathering or ads targeting <a href="https://kar.kent.ac.uk/67470/1/2017-ccs-mps-ang-author-final.pdf">sensitive health issues</a>, financial difficulties, or <a href="https://www.nytimes.com/interactive/2018/09/12/technology/kids-apps-data-privacy-google-twitter.html">vulnerable people such as children</a>.</p>
<p>The future of digital advertising is set to be as scary as it is intriguing. Even <a href="https://theconversation.com/those-pop-up-i-agree-boxes-arent-just-annoying-theyre-potentially-dangerous-106898">with new laws</a> that try to protect people’s information, tech firms are constantly looking to push the boundaries of <a href="https://www.independent.co.uk/life-style/gadgets-and-tech/news/amazon-alexa-patent-listening-to-me-facebook-phone-talking-ads-a8300246.html">data gathering</a> and <a href="https://www.nytimes.com/interactive/2018/06/21/opinion/sunday/facebook-patents-privacy.html">algorithm design</a> in ways that can feel invasive. It may yet be proven that some firms aren’t being honest with us about all the data they collect, but the stuff we know about is more than enough to build an alarmingly accurate picture of us.</p><img src="https://counter.theconversation.com/content/108792/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jason R.C. Nurse does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If you’re worried your phone is recording your private conversations, look closer at the data you’ve already agreed to give away.Jason R.C. Nurse, Assistant Professor in Cyber Security, University of KentLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/911102018-02-07T10:41:17Z2018-02-07T10:41:17ZSmartphone data tracking is more than creepy – here’s why you should be worried<figure><img src="https://images.theconversation.com/files/204855/original/file-20180205-19948-1qsqh31.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/woman-using-smartphone-bed-418592050?src=G9BR0O1iijl9r4VpFS4wVw-3-27">Shutterstock</a></span></figcaption></figure><p>Smartphones rule our lives. Having information at our fingertips is the height of convenience. They tell us all sorts of things, but the information we see and receive on our smartphones is just a fraction of the data they generate. By tracking and monitoring our behaviour and activities, smartphones build a digital profile of shockingly intimate information about our personal lives.</p>
<p>These records aren’t just a log of our activities. The digital profiles they create are <a href="https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/02/warning-over-illegal-trade-in-people-s-information/">traded between companies</a> and used to make inferences and decisions that affect the opportunities open to us and our lives. What’s more, this typically happens without our knowledge, consent or control.</p>
<p>New and sophisticated methods built into smartphones make it easy to track and monitor our behaviour. A vast amount of information can be collected from our smartphones, both when being actively used and while running in the background. This information can include our location, internet search history, communications, social media activity, finances and biometric data such as fingerprints or facial features. It can also include metadata – information about the data – such as the time and recipient of a text message.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/205080/original/file-20180206-14083-1d2m4nc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Your emails can reveal your social network.</span>
<span class="attribution"><a class="source" href="https://theconversation.com/your-social-networks-and-the-secret-story-of-metadata-16119">David Glance</a></span>
</figcaption>
</figure>
<p>Each type of data can reveal something about our interests and preferences, views, hobbies and social interactions. For example, a study conducted by MIT demonstrated how <a href="https://theconversation.com/your-social-networks-and-the-secret-story-of-metadata-16119">email metadata can be used to map our lives</a>, showing the changing dynamics of our professional and personal networks. This data can be used to infer personal information including a person’s background, religion or beliefs, political views, sexual orientation and gender identity, social connections, or health. It is even possible to <a href="https://techcrunch.com/2016/05/17/stanford-quantifies-the-privacy-stripping-power-of-metadata/">deduce our specific health conditions</a> simply by connecting the dots between a series of phone calls. </p>
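As a rough illustration of how bare metadata supports such inferences, the following Python sketch builds a contact graph from invented call records. Every name and record here is hypothetical; the point is that who contacts whom, and how often, is revealing even when no message content is available.

```python
from collections import Counter

# Hypothetical call metadata: (caller, recipient, timestamp).
# No content is recorded, only the fact of each contact.
records = [
    ("alice", "oncology_clinic", "2019-01-03 09:10"),
    ("alice", "oncology_clinic", "2019-01-04 16:45"),
    ("alice", "insurer",         "2019-01-04 17:02"),
    ("alice", "bob",             "2019-01-05 20:30"),
    ("bob",   "alice",           "2019-01-05 21:00"),
]

# Count how often each pair communicates, ignoring direction.
edges = Counter(frozenset((a, b)) for a, b, _ in records)

# Alice's most-contacted parties hint at her circumstances:
# repeated calls to a clinic, followed by a call to an insurer.
alice_contacts = Counter()
for pair, n in edges.items():
    if "alice" in pair:
        (contact,) = pair - {"alice"}
        alice_contacts[contact] += n

print(alice_contacts.most_common())
```

Scaled up to billions of records and joined with location or purchase data, exactly this kind of frequency-and-sequence analysis is what lets companies and governments draw sensitive conclusions from “harmless” metadata.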
<p>Different types of data can be consolidated and linked to build a comprehensive profile of us. Companies that buy and sell data – <a href="https://www.npr.org/sections/alltechconsidered/2016/07/11/485571291/firms-are-buying-sharing-your-online-info-what-can-you-do-about-it">data brokers</a> – already do this. They collect and combine billions of data elements about people to make inferences about them. These inferences may seem innocuous but can <a href="https://www.ftc.gov/news-events/press-releases/2014/05/ftc-recommends-congress-require-data-broker-industry-be-more">reveal sensitive information</a> such as ethnicity, income levels, educational attainment, marital status, and family composition.</p>
<p>A recent study found that <a href="https://www.scientificamerican.com/article/7-in-10-smartphone-apps-share-your-data-with-third-party-services/">seven in ten smartphone apps share data</a> with third-party tracking companies like Google Analytics. Data from numerous apps can be linked within a smartphone to build this more detailed picture of us, even if permissions for individual apps are granted separately. Effectively, smartphones can be converted into surveillance devices.</p>
<p>The result is the creation and amalgamation of digital footprints that provide in-depth knowledge about your life. The most obvious reason for companies collecting information about individuals is for profit, to deliver targeted advertising and personalised services. Some targeted ads, while perhaps creepy, aren’t necessarily a problem, such as an ad for the new trainers you have been eyeing up.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=551&fit=crop&dpr=1 600w, https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=551&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=551&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=692&fit=crop&dpr=1 754w, https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=692&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/204934/original/file-20180205-14072-1adasgp.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=692&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Payday loan ads.</span>
<span class="attribution"><a class="source" href="https://www.teamupturn.org/reports/2015/led-astray">Upturn</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>But targeted advertising based on our smartphone data can have real impacts on livelihoods and well-being, beyond influencing purchasing habits. For example, people in financial difficulty might be <a href="https://medium.com/equal-future/google-was-right-to-get-tough-on-payday-loan-ads-and-now-others-should-follow-suit-c7dd8446dc91">targeted for ads for payday loans</a>. They might use these loans to pay for <a href="https://www.theguardian.com/money/2018/jan/22/legalized-loan-sharking-payday-loan-customers-recount-their-experiences">unexpected expenses</a>, such as medical bills, car maintenance or court fees, but could also rely on them for <a href="https://www.theguardian.com/business/2017/oct/16/young-people-are-borrowing-to-cover-basic-living-costs-warns-city-watchdog">recurring living costs</a> such as rent and utility bills. People in financially vulnerable situations can then become trapped in <a href="https://www.theguardian.com/business/2017/sep/19/51-of-young-women-have-to-borrow-to-make-cash-last-until-payday">spiralling debt</a> as they struggle to repay loans due to the high cost of credit. </p>
<p>Targeted advertising can also enable companies to discriminate against people and deny them an equal chance of accessing basic human rights, such as housing and employment. Race is not explicitly included in Facebook’s basic profile information, but a user’s “ethnic affinity” can be worked out based on pages they have liked or engaged with. Investigative journalists from ProPublica found that it is possible to exclude those who match certain ethnic affinities <a href="https://www.technologyreview.com/the-download/609543/facebook-still-lets-people-target-ads-by-race-and-ethnicity/">from housing ads</a>, and <a href="https://www.propublica.org/article/facebook-ads-age-discrimination-targeting">certain age groups from job ads</a>.</p>
<p>This is different to traditional advertising in print and broadcast media, which, although targeted, is not exclusive. Anyone can still buy a copy of a newspaper, even if they are not the typical reader. Targeted online advertising can completely exclude some people from information without them ever knowing. This is a particular problem because the internet, and social media especially, is now such a common source of information.</p>
<p>Social media data can also be used to <a href="https://www.forbes.com/sites/moneybuilder/2015/10/23/your-social-media-posts-may-soon-affect-your-credit-score-2/">calculate creditworthiness</a>, despite its dubious relevance. Indicators such as the level of sophistication in a user’s language on social media, and their friends’ loan repayment histories can now be used for credit checks. This can have a direct impact on the fees and interest rates charged on loans, the ability to buy a house, and even <a href="http://www.demos.org/discredited-how-employment-credit-checks-keep-qualified-workers-out-job">employment prospects</a>.</p>
<p>There’s a similar risk with payment and shopping apps. In China, the government <a href="http://www.wired.co.uk/article/chinese-government-social-credit-score-privacy-invasion">has announced plans</a> to combine data about personal expenditure with official records, such as tax returns and driving offences. This initiative, which is being led by both the government and companies, is <a href="https://www.ft.com/content/f23e0cb2-07ec-11e8-9650-9c0ad2d7c5b5">currently in the pilot stage</a>. When fully operational, it will produce a <a href="https://theconversation.com/chinas-dystopian-social-credit-system-is-a-harbinger-of-the-global-age-of-the-algorithm-88348">social credit score</a> that rates an individual citizen’s trustworthiness. These ratings can then be used to issue rewards or penalties, such as privileges in loan applications or limits on career progression.</p>
<p>These possibilities are not distant or hypothetical – they exist now. Smartphones are <a href="https://www.hrbdt.ac.uk/big-data-mass-surveillance-and-the-human-rights-big-data-technology-project/">effectively surveillance devices</a>, and everyone who uses them is exposed to these risks. What’s more, it is impossible to anticipate and detect the full range of ways smartphone data is collected and used, and to demonstrate the full scale of its impact. What we know could be just the beginning.</p>
<p class="fine-print"><em><span>Vivian Ng works for the Human Rights, Big Data and Technology Project, which is funded by the Economic and Social Research Council and the University of Essex.</span></em></p><p class="fine-print"><em><span>Catherine Kent works for the Human Rights, Big Data and Technology Project, which is funded by the Economic and Social Research Council and the University of Essex.</span></em></p>Companies are compiling your smartphone data into shockingly intimate profiles that can be used against you.Vivian Ng, Senior Research Officer, Human Rights Centre, University of EssexCatherine Kent, Project Officer, Human Rights Centre, University of EssexLicensed as Creative Commons – attribution, no derivatives.
You may be sick of worrying about online privacy, but ‘surveillance apathy’ is also a problem<figure><img src="https://images.theconversation.com/files/193512/original/file-20171107-1032-f7pvxc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Do you care if your data is being used by third parties? </span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/hipster-couple-disinterest-moment-mobile-phones-272688959?src=7ZSQySxMRZkXLshHJcT8Gw-2-87">from www.shutterstock.com </a></span></figcaption></figure><p>We all seem worried about <a href="https://theconversation.com/the-new-data-retention-law-seriously-invades-our-privacy-and-its-time-we-took-action-78991">privacy</a>. But it’s not only privacy itself we should be concerned about: our <em>attitudes</em> towards privacy also matter. </p>
<p>When we stop caring about our digital privacy, we witness surveillance apathy.</p>
<p>And it’s something that may be particularly significant for marginalised communities, who feel they hold no power to navigate or negotiate fair use of digital technologies. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/yes-your-doctor-might-google-you-74746">Yes, your doctor might Google you</a>
</strong>
</em>
</p>
<hr>
<p>In the wake of the <a href="https://www.theguardian.com/us-news/the-nsa-files">NSA leaks in 2013</a> led by Edward Snowden, we are more aware of the machinations of online companies such as Facebook and Google. Yet <a href="http://www.pewinternet.org/files/2015/03/PI_AmericansPrivacyStrategies_0316151.pdf">research shows</a> some of us are apathetic when it comes to online surveillance. </p>
<h2>Privacy and surveillance</h2>
<p>Attitudes to privacy and surveillance in Australia are complex. </p>
<p>According to a major <a href="https://www.oaic.gov.au/engage-with-us/community-attitudes/australian-community-attitudes-to-privacy-survey-2017">2017 privacy survey</a>, around 70% of us are more concerned about privacy than we were five years ago. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=848&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=848&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=848&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1066&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1066&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193680/original/file-20171107-6707-w7wmf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1066&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Snapshot of Australian community attitudes to privacy 2017.</span>
<span class="attribution"><a class="source" href="https://www.oaic.gov.au/engage-with-us/community-attitudes/australian-community-attitudes-to-privacy-survey-2017">Office of the Australian Information Commissioner</a></span>
</figcaption>
</figure>
<p>And yet we still increasingly embrace online activities. A <a href="https://www.sensis.com.au/asset/PDFdirectory/Sensis_Social_Media_Report_2017-Chapter-1.pdf">2017 report on social media</a> conducted by search marketing firm Sensis showed that almost 80% of internet users in Australia now have a social media profile, an increase of around ten points from 2016. The data also showed that Australians are on their accounts more frequently than ever before. </p>
<p>Also, most Australians appear not to be concerned about the recently proposed implementation of facial recognition technology. Only around one in three (32% of 1,486) respondents to a <a href="http://www.roymorgan.com/findings/7366-roy-morgan-snap-sms-survey-facial-recognition-surveillance-technology-october-10-2017-201710101059">Roy Morgan study</a> expressed worries about having their faces stored in a mass database.</p>
<p>An <a href="http://www.zdnet.com/article/data-retention-supported-by-two-thirds-of-australians-anu/">ANU poll</a> revealed a similar sentiment, with recent data retention laws supported by two-thirds of Australians.</p>
<p>So while we’re aware of the issues with surveillance, we aren’t necessarily doing anything about it, or we’re prepared to make compromises when we perceive our safety is at stake. </p>
<p>Across the world, attitudes to surveillance vary. Around <a href="https://www.forbes.com/sites/kashmirhill/2013/06/10/how-americans-views-on-surveillance-have-changed-over-the-last-decade-or-rather-not-changed/#671747fa4f3c">half of Americans</a> polled in 2013 found mass surveillance acceptable. France, Britain and the Philippines appeared more tolerant of mass surveillance compared to Sweden, Spain, and Germany, according to <a href="https://www.amnesty.org/en/latest/news/2015/03/global-opposition-to-usa-big-brother-mass-surveillance/">2015 Amnesty International data</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/police-want-to-read-encrypted-messages-but-they-already-have-significant-power-to-access-our-data-82891">Police want to read encrypted messages, but they already have significant power to access our data</a>
</strong>
</em>
</p>
<hr>
<h2>Apathy and marginalisation</h2>
<p>In 2015, philosopher Slavoj Žižek <a href="https://www.youtube.com/watch?v=HIxB2dUZ5ZA">proclaimed</a> that he did not care about surveillance (though he conceded that “perhaps here I preach arrogance”).</p>
<p>This position cannot be assumed by all members of society. Australian academic Kate Crawford <a href="https://thenewinquiry.com/the-anxieties-of-big-data/">argues</a> the impact of data mining and surveillance is more significant for marginalised communities, including people of different races, genders and socioeconomic backgrounds. American academics Shoshana Magnet and Kelley Gates agree, <a href="https://www.amazon.com/New-Media-Surveillance-Shoshana-Magnet/dp/0415568129">writing</a>:</p>
<blockquote>
<p>[…] new surveillance technologies are regularly tested on marginalised communities that are unable to resist their intrusion.</p>
</blockquote>
<p>A <a href="https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/docs/Big_Data_Report_Nonembargo_v2.pdf">2015 White House report</a> found that big data can be used to perpetuate price discrimination among people of different backgrounds. It showed how data surveillance “could be used to hide more explicit forms of discrimination”.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/witch-hunts-and-surveillance-the-hidden-lives-of-queer-people-in-the-military-76156">Witch-hunts and surveillance: the hidden lives of queer people in the military</a>
</strong>
</em>
</p>
<hr>
<p><a href="https://www.salon.com/2016/09/14/privacy-the-forgotten-issue-apathy-is-making-americans-vulnerable/">According to Ira Rubinstein</a>, a senior fellow at New York University’s Information Law Institute, ignorance and cynicism are often behind surveillance apathy. Users are either ignorant of the complex infrastructure of surveillance, or they believe they are simply unable to avoid it. </p>
<p>As the White House report stated, consumers “have very little knowledge” about how data is used in conjunction with differential pricing. </p>
<p>So in contrast to the oppressive panopticon (a circular prison with a central watchtower) as envisioned by <a href="https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremy-bentham">philosopher Jeremy Bentham</a>, we have what <a href="http://linkis.com/www.iasc-culture.org/EV7iK">Siva Vaidhyanathan calls</a> the “cryptopticon”. The cryptopticon is “not supposed to be intrusive or obvious. Its scale, its ubiquity, even its very existence, are supposed to go unnoticed”.</p>
<p>But Melanie Taylor, lead artist of the computer game Orwell (which puts players in the role of a surveillance agent), <a href="https://www.kotaku.com.au/2017/03/you-can-make-a-game-about-government-surveillance-but-you-cant-make-people-care/">noted</a> that many people simply remain indifferent despite heightened awareness: </p>
<blockquote>
<p>That’s the really scary part: that Snowden revealed all this, and maybe nobody really cared.</p>
</blockquote>
<h2>The Facebook trap</h2>
<p>Surveillance apathy can be linked to people’s dependence on “the system”. As one of my media students pointed out, no matter how much awareness users have regarding their social media surveillance, invariably people will continue using these platforms. This is because they are convenient, practical, and “we are creatures of habit”. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193677/original/file-20171107-6722-10lv2ib.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Are you prepared to give up the red social notifications from Facebook?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/kiev-ukraine-june-8-facebook-web-197767229?src=LuLpVdAeg7t-I6Awrp9i2w-1-18">nevodka/shutterstock</a></span>
</figcaption>
</figure>
<p>As University of Melbourne scholar <a href="https://theconversation.com/profiles/suelette-dreyfus-1102">Suelette Dreyfus</a> noted in a <a href="http://www.abc.net.au/4corners/cracking-the-code-promo/8422812">Four Corners</a> report on Facebook:</p>
<blockquote>
<p>Facebook has very cleverly figured out how to wrap itself around our lives. It’s the family photo album. It’s your messaging to your friends. It’s your daily diary. It’s your contact list.</p>
</blockquote>
<p>Another issue is the complex algorithms Facebook and Google use to collect data and produce “<a href="https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles">filter bubbles</a>” or “<a href="http://www.tandfonline.com/doi/abs/10.1080/08838151.2014.935851?src=recsys&journalCode=hbem20">you loops</a>”. </p>
<h2>Protecting privacy</h2>
<p>While some people are attempting to <a href="https://www.washingtonpost.com/news/the-intersect/wp/2017/02/10/erasing-yourself-from-the-internet-is-nearly-impossible-but-heres-how-you-can-try/">delete themselves</a> from the network, others have come up with ways to avoid being tracked online. </p>
<p>The search engine <a href="https://duckduckgo.com/">DuckDuckGo</a> and the <a href="https://www.torproject.org/projects/torbrowser.html.en">Tor Browser</a> allow users to search and browse without being tracked. <a href="https://www.theguardian.com/technology/2013/oct/28/mozilla-lightbeam-tracking-privacy-cookies">Lightbeam</a>, meanwhile, allows users to see how their information is being tracked by third-party companies. And MIT devised a system called <a href="https://immersion.media.mit.edu/">Immersion</a> to show people the metadata of their emails.</p>
<p>Surveillance apathy is more disconcerting than surveillance itself. Our attitudes towards privacy will shape the structure of surveillance, so caring about it is paramount.</p>
<p class="fine-print"><em><span>Siobhan Lyons does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Many users of digital platforms resign themselves to being monitored. That’s surveillance apathy – and it’s worse in society’s most marginalised groups.Siobhan Lyons, Scholar in Media and Cultural Studies, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.
The real costs of cheap surveillance<figure><img src="https://images.theconversation.com/files/177958/original/file-20170712-19701-qhru88.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Who's collecting your data, and what are they using your data for?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/security-camera-concept-big-brother-surveillance-566879512">Brian A. Jackson/Shutterstock.com</a></span></figcaption></figure><p>Surveillance used to be expensive. Even just a few years ago, tailing a person’s movements around the clock required rotating shifts of personnel devoted full-time to the task. Not any more, though.</p>
<p>Governments can track the movements of massive numbers of people by positioning cameras to <a href="https://www.aclu.org/feature/you-are-being-tracked">read license plates</a>, or by setting up <a href="https://www.perpetuallineup.org/">facial recognition</a> systems. Those systems need few people to operate them, <a href="http://moritzlaw.osu.edu/students/groups/is/files/2012/02/Weinberg.pdf">automating the collection of information about people’s lives</a> and adding that data to searchable databases. Surveillance has become cheap.</p>
<p>I study the <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2753363">law</a> of <a href="http://dx.doi.org/10.2139/ssrn.2753308">identification</a> <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2715594">and</a> <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1118070">privacy</a>, so I pay attention to that trend, and it’s worrying. The data maintained in our individual profiles can be used in making decisions about <a href="https://www.ftc.gov/news-events/press-releases/2015/01/ftc-issues-follow-study-credit-report-accuracy">credit</a>, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2715594">employment</a>, <a href="https://oig.ssa.gov/newsroom/blog/march24-death-reports">government benefits</a> and more. What governments and companies think they know about us – whether or not it’s accurate – has real power over our actual lives.</p>
<h2>Old-fashioned surveillance</h2>
<p>Back in the day, the high cost of surveillance made it not a big deal when the Supreme Court ruled that government agents don’t need a warrant to <a href="https://supreme.justia.com/cases/federal/us/460/276/case.html">follow a person</a> in public, to <a href="https://supreme.justia.com/cases/federal/us/486/35/case.html">sift through her trash</a> or to fly over her property and <a href="https://supreme.justia.com/cases/federal/us/476/207/case.html">observe it from the air</a>. </p>
<p>The effort needed to collect that sort of data meant that governments would engage in surveillance only rarely, and only for compelling reasons. For most Americans, little about their everyday comings and goings, likes and dislikes, hopes and dreams was tabulated and collected in any central source. But that’s now changed. </p>
<p>Because information collection is now so easy and storage is cheap, it makes sense for government to collect much more information. As a result, after 9/11, rather than the U.S. government first trying to figure out who the bad guys might be and then collecting records of who they spoke to on the phone, federal officials simply <a href="https://obamawhitehouse.archives.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf">compiled a database</a> of who every person in the U.S. was speaking to on the phone, updated in real time.</p>
<h2>Online tracking</h2>
<p>Private companies’ tracking of our lives has also become easy and cheap. <a href="https://theconversation.com/7-in-10-smartphone-apps-share-your-data-with-third-party-services-72404">Advertising network systems</a> let data brokers track nearly every page you visit on the web, and associate it with an individual profile. Facebook <a href="http://adage.com/article/digital/facebook-web-browsing-history-ad-targeting/293656/">can</a> <a href="https://www.washingtonpost.com/news/the-intersect/wp/2016/08/19/98-personal-data-points-that-facebook-uses-to-target-ads-to-you/">follow</a> much of its users’ web browsing, even if they’re not logged in.</p>
<p>Google’s tracking presence is even broader. According to one <a href="https://webtransparency.cs.princeton.edu/webcensus/">recent study</a>, Google Analytics tracks users on nearly 70 percent of the top one million websites, and Google subsidiary Doubleclick separately tracks users on almost half of the top million sites. That gives Google – or a subsidiary – access to an extensive list of who visits which websites and when. And the company can <a href="https://www.propublica.org/article/google-has-quietly-dropped-ban-on-personally-identifiable-web-tracking">combine that information with data</a> derived from people’s use of Google Maps, Gmail and other Google services. </p>
<h2>Compiling profiles</h2>
<p>Online tracking is even more powerful when it’s <a href="https://www.propublica.org/article/why-online-tracking-is-getting-creepier">merged with real-world information</a> tied to real names and identities. Facebook, for example, <a href="https://www.facebook.com/help/494750870625830?helpref=uf_permalink">combines its data with information from data brokers</a> such as Experian and Acxiom, which compile information from government records, retailers, financial institutions, social media and other sources. </p>
<p>Acxiom claims to have <a href="https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf">information about 700 million consumers around the world</a>, subdividing its information on U.S. residents into more than 3,000 categories. (That figure may be overstated, but even with a decent discount for skepticism, that’s a lot of information.)</p>
<p>Another company, <a href="https://www.theworknumber.com/">The Work Number</a>, a subsidiary of credit bureau Equifax, maintains detailed salary and other payroll-related information for <a href="https://www.propublica.org/article/everything-we-know-about-what-data-brokers-know-about-you">more than one-third of working Americans</a>. Retailer loyalty cards are another source of data – Datalogix, <a href="http://www.oracle.com/us/corporate/acquisitions/datalogix/index.html">a subsidiary of database giant Oracle</a>, aggregates data on consumer purchases, including sales that <a href="https://www.oracle.com/us/assets/health-wellness-data-segments-2537888.pdf">suggest medical conditions or other personal concerns</a>, such as weight loss pills, allergy treatments and hair removal products.</p>
<p>By combining online and offline data, Facebook can charge premium rates to an advertiser who wants to target, say, people in Idaho who are in long-distance relationships and are thinking about buying a minivan. (There are <a href="https://www.nytimes.com/2017/05/14/business/media/advertisers-streaming-video-broadcast-tv.html">3,100</a> of them in Facebook’s database.) If you want to reach users with an interest in Ramadan who have recently returned from overseas trips, Facebook can find them too.</p>
<h2>Taking action</h2>
<p>Today, credit bureaus evaluate financial data – income and employment history, debt repayment records and public information like bankruptcy filings and foreclosures – to decide a person’s creditworthiness. But companies and government agencies can crunch through all these data to find correlations they hadn’t recognized before – and then take action based on those findings, sometimes in <a href="http://www.niemanlab.org/2017/01/are-those-creepy-web-ads-that-learn-your-preferences-and-follow-you-around-online-also-discriminatory/">discriminatory and socially undesirable ways</a>. </p>
<p>For example, online sellers may charge <a href="https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf">higher prices</a> to customers from poorer ZIP codes, where there is less competition from brick-and-mortar stores. A credit card company downgraded consumers’ <a href="https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf">creditworthiness</a> if they had used their cards to pay for marriage counseling or tire repair services. A major <a href="http://stopthecap.com/2016/06/29/cable-one-price-depends-credit-worthiness/">cable TV company</a> developed procedures to discourage prospective customers with low credit scores from signing up, because data analytics revealed that those customers were less lucrative than others.</p>
<p>United States law – unlike the <a href="https://www.privacy-regulation.eu/en/15.htm">law in Europe</a> – gives ordinary people no general right to see their own digital profiles, so we have little opportunity to correct inaccuracies. But even if everything in a profile is accurate, there’s still a big problem: Proprietors’ use of our information in this way encodes discrimination in automated decisions. It means that people who have had marriage counseling, say, or who live in poor neighborhoods are treated as second-class citizens in a wide range of everyday transactions and interactions. That’s not a recipe for a healthy society. </p>
<h2>The rise of social credit?</h2>
<p>All this could spread very deeply into our lives, raising concerns about invasions of privacy. What if credit bureau ratings incorporated the creditworthiness of an applicant’s friends? Or her educational background, the make of her car or whether she uses all capital letters in her text messages? The U.S. Consumer Finance Protection Bureau has <a href="https://www.federalregister.gov/documents/2017/02/21/2017-03361/request-for-information-regarding-use-of-alternative-data-and-modeling-techniques-in-the-credit">opened an inquiry</a> into the dangers such practices might pose.</p>
<p>The People’s Republic of China has begun to construct a souped-up version of the financial credit bureau that, according to some reports, would look even more broadly at a person’s life. In that system, <a href="https://www.washingtonpost.com/world/asia_pacific/chinas-plan-to-organize-its-whole-society-around-big-data-a-rating-for-everyone/2016/10/20/1cd0dd9c-9516-11e6-ae9d-0030ac1899cd_story.html">every citizen would have a score</a> incorporating not only financial data, but also “anything from defaulting on a loan to criticizing the ruling party, from running a red light to failing to care for your parents properly.” The score would affect what jobs an individual could get, what schools her children could attend, even whether she could <a href="https://chinacopyrightandmedia.wordpress.com/2016/09/25/opinions-concerning-accelerating-the-construction-of-credit-supervision-warning-and-punishment-mechanisms-for-persons-subject-to-enforcement-for-trust-breaking/">get a reservation at a fancy restaurant</a>. </p>
<p>Those features haven’t been implemented yet; so far, the system is <a href="https://www.newscientist.com/article/dn28314-inside-chinas-plan-to-give-every-citizen-a-character-score/">more limited</a>. Western news reports have <a href="https://www.economist.com/news/briefing/21711902-worrying-implications-its-social-credit-project-china-invents-digital-totalitarian">decried this plan as totalitarian</a>. It’s worth asking, though, what direction we in the United States are headed in.</p>
<p>Indeed, it’s worth thinking about all of this more deeply. U.S. firms – unless they’re managed or regulated in socially beneficial ways – have both the incentive and the opportunity to use information about us in undesirable ways. We need to talk about government enacting rules to constrain that activity. After all, leaving those decisions to the people who make money selling our data is unlikely to result in our getting the rules we want.</p>
<p class="fine-print"><em><span>Jonathan Weinberg has done pro bono work for the American Civil Liberties Union relating to constitutional and immigration law, and has contributed a modest amount of money to the Electronic Frontier Foundation. This article is based on his own research and scholarship, and does not necessarily represent the views of the ACLU or the EFF.</span></em></p>What governments and companies think they know about us – whether or not it’s accurate – has real power over our actual lives.Jonathan Weinberg, Professor of Law, Wayne State UniversityLicensed as Creative Commons – attribution, no derivatives.
Act now to protect your digital rights, Big Brother and his Little Sisters may be watching<figure><img src="https://images.theconversation.com/files/171969/original/file-20170602-25658-xifht4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Do you know who has the rights to access your digital data? And who might be interested in acquiring that information?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/west_point/8657487759/">West Point-US Military Academy/Flickr </a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p><em>This article is part of the <a href="https://theconversation.com/au/topics/democracy-futures">Democracy Futures</a> series, a <a href="http://sydneydemocracynetwork.org/democracy-futures/">joint global initiative</a> between The Conversation and the <a href="http://sydneydemocracynetwork.org/">Sydney Democracy Network</a>. The project aims to stimulate fresh thinking about the many challenges facing democracies in the 21st century.</em></p>
<hr>
<p>Imagine China takes down its national internet blocking system – aka the <a href="https://www.washingtonpost.com/world/asia_pacific/chinas-scary-lesson-to-the-world-censoring-the-internet-works/2016/05/23/413afe78-fff3-11e5-8bb1-f124a43f84dc_story.html?utm_term=.2aa985b56cdc">Great Firewall</a> – tomorrow. Will this affect how you use the internet?</p>
<p>Without the Great Firewall, Facebook and Google will grow exponentially in China. Before long, the tech giants own a sizeable share of the Chinese market and have become good buddies with Beijing. </p>
<p>This scenario unfolds at a time when Donald Trump’s inward-looking policy upsets Silicon Valley’s efforts to expand its global empire, and when the US Congress <a href="https://medium.freecodecamp.com/how-to-set-up-a-vpn-in-5-minutes-for-free-and-why-you-urgently-need-one-d5cdba361907">further deregulates</a> the internet industry, allowing internet service providers (ISPs), for example, to collect and trade users’ private data. So the tech giants decide to go to bed with China. </p>
<p>What does this have to do with you using your smartphone in, say, Sydney? </p>
<p>Well, if you have a Facebook presence, it means your social network information may now be used in a few additional ways, without your knowledge. Perhaps a few China-bashing news items, shared by your friends, will disappear from your news feed. And if you rely on Google, YouTube, Amazon or Uber, the data you accumulate during your daily routines may now empower not just the Little Sisters (that is, advertising companies), but also Big Brother himself.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/1pT_t34mnWc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">“We want to help the rest of the world connect with China.”</span></figcaption>
</figure>
<p>According to urban geographer and unionist <a href="https://twitter.com/kurtiveson">Kurt Iveson</a>, surveillance cameras at the University of Sydney generate half of the internet traffic on campus. All the research, the paperwork, the social media back-and-forth, the videos people watch and the online games and music they play, all this online traffic, when added together, barely matches the terabytes of information generated by the surveillance feed. </p>
<p>That’s a pretty big achievement for those tiny cameras looking down at you in the corridors and from the street lamps. </p>
<h2>The <em>‘big’</em> in Big Brother and Big Data</h2>
<p>China has big ambitions. Its interests and investments in infrastructure on a global scale are <a href="https://theconversation.com/the-belt-and-road-initiative-chinas-vision-for-globalisation-beijing-style-77705">well known</a>. It will only be a matter of time before Beijing realises that digital assets are as vital as, and perhaps even more valuable than, highways and airports.</p>
<p>The Chinese Communist Party already has a good record of endorsing corporate platforms in the <a href="http://www.investopedia.com/terms/n/neweconomy.asp">New Economy</a>. Last November, China embraced the “disruptive” innovation of Uber and similar services. It became the first country to <a href="https://www.ft.com/content/dc63e5ce-54ab-11e6-9664-e0bdc13c3bef">legalise</a> the smartphone ride-hailing business on a national scale. </p>
<p>In contrast, Japanese and European cities have long <a href="https://www.theguardian.com/technology/2016/jun/09/uber-suffers-legal-setbacks-in-france-and-germany">banned</a> Uber from their streets. <a href="https://theconversation.com/when-uber-is-legal-the-taxi-industry-will-have-nowhere-to-hide-48820">Australians</a> and <a href="https://theconversation.com/uber-drivers-stuck-in-legal-limbo-as-us-labor-laws-fail-to-keep-up-43542">Americans</a> continue to debate the <a href="https://theconversation.com/why-people-trust-sharing-economy-strangers-more-than-their-colleagues-70669">ethics</a> and <a href="https://theconversation.com/uber-vs-regulators-the-heavyweight-bout-of-2015-45932">legalities</a> of the start-up service. </p>
<p>In response to the warm embrace, Uber <a href="https://www.ft.com/content/dc63e5ce-54ab-11e6-9664-e0bdc13c3bef">praised China</a> as:</p>
<blockquote>
<p>… a country that has consistently shown itself to be forward-thinking when it comes to business innovation.</p>
</blockquote>
<p>Now you probably see why Silicon Valley might want to divorce Trump and have an affair behind Tiananmen. </p>
<h2>Your digital rights</h2>
<p>Maybe it’s not such a good idea, after all, to hastily agree to whatever terms and conditions tech companies hand down to you in tedious fine print. You don’t know your rights. You don’t know who has your data. But do you care? </p>
<p>As an individual, your power is limited. Using a <a href="https://medium.freecodecamp.com/how-to-set-up-a-vpn-in-5-minutes-for-free-and-why-you-urgently-need-one-d5cdba361907">virtual private network</a> (VPN) can be a good start, but which VPN service can you really trust? This is a pertinent question because what if the VPN you use turns out to be a <a href="https://best-vpn.reviewster.com/according-to-anonymous-vpnbook-is-a-big-fat-honeypot/">honeypot</a> collecting data about you?</p>
<p>Your best shot, then, is to join a movement – such as a citizen group – to raise awareness or a watchdog organisation that guards against the mishandling of private data by telecommunication companies.</p>
<p>Other good places to seek refuge and spread the good word include non-government organisations that promote solidarity with IT-sector workers and hacker groups who develop new crypto technology. You don’t have to know programming or coding to join them, as even the best hackers will need other kinds of help. </p>
<p>Cities like Sydney have many such organisations. Plenty of <a href="https://www.efa.org.au/">folks</a> are working on digital rights issues. <a href="https://savesecurity.org/?from=banner#strong-security-saves-lives">Join them</a> to protect your data from being infringed by Big Brother, his Little Sisters, and even telcos and ISPs. </p>
<p>Even if China doesn’t plan to take down its Great Firewall any time soon, that doesn’t make protecting your own data – personal information that reveals so much about your life – any less important. </p>
<p>As long as you have signed over your rights to corporations, they can still sell out big to Beijing, Moscow or whoever else is peeping from afar, at this very moment, into your campus or workplace CCTV system.</p>
<p class="fine-print"><em><span>Jack Linchuan Qiu does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>Sooner or later, China will recognise the value of digital assets. This adds to the urgency of citizens ensuring they control the data trails that tell the world what they think and do.</em></p>
<p><em>Jack Linchuan Qiu, Professor, School of Journalism and Communication, Chinese University of Hong Kong. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>What will the UK election mean for online privacy?</h1>
<figure><img src="https://images.theconversation.com/files/172231/original/file-20170605-31044-rmlo2e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><a class="source" href="https://pixabay.com/en/spyware-cyber-cyber-crime-security-2319403/">Pixabay</a></span></figcaption></figure>
<p>The recent cyber attack that <a href="https://theconversation.com/nhs-ransomware-cyber-attack-was-preventable-77674">crippled the NHS</a> demonstrated why cyber-security is a vital issue and one that can affect an entire country. The recent <a href="https://theconversation.com/manchester-attack-we-are-in-an-arms-race-against-ever-adapting-terror-networks-78227">terrorist attack in Manchester</a> also reminded people what’s at stake when deciding what <a href="https://theconversation.com/what-the-manchester-attack-leaks-mean-for-the-uk-us-intelligence-sharing-relationship-78415">data gathering and surveillance</a> powers the government should have.</p>
<p>So how are the main UK-wide political parties proposing to tackle online security and privacy after the 2017 general election?</p>
<h2>Conservative party</h2>
<p>The <a href="https://www.conservatives.com/manifesto">Conservative manifesto</a> appears to have the most to say about individual data privacy and takes a bold position on cyber-security. Despite having introduced the <a href="http://services.parliament.uk/bills/2015-16/investigatorypowers.html">Investigatory Powers Act</a>, which allows the government to access detailed records of everyone’s internet activity over the past 12 months, the Conservatives seem so concerned about privacy that the word appears six times in the manifesto.</p>
<p>It pledges data safety through new legislation, stating the party “will deliver protections for people’s data online, backed by a new data protection law”. Yet the manifesto says little about what shape this would actually take and whether it will align with forthcoming regulatory changes. </p>
<p>Any organisation handling EU consumer data will be <a href="http://www.bbc.co.uk/news/business-40110402">forced to comply</a> with the new <a href="https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/">General Data Protection Regulation</a> (GDPR) that comes into force in May 2018. Because the Conservatives’ position on their new data privacy law is unclear, it adds yet another level of uncertainty and potentially new challenges for data compliance.</p>
<p>The Conservatives also plan to make online regulation more similar to that governing the offline world. They promise to develop a digital charter that will bring individual privacy to the forefront of the technology debate, yet make online service providers share responsibility for privacy protection.</p>
<p>There is also an indication that technology companies will be obligated to give the government access to any encrypted communications and data. This would mean creating a backdoor to personal data, <a href="https://theconversation.com/how-whatsapp-encryption-works-and-why-there-shouldnt-be-a-backdoor-75266">undermining the secure nature</a> of encrypted messages, including popular services such as WhatsApp. Given the increasing challenge of keeping data safe from cyber attacks – and that public sector and government services are particular targets for hackers – the government should think carefully before trying to justify such a drastic move.</p>
<p>Another hallmark promise from the Conservatives revolves around children’s safety online: social media companies would be required to delete information about young people when they turn 18. Erasing millions of profiles across something like 20 social platforms, with data storage spread across the world, is a tall order. And what if users don’t want their data deleted, or want to keep part of it? Such a requirement could be seen as a big burden for social media firms.</p>
<p>But the Conservatives go further still by suggesting that they will also introduce an industry-wide levy from internet and communication companies to fund online safety and protection campaigns, similar to the approach taken with the gambling industry. While there is <a href="https://theconversation.com/is-social-media-to-blame-for-the-worsening-mental-health-of-teenage-girls-64333">some evidence</a> of links between social media and mental health issues, equating the internet with gambling is a big step to take by a party otherwise so keen to make the digital economy central to its manifesto.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/172232/original/file-20170605-31044-gxdg4g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Under surveillance.</span>
<span class="attribution"><a class="source" href="https://pixabay.com/en/ransomware-cyber-crime-malware-2320941/">Pixabay</a></span>
</figcaption>
</figure>
<h2>Labour party</h2>
<p>Those keen to find out more about Labour’s position towards data privacy will find a rather opaque <a href="http://www.labour.org.uk/page/-/Images/manifesto-2017/Labour%20Manifesto%202017.pdf">manifesto</a>. It says: “Labour is committed to growing the digital economy and ensuring that trade agreements do not impede cross-border data flows, whilst maintaining strong data protection rules to protect personal privacy.” Very little light is shed on what laws would underpin these rules, but it seems very likely that a Labour government would keep the GDPR in its current format.</p>
<p>The manifesto also proposes the appointment of a digital ambassador to liaise with technology companies, promoting Britain as an “attractive place for investment”. But little is said about how this potential ambassador would affect data privacy issues.</p>
<p>Labour’s position on security also lacks definition. Although it admits that individual rights and civil liberties are at times compromised, it promises to apply investigatory powers proportionately and only when necessary. The party would also continue to “maintain the cross-border security co-operation agreements with our intelligence partners in Europe and beyond”. </p>
<h2>Liberal Democrats</h2>
<p>The Liberal Democrats stand on the other end of the spectrum. They promise to end the mass surveillance powers of the Investigatory Powers Act and oppose the unrestricted collection of communications data and internet records. They also propose a digital “Bill of Rights” to protect individuals’ privacy and give them more control over their online data.</p>
<p>The Lib Dems’ <a href="http://www.libdems.org.uk/manifesto-glance">manifesto</a> also pledges to counter the Conservatives’ efforts to create backdoors to encryption mechanisms.</p>
<h2>Which way?</h2>
<p>With such a variety of positions on data privacy and digital surveillance, the main parties have given the electorate some clear options to consider. A big one is what a proportionate use of cyber-surveillance looks like. But there are also serious questions about how our data is protected online and whether some of the measures proposed will even work. The Conservative party manifesto promises that the UK will be “the safest place to be online”. That’s an awfully big claim in such an interconnected world.</p>
<p class="fine-print"><em><span>Vladlena Benson is affiliated with KU, ISACA.</span></em></p>
<p><em>UK politicians are planning very different approaches to data privacy, security and surveillance.</em></p>
<p><em>Vladlena Benson, Associate Professor, Department of Accounting, Finance and Informatics, Kingston University. Licensed as Creative Commons – attribution, no derivatives.</em></p>