Canada should not fall behind on implementing safety measures for children online<figure><img src="https://images.theconversation.com/files/568485/original/file-20240109-19-9s4t5o.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6720%2C4476&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Bill S-210, designed to protect minors online from exposure to sexually explicit material, passed a second reading in the House of Commons in December 2023.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure>
<p>Recent legislation on age verification for adult-content sites has sparked debate in Canada’s Parliament. On Dec. 13, <a href="https://www.parl.ca/legisinfo/en/bill/44-1/s-210">Bill S-210</a>, An Act to restrict young persons’ online access to sexually explicit material, passed second reading in the House of Commons with <a href="https://www.ourcommons.ca/Members/en/votes/44/1/609">a vote of 189 to 133</a>. </p>
<p>Surprisingly, most Liberal MPs voted against it, as the government has been working on its own <a href="https://www.canada.ca/en/canadian-heritage/campaigns/harmful-online-content.html">online harms bill</a>. The online harms bill was first promised in 2019 but has yet to be tabled due to the <a href="https://www.cbc.ca/news/canada/british-columbia/online-protection-act-1.7042880">broader complications</a> it is dealing with. </p>
<p>With full support from the Conservatives, NDP, Bloc Québécois and some Liberal MPs, Bill S-210 managed to proceed for a committee review. The bill had successfully passed the Senate in the spring of 2023.</p>
<p>Bill S-210 proposes that, before accessing sites with adult content, all users go through a mandatory age verification process to prove they are adults. Age verification is already compulsory for accessing gambling sites and sites that sell products like alcohol, tobacco and cannabis. </p>
<h2>Protecting minors</h2>
<p>Similar legislation to Bill S-210 has been successfully passed or implemented in various parts of the world, including <a href="https://euconsent.eu/">the European Union</a>, <a href="https://www.internetmatters.org/resources/uk-age-verification-law-for-pornography-sites-explained-parent-guide/">the United Kingdom</a> and <a href="https://www.nytimes.com/2023/04/30/business/louisiana-kids-age-porn-law.html">several states</a> in the United States. </p>
<p>Yet Canadian lawmakers have divided opinions on this bill. Critics of Bill S-210 have raised <a href="https://www.cbc.ca/news/politics/porn-site-age-verification-proposed-bill-1.7060841">strong concerns</a> about privacy and freedom of expression. </p>
<p>My PhD research focuses on anonymous age verification systems to protect users’ privacy. I also voluntarily consult with the Digital Governance Council of Canada to develop <a href="https://dgc-cgn.org/standards/find-a-standard/">technical standards for age-verification technologies</a>.</p>
<p>When discussing privacy and security during online age verification, we need to consider some key factors.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a young boy stares at a laptop screen in the dark" src="https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/568639/original/file-20240110-27-bcyhld.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Supporters of Bill S-210 say it will protect children, while critics of the bill have raised strong concerns about privacy and freedom of expression.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Online age verification</h2>
<p>While <a href="https://avpassociation.com/avmethods/">different mechanisms</a> exist for online age verification, the most popular methods are ID document matching, facial recognition and third-party verification. </p>
<p>ID document matching is a common method for age verification in in-person transactions. For instance, individuals are required to present government-issued ID documents, such as a driver’s licence or health card, when purchasing alcohol from a physical store. Similarly, in online transactions, users can upload an image of their ID. </p>
<p>Then <a href="https://www.britannica.com/technology/OCR">optical character recognition</a> technology is used to extract data from the document, particularly the date of birth. Additionally, a <a href="https://www.incognia.com/the-authentication-reference/what-is-liveness-detection">liveness check</a> may be conducted by comparing the photo on the document with an instant photo of the user to ensure authenticity.</p>
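<p>Once the date of birth has been extracted, the verification step itself reduces to a date comparison. A minimal sketch in Python, assuming the OCR stage has already produced the date of birth as an ISO-formatted string (the function name and threshold are illustrative):</p>

```python
from datetime import date
from typing import Optional

def is_adult(dob_str: str, today: Optional[date] = None, threshold: int = 18) -> bool:
    """Return True if a person born on dob_str (ISO format, YYYY-MM-DD)
    is at least `threshold` years old on `today`."""
    today = today or date.today()
    dob = date.fromisoformat(dob_str)
    # Count full years elapsed, subtracting one if this year's
    # birthday has not yet occurred.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= threshold
```

<p>The hard part of the pipeline is everything before this comparison: reliably reading the document and confirming it belongs to the person presenting it.</p>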
<p>Users may also verify their age through authorized third parties, such as their credit cards or bank accounts. This method leverages existing relationships and information held by these trusted entities to confirm the user’s age.</p>
<p>Biometric-based age verification has been an emerging field during the last decade, thanks to artificial intelligence. Researchers are exploring <a href="https://doi.org/10.1109/MS.2020.3044872">different biometrics</a> for estimating age, including <a href="https://www.yoti.com/blog/yoti-myface-liveness-white-paper/">facial images and videos</a>, <a href="https://doi.org/10.1109/ICPCSN58827.2023.00082">speech</a>, <a href="https://doi.org/10.1109/ICACC-202152719.2021.9708286">fingerprints</a>, <a href="https://doi.org/10.1109/RTSI55261.2022.9905194">heart signals</a> and <a href="https://doi.org/10.1049/ic.2013.0258">irises</a>. </p>
<p>During facial analysis, users are requested to provide a live selfie in the form of an image or video, which is then analyzed by AI-based tools to estimate their ages. This method has been <a href="https://iapp.org/news/a/how-facial-age-estimation-technology-can-help-protect-childrens-privacy-for-coppa-and-beyond/">extensively tested</a> and is now deployed by various entities in different countries, including <a href="https://www.telegraph.co.uk/business/2023/12/15/google-develops-selfie-scanning-block-children-porn/">Google</a> and <a href="https://www.bbc.com/news/technology-63544332">Meta</a>.</p>
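<p>Because an AI estimator can be off by a few years, deployments typically do not compare the raw estimate against the legal age directly; borderline users are routed to a stronger check instead. A sketch of that decision logic, with illustrative class names and thresholds rather than any vendor’s actual API:</p>

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    years: float         # model's point estimate of the user's age
    error_margin: float  # expected absolute error of the model, in years

def decide(estimate: AgeEstimate, legal_age: int = 18) -> str:
    """Three-way decision: pass clearly-adult users, reject clearly-underage
    ones, and route borderline cases to a stronger check such as an
    ID document. (Illustrative logic only.)"""
    if estimate.years - estimate.error_margin >= legal_age:
        return "allow"
    if estimate.years + estimate.error_margin < legal_age:
        return "deny"
    return "fallback_to_id_check"
```

<p>Widening the margin trades convenience for safety: more adults are asked for ID, but fewer minors slip through on an optimistic estimate.</p>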
<h2>Less invasive option</h2>
<p>When different options are available, users can choose the one they feel most comfortable with. The euCONSENT project is a network funded by the European Commission to protect children online. The network recently ran a <a href="https://euconsent.eu/a-summary-of-the-achievements-and-lessons-learned-of-the-euconsent-project-and-what-comes-next/">comprehensive pilot</a> on online age verification among 2,000 children and adults across five European countries. </p>
<p>Participants’ feedback showed that facial estimation was the top choice, preferred by 68 per cent of participants, who considered it easy, fast and less invasive. Third-party verification (through a credit card) was the least chosen option, preferred by only three per cent of participants.</p>
<p>Users’ personal data (ID documents, facial images or bank information) needs to be protected by enforcing strict regulations, similar to the EU’s <a href="https://gdpr-info.eu/">General Data Protection Regulation</a> policies. </p>
<p>Bill S-210 proposes to implement reliable age verification methods that will collect users’ personal information solely for verification purposes, and the data will be destroyed immediately after verification.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a man's face being scanned by his mobile phone" src="https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=316&fit=crop&dpr=1 600w, https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=316&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=316&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=398&fit=crop&dpr=1 754w, https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=398&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/568643/original/file-20240110-17-nwjjlf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=398&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Facial analysis can be used to determine a user’s age.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Challenges posed by VPNs</h2>
<p><a href="https://www.pcmag.com/how-to/what-is-a-vpn-and-why-you-need-one">Virtual Private Networks (VPNs)</a> are often used to evade age verification. Users route internet traffic through servers in different locations, making it appear as if they are accessing content from a region without age restrictions. </p>
<p>This challenge can be tackled by <a href="https://www.apnic.net/ip-geolocation-service-providers/">IP geolocation services</a>, which compare a user’s claimed location with their actual IP address, helping to identify any discrepancies.</p>
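<p>In outline, such a service maps an IP address to a routed network prefix with a known location and flags disagreement with the user’s claimed location. A toy sketch, with a two-entry lookup table standing in for a commercial geolocation database (the prefixes are documentation-reserved ranges and the country assignments are invented):</p>

```python
import ipaddress

# Toy table standing in for a commercial IP-geolocation database, which
# maps millions of routed prefixes to locations. Entries are invented.
GEO_DB = {
    ipaddress.ip_network("203.0.113.0/24"): "CA",
    ipaddress.ip_network("198.51.100.0/24"): "NL",
}

def country_for(ip: str):
    """Return the country associated with an IP address, or None if unknown."""
    addr = ipaddress.ip_address(ip)
    for net, country in GEO_DB.items():
        if addr in net:
            return country
    return None

def location_mismatch(claimed_country: str, ip: str) -> bool:
    """Flag a discrepancy between the claimed country and the IP-derived one,
    e.g. a user claiming an unrestricted region while their traffic exits a
    VPN server elsewhere."""
    observed = country_for(ip)
    return observed is not None and observed != claimed_country
```

<p>Real services layer additional signals on top of prefix lookups, since VPN exit nodes and mobile carriers make a single IP-to-location mapping unreliable on its own.</p>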
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/explainer-what-is-a-virtual-private-network-vpn-12741">Explainer: what is a virtual private network (VPN)?</a>
</strong>
</em>
</p>
<hr>
<h2>Protecting children</h2>
<p>Along with technological readiness, social awareness is also crucial to ensure proper adoption of age-verification measures, which takes us back to the legislative aspects. </p>
<p>The number of online sexual luring cases involving children <a href="https://www.cbc.ca/news/canada/manitoba/social-media-online-child-luring-reports-spike-canada-1.6739824">has increased 10-fold</a> in the last five years in Canada. We have experienced tragic incidents of kids dying by suicide after being victimized online. Last October, <a href="https://www.cbc.ca/news/canada/british-columbia/police-link-suicide-of-12-year-old-prince-george-b-c-boy-to-online-sexual-extortion-1.7041185">a 12-year-old B.C. boy died by suicide</a> after falling victim to online sextortion.</p>
<p>So, the question is: how long do we need to wait before measures are in place to protect children? Canada cannot afford to trail behind any longer. It is now time to move forward and make online spaces safe.</p>
<p class="fine-print"><em><span>Azfar Adib received funding from MITACS.</span></em></p>
<p class="fine-print"><em><span>Azfar Adib, Public Scholar &amp; PhD Candidate, Electrical and Computer Engineering, Concordia University. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<hr>
Family vlogs can entertain, empower and exploit<figure><img src="https://images.theconversation.com/files/548388/original/file-20230914-27-rfrjml.jpg?ixlib=rb-1.1.0&rect=0%2C23%2C5329%2C3523&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Family vlogs can be a double-edged sword that provides families with income, but also leads to exploitation.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure>
<p>YouTube channels belonging to American content creator Ruby Franke were recently <a href="https://globalnews.ca/news/9960389/ruby-franke-youtube-kevin-jodi-hildebrandt/">scrubbed from the site</a> after the YouTuber was charged with child abuse. Franke was known for making parenting videos on her YouTube channel, 8 Passengers. Her videos frequently featured content on the family and her six children.</p>
<p>Police in Utah said the charges were laid after Franke’s 12-year-old son <a href="https://www.sltrib.com/news/politics/2023/09/05/heres-what-we-know-about-arrest/">climbed out of the window</a> of a home and went to a neighbour to ask for food and water. Police said the boy and his younger sister were found emaciated and required hospitalization. </p>
<p>As blogs and live journals gather internet dust, <a href="https://www.wix.com/blog/photography/how-to-vlog">vlogging</a> has emerged as a new source of intimate entertainment and, for creators, potential income. However, vlogs also raise serious questions about exploitation and the privacy rights of children.</p>
<h2>What is vlogging?</h2>
<p>Vlogs are videos, usually published through social media, that share the creator’s personal thoughts and experiences. Family vlogs like Franke’s are a popular form of this medium, where parents take viewers into their homes. The content might involve taking viewers along on the family’s daily routine. Family vlogging channels upload videos sharing <a href="https://www.youtube.com/watch?v=cq1hI0Mmyic">significant milestones</a>, <a href="https://www.youtube.com/watch?v=OxUHjIFkeIk&t=401s">morning routines</a> and <a href="https://www.youtube.com/watch?v=KkpvqOUrWec">preparing for school</a>. </p>
<p>Many might feel uneasy about <a href="https://theconversation.com/want-to-be-a-social-media-influencer-you-might-want-to-think-again-203306">content creation</a> that showcases private family life. However, at the same time, vlogs might offer families agency and alternative means of making ends meet at a time of stagnant wages and soaring living costs.</p>
<p>Thinking about vlogging as a kind of social reproduction allows us to think through the double-edged sword of content creation. Social reproduction refers to the labour of <a href="https://doi.org/10.1111/1467-8330.00207">lifemaking</a>: the day-to-day work of care, education and sustenance. <a href="https://doi.org/10.1177/0309132518791730">Feminist theorists</a> use this term to think about the ways in which caring labour supports and shapes our social, political and economic world.</p>
<p>Social reproduction is “<a href="https://doi.org/10.1111/1467-8330.00207">the fleshy, messy and indeterminate stuff of everyday life</a>.” It involves the responsibilities and relationships involved in maintaining daily life.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man and two young children sit in front of cameras and a laptop." src="https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/544800/original/file-20230825-21-qhucf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Many might feel uneasy about content that showcases private family life. However, vlogs offer alternative means of making ends meet.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>A response to the pressures of parenting</h2>
<p>Family vlogging did not develop in a vacuum. Instead, the trend towards “mumpreneurs” emerged from within a <a href="https://newleftreview.org/issues/ii100/articles/nancy-fraser-contradictions-of-capital-and-care">care crisis</a>. The cost of living is rising, wages are stagnating, and government benefits do not provide the support families need. Parents — and mothers in particular — are facing significant pressures when it comes to caring for children and the household.</p>
<p>Gender equity in the workforce has risen; however, there is still <a href="https://theconversation.com/we-can-we-reduce-gender-inequality-in-housework-heres-how-58130">huge inequity</a> when it comes to work in the home. Women are working unprecedented (paid and unpaid) hours, and are often told they are <a href="https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast/series/women-work-more/143-amanda-watson.html">failing at both</a>.</p>
<p>As a response to these pressures, mothers developed their own online communities to express the <a href="https://jarm.journals.yorku.ca/index.php/jarm/article/view/40238">highs and lows of parenting</a>. These communities began as <a href="https://doi.org/10.1080/1369118X.2016.1187642">“mommy blogs,”</a> but have increasingly moved to vlog format over the years. </p>
<p>Family vlogs can offer intimate counter-narratives to the expectations of parenthood. Mothers can share <a href="https://doi.org/10.1177/17504813221123663">the anxieties and pressures they face</a> and offer support to one another.</p>
<h2>Commodifying families</h2>
<p>However, there can be downsides to the trend. Many family vlogs are highly curated productions that can perpetuate ideas about what constitutes “good” motherhood, rather than challenge racialized, gendered and classist <a href="https://doi.org/10.1177/2056305117707186">ideals of motherhood</a>. In this way, vlogs are less about connection and more about commodification.</p>
<p>The implications of this monetization are complex. Performing <a href="https://doi.org/10.1093/ccc/tcy008">socially desirable</a> forms of motherhood can reproduce racial, sexual and class-based exclusion around who does and who does not count as a good mother. Dominant ideas of “motherhood” are shaped by heterosexual family structures, and there is a <a href="https://www.penguinrandomhouse.com/books/37354/women-race-and-class-by-angela-y-davis/">long history</a> of surveilling and <a href="https://utorontopress.com/9781442691520/exalted-subjects/">disciplining</a> racialized parents.</p>
<p>YouTube <a href="https://support.google.com/youtube/answer/72851">creators</a> depend on <a href="https://www.youtube.com/intl/en_ph/creators/how-things-work/video-monetization/">viewership and subscribers</a> to monetize their content. They also use YouTube advertisements, sponsorships and brand deals to generate income. While some creators can make millions of dollars, most do not. Many are precarious workers with fluctuating incomes determined by <a href="https://support.google.com/youtube/answer/141805#zippy=%2Chow-does-youtube-choose-what-videos-to-promote%2Chow-are-videos-ranked-on-home">YouTube’s algorithm</a>. </p>
<p>On the other hand, content creation allows mothers to rebel against economic insecurity by making their motherhood a source of income. While this offers a means of paying the bills, who benefits and who doesn’t when a certain version of the family is commodified? </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man and a young girl preparing food in a kitchen while a smartphone films" src="https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/544801/original/file-20230825-15-k4cmur.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Many content creators are dependent on social media algorithms that determine what content gets the most views.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Kids and clickbait: What is the law?</h2>
<p>Exploitation is twofold for family vloggers. Firstly, in the United States, parents are considered responsible both for protecting their underage children’s private information and for providing consent on their behalf. Many influencers live in or move to the U.S. for <a href="https://www.cbc.ca/player/play/1987946563736">creator funds</a> and better networking opportunities. This becomes an issue when <a href="https://theconversation.com/why-arent-there-any-legal-protections-for-the-children-of-influencers-196463">parents exploit their children</a> while also being <a href="https://www.newsweek.com/youtube-lets-lawless-lucrative-sharenting-industry-put-kids-mercy-internet-1635112">in charge of providing consent</a>. </p>
<p>Secondly, <a href="https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45530.pdf">social media algorithms</a> determine whether a video becomes popular on a platform, and those algorithms <a href="https://www.youtube.com/intl/en_ca/creators/how-things-work/content-creation-strategy/">prioritize content that gains the most views</a>.</p>
<p>The algorithms can <a href="https://theconversation.com/want-to-be-a-social-media-influencer-you-might-want-to-think-again-203306">change without warning</a>, so creators never know if their content will remain popular. If family vloggers choose to stop showcasing their children on their channels, they might <a href="https://www.popsugar.com/family/posting-kids-faces-social-media-privacy-49045872">lose viewership</a> and priority within the algorithm.</p>
<p>Existing U.S. laws are unequipped to handle this new form of child labour. <a href="https://www.washingtonpost.com/history/2023/08/25/illinois-child-influencer-earnings-law-history-jackie-coogan/">The Coogan Act</a> attempts to protect the income of child performers, but it does not account for the unique conditions of child social media stars. </p>
<p>Most recently, <a href="https://www.nbcnews.com/news/child-influencers-law-illinois-reaction-rcna99831">Illinois is the first U.S. state</a> to pass a law to ensure child influencers featured in monetized videos receive financial compensation. The law will take effect in July 2024, and there is hope that other states will follow suit. </p>
<p>This is a good start, but it is not enough. Policymakers should also look at the steps France has taken to protect child influencers. In 2020, the country passed a law that gives children the <a href="https://www.bbc.com/news/world-europe-54447491">right to be forgotten</a>. This means child influencers can request that a platform remove content featuring them, without needing their parents’ permission.</p>
<p>Laws need to include more than financial compensation for child influencers. There need to be regulations that protect children’s privacy, guarantee the right to have content removed and prevent children from being overworked. There also needs to be greater regulation and transparency of the social media algorithms that control and manipulate what is profitable.</p>
<p>Whether it is entertainment, exploitation or employment, family vlogging is a reminder of the complex interconnections between care work and wage work. As the households of strangers stream across our screens, parents and lawmakers must think carefully about the impacts on families and children.</p>
<p class="fine-print"><em><span>Rebecca Hall receives funding from the Social Sciences and Humanities Research Council.</span></em></p><p class="fine-print"><em><span>Christina Pilgrim does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em><span>Rebecca Hall, Assistant Professor, Global Development Studies, Queen's University, Ontario; Christina Pilgrim, Master's student, Department of Sociology, Queen's University, Ontario. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<hr>
The UK just passed an online safety law that could make people less safe<figure><img src="https://images.theconversation.com/files/549630/original/file-20230921-21-2xqp7x.jpg?ixlib=rb-1.1.0&rect=18%2C9%2C6207%2C4081&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/text-messages-cellphone-screen-abstract-hologram-1452139727">Tero Vesalainen / Shutterstock</a></span></figcaption></figure><p>The UK’s long-debated online safety bill (OSB) has <a href="https://www.reuters.com/world/uk/uks-online-safety-bill-passed-by-parliament-2023-09-19/">been approved</a> by the House of Lords, clearing the way for it to become law. But it has pitted the government, which proposed the bill, <a href="https://www.bbc.co.uk/news/technology-66854622">against tech companies</a> that provide secure messaging services. Critics say it will allow authorities in the UK to compel service providers to break users’ encryption.</p>
<p>In July, 68 cybersecurity academics <a href="https://haddadi.github.io/UKOSBOpenletter.pdf">published an open letter</a> outlining their concerns about the <a href="https://bills.parliament.uk/bills/3137">OSB</a>. In it, they argue that the bill undermines the safety and privacy of users online. </p>
<p>The OSB has met with significant opposition from industry as well. Apple <a href="https://www.bbc.co.uk/news/technology-66028773">released a statement</a> explaining that encryption “helps everyday citizens defend themselves from surveillance, identity theft, fraud, and data breaches. The OSB poses a serious threat to this protection.”</p>
<p>In April, several secure messaging providers, such as WhatsApp, Element, Session and Signal, signed <a href="https://twitter.com/signalapp/status/1648117518291308544">another open letter</a> urging the UK government to rethink the bill.</p>
<p>Yet the bill is now set to become law. At a high level, the OSB imposes duties of care on providers of so-called “user-to-user” internet services: those that allow users to upload or share content that can be seen by other users. This covers activities such as uploading photos onto Instagram or sending messages via WhatsApp. </p>
<p>This distinguishes social media and online messaging services from internet services such as online banking, in which only the provider sees the content uploaded by the end user. These duties of care are aimed at preventing users from communicating illegal content such as child sexual abuse material. </p>
<h2>Why is encryption important?</h2>
<p>Since the OSB addresses messaging applications, cybersecurity experts have expressed alarm at the potential of the bill to <a href="https://faq.whatsapp.com/820124435853543">undermine so-called end-to-end encryption</a>. For messaging applications such as WhatsApp and Signal, end-to-end encryption ensures that only the sender of a given message and their intended recipients can read the content of the message. Even the service provider is prevented from reading the message. </p>
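<p>The guarantee can be illustrated with a toy cipher: the provider relays bytes it cannot interpret, and only the shared key, held by the sender and the intended recipient, recovers the message. This one-time-pad sketch is purely illustrative; real messengers use authenticated protocols such as the Signal protocol’s double ratchet:</p>

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """One-time-pad XOR; encryption and decryption are the same operation.
    A toy stand-in for the authenticated encryption real messengers use."""
    assert len(key) == len(data)
    return bytes(k ^ b for k, b in zip(key, data))

message = b"meet at noon"
# The key is shared only by the sender and the intended recipient.
key = secrets.token_bytes(len(message))
ciphertext = xor_cipher(key, message)

# The provider relays only `ciphertext`; without the key, every plaintext of
# this length is equally likely, so the server learns nothing of the content.
assert xor_cipher(key, ciphertext) == message
```

<p>The political dispute is precisely over this property: a provider that can read its users’ messages can be compelled to hand them over, while one that holds no keys cannot.</p>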
<p>This has been a point of contention for governments and intelligence agencies worldwide, since it means they can no longer <a href="https://www.washingtonpost.com/business/technology/2014/09/25/68c4e08e-4344-11e4-9a15-137aa0153527_story.html">persuade tech companies</a> to let them access a user’s messages.</p>
<p>Proponents of end-to-end encryption, such as the Electronic Frontier Foundation digital rights activist group, argue that privacy of communication <a href="https://www.eff.org/deeplinks/2023/03/tell-uks-house-lords-protect-end-end-encryption-online-safety-bill">is a fundamental right</a> that protects vulnerable groups, such as dissidents in authoritarian regimes. Encryption, they argue, helps ensure this privacy. </p>
<p>However, critics such as intelligence and law enforcement agencies argue that the widespread use of this form of encryption <a href="https://www.theatlantic.com/politics/archive/2015/07/nsa-encryption-ungoverned-spaces/398423/">hinders their ability</a> to detect criminal activity such as terrorism or child sexual exploitation.</p>
<h2>Is the OSB the only legislation to do this?</h2>
<p>The OSB is not the first piece of legislation that has come under fire over its potential to undermine the safety and privacy of end-to-end encryption. In 2018, the Australian government <a href="https://www.legislation.gov.au/Details/C2018A00148">passed the TOLA Act</a>, which also contained measures to compel tech companies to work with the authorities. Politicians argued that it was necessary to address terrorism. But there was a strong <a href="https://www.zdnet.com/article/parliamentary-security-committee-review-backs-the-use-of-controversial-tola-act/">backlash from critics</a> who said it <a href="https://policyreview.info/articles/analysis/australias-encryption-laws-practical-need-or-political-strategy">could undermine encryption</a>.</p>
<p><a href="https://home-affairs.ec.europa.eu/news/eu-proposes-new-rules-fight-child-sexual-abuse-2022-05-11_en">A recent proposal</a> by the European Commission suggests similar requirements for service providers of user-generated content in EU countries and has sparked <a href="https://docs.google.com/document/d/13Aeex72MtFBjKhExRTooVMWN9TC-pbH-5LEaAbMF91Y">its own open letter</a> from security and privacy researchers concerned for the potential harm to secure digital societies.</p>
<h2>Can the OSB help undermine encryption?</h2>
<p>The bill specifically requires the UK communications regulator, Ofcom, to issue “codes of practice” to providers of user-to-user services. The codes provide a basis for Ofcom to obtain information from these providers and fine them for non-compliance.</p>
<p>These codes also require that all providers of user-to-user services “must take or use proportional measures to prevent individuals from encountering illegal content by means of the service”. </p>
<p>Conservative MP <a href="https://www.gov.uk/government/people/damian-collins">Damian Collins</a>, who – as minister for tech and the digital economy from July to October 2022 – helped develop the OSB, said <a href="https://www.channel4.com/news/online-safety-bill-debate-could-it-lead-to-unprecedented-paradigm-shifting-surveillance">in a recent debate</a> that companies should “use their best endeavours to detect, proactively detect, content related to child sexual exploitation”. But he also added: “We are not going to ask companies to break encryption.”</p>
<p><a href="https://haddadi.github.io/UKOSBOpenletter.pdf">The open letter</a> from the 68 academics points out the fundamental flaw in this argument: “There is no technological solution to the contradiction inherent in both keeping information confidential from third parties and sharing that same information with third parties.” </p>
<p>The president of messaging app Signal, Meredith Whittaker, says the bill contains no protections against breaking encryption. </p>
<p>Indeed, the OSB’s language allows Ofcom to issue “notices” that could be used to compel messaging applications to undermine encryption. These would require the provider of the service to “use accredited technology to identify illegal content communicated publicly or privately by means of the service, and to swiftly take down that content”.</p>
<p>Since end-to-end encryption fundamentally prevents the service provider from reading user-sent content, this necessitates breaking encryption to identify that content.</p>
<h2>What outcome are we likely to see?</h2>
<p>Looking at the language of the OSB, the concerns of cybersecurity experts would appear to have some foundation, <a href="https://www.theguardian.com/technology/2023/feb/24/signal-app-warns-it-will-quit-uk-if-law-weakens-end-to-end-encryption">despite the denials</a> of Damian Collins and the Home Office. The OSB provides mechanisms for the government to compel messaging applications to undermine their own security measures to achieve its goals. </p>
<p>Removing these provisions would be straightforward. Deleting the phrase “or privately” from the bill would allow the OSB to stand mostly untouched while addressing the concerns of providers that use end-to-end encryption. </p>
<p>It is painfully ironic, then, that <a href="https://gizmodo.com/whatsapp-signal-uk-online-safety-bill-encryption-censor-1850347516">since both Signal and WhatsApp</a> have indicated they would leave the UK rather than undermine encryption, the current wording of the UK’s online safety bill could leave UK users of end-to-end encryption less safe online.</p><img src="https://counter.theconversation.com/content/213595/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Benjamin Dowling received funding by EPSRC grant EP/X016226/1. </span></em></p>The online safety bill contains measures that appear to compel messaging services to break encryption.Benjamin Dowling, Lecturer of Cybersecurity, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2031092023-06-22T12:32:00Z2023-06-22T12:32:00ZFear trumps anger when it comes to data breaches – angry customers vent, but fearful customers don’t come back<figure><img src="https://images.theconversation.com/files/530963/original/file-20230608-14786-a4sqhf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">One-third of customers will return to a hacked site without even changing their password, according to a recent study.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/young-asian-businesswoman-sitting-on-the-bench-in-royalty-free-image/1295580690">d3sign/Moment Collection/Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>When a person is notified of a data breach involving their personal information, if they react with a feeling of fear – as opposed to anger – they’re more likely to stop using the site. </p>
<p>That was the main finding of <a href="http://www.doi.org/10.1109/TEM.2022.3189599">a study I conducted</a> with three co-authors that examined which emotions lead customers to change their behavior after a breach. We found that angry customers, on the other hand, are more likely to vent on different social media platforms but then return to the breached site.</p>
<p>We surveyed 208 U.S. consumers, ages 18 to 60, and asked them to describe their feelings after being informed of a data breach on their favorite and frequently used website. Subscription websites, such as Netflix and Xbox Live, and free-to-use websites, such as Facebook and Snapchat, were considered. We then asked the participants to explain, in their own words, what actions they took in response.</p>
<p>We found that positive attitudes toward the website before the breach did not meaningfully affect whether consumers reengaged with the website after the breach, as some <a href="https://doi.org/10.1080/07421222.2018.1451962">prior research</a> has indicated. Instead, the emotional response of fear, in particular, weighed heavily on customers. </p>
<p>Fearful customers appeared to stop using the breached site to reduce their feelings of stress and vulnerability. Other customers resorted to providing false biographical details or removing credit card data, name and date of birth from the website as they continued using it. </p>
<h2>Why it matters</h2>
<p>In 2022 alone, U.S. <a href="https://www.statista.com/statistics/273550/data-breaches-recorded-in-the-united-states-by-number-of-breaches-and-records-exposed/">customer data was compromised</a> in over 1,800 incidents, affecting over 400 million individuals. </p>
<p>Much of the prior research has focused on <a href="https://doi.org/10.1080/1062726X.2017.1356310">customer anger</a> in the wake of a data breach and the need for companies to placate angry customers or manage negative media coverage. To do so, companies may <a href="https://doi.org/10.1057/s41299-021-00121-9">engage crisis managers to contain the damage</a>, <a href="https://www.cnbc.com/2022/02/14/equifax-settlement-letters-going-out-regarding-free-credit-monitoring.html">partner with identity protection services</a>, <a href="https://www.cnn.com/2022/07/25/tech/tmobile-data-breach-settlement/index.html">pay fines or settlements</a>, or try to lure back customers with <a href="https://www.reuters.com/article/us-media-playstation-idINTRE7415J120110502">free services</a>. </p>
<p>However, our research shows that companies need to address fearful customers differently after a data breach has occurred – if they want to avoid customer loss. To do this, companies can work with their IT departments to identify customers who are no longer active after a breach and then reach out to them directly to assuage their fears. </p>
<h2>What still isn’t known</h2>
<p>It is not yet known how companies should react in the aftermath of a data breach, nor is it clear why customers return. One likely explanation is <a href="https://doi.org/10.1016/j.chb.2017.12.001">privacy fatigue</a> – when customers come to believe that keeping their online data secure is futile. </p>
<p>In our study we found one-third of customers returned after a breach without even changing their passwords. More than half returned after making some changes, such as removing their credit card data, changing their passwords or removing personal information.</p>
<p>This may be why researchers cannot provide reliable recommendations for handling data breaches. From a company’s standpoint, if customers will return anyway, there is little incentive to do more than the bare minimum to address a breach. </p>
<h2>What’s next</h2>
<p>We are now studying the behavior of people who have experienced multiple data breaches in the past year. We want to know how these customers change their behaviors, as well as how they judge the recovery efforts of the companies whose sites were breached.</p>
<p>Recent regulations, such as the EU’s 2018 <a href="https://gdpr.eu/what-is-gdpr/">data protection law</a> and newly introduced <a href="https://www.nytimes.com/2021/05/14/technology/state-privacy-internet-laws.html">state bills</a> in the U.S. – along with updates to the <a href="https://www.oag.ca.gov/privacy/ccpa">California Consumer Privacy Act</a> – will force companies and data brokers to think more seriously about the kinds of data being collected and stored. Health care, retail, finance, social networking and other websites will need to make significant changes in how they inform customers of – and compensate them for – such data breaches.</p><img src="https://counter.theconversation.com/content/203109/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Rajendran Murthy does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Companies tend to focus on appeasing angry customers after a data breach. New research shows they may want to pay more attention to customers who are afraid to return to their site.Rajendran Murthy, Professor of Marketing, Rochester Institute of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2027322023-05-22T05:01:37Z2023-05-22T05:01:37ZA TikTok ban isn’t a data security solution. It will be difficult to enforce – and could end up hurting users<figure><img src="https://images.theconversation.com/files/527441/original/file-20230522-27-ezjq7p.jpeg?ixlib=rb-1.1.0&rect=24%2C57%2C5447%2C3461&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Montana has made an unprecedented move to become the <a href="https://www.theguardian.com/us-news/2023/may/17/tiktok-ban-montana">first US state</a> to ban TikTok. </p>
<p>However, doubts have been raised over the decision’s legal foundation, enforcement mechanisms and underlying motives. While the move draws attention to data security on social media, banning TikTok alone may not provide a comprehensive solution to this problem. </p>
<p>For one, the move risks alienating the many young people who have come to rely on the app for meaningful connection, and in some cases their income. It also does little in the way of ensuring better future data privacy and protection for users.</p>
<h2>Caught in political crossfire</h2>
<p>Since its <a href="https://www.businessofapps.com/data/tik-tok-statistics/">meteoric rise</a> in 2020, TikTok has been caught in geopolitical tensions between the US and China. These tensions peaked in late 2020 when then-president Donald Trump signed an executive order directing ByteDance – the Chinese media giant and <a href="https://newsroom.tiktok.com/en-au/the-truth-about-tiktok">parent company of TikTok</a> – to divest from its US operations, or face being banned. In response, TikTok partnered with Oracle on <a href="https://mashable.com/article/project-texas-tiktok">Project Texas</a>: a US$1.5 billion initiative to relocate all US user data to Oracle servers in the United States. </p>
<p>Allegations that China-based employees at ByteDance had accessed TikTok user data led to TikTok CEO Shou Zi Chew <a href="https://www.reuters.com/technology/tiktok-ceo-face-tough-questions-support-us-ban-grows-2023-03-23/">appearing before Congress</a> in March amid yet more calls for it to be banned, and reports of the Biden administration <a href="https://www.nytimes.com/2023/03/15/technology/tiktok-biden-pushes-sale.html">pushing for its sale</a>.</p>
<p>Throughout these controversies, TikTok has denied sharing user data with the Chinese government, and said it wouldn’t do so even if asked. Nonetheless, governments worldwide – <a href="https://theconversation.com/why-was-tiktok-banned-on-government-devices-an-expert-on-why-the-security-concerns-make-sense-202339">including in Australia</a> – have banned TikTok on government devices, citing concerns over data protection.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-china-out-to-spy-on-us-through-drones-and-other-tech-perhaps-thats-not-the-question-we-should-be-asking-205576">Is China out to spy on us through drones and other tech? Perhaps that’s not the question we should be asking</a>
</strong>
</em>
</p>
<hr>
<h2>Enforcing a ban is a daunting task</h2>
<p>Montana’s new law will make downloading TikTok within state lines illegal from January 1 2024. The law imposes fines of up to US$10,000 per day for entities offering access to or downloads of the app within the state. Users themselves will not incur penalties. </p>
<p>The current legislation places responsibility for blocking access on Apple and Google – the operators of app stores on iOS and Android devices. These companies would be held liable for any violations. However, they lack the capacity to enforce geofencing at the state level, making it difficult for them to prevent Montana residents from downloading TikTok. </p>
<p>As a result, it may ultimately fall on TikTok itself to block usage by Montana residents by collecting geolocation data. But this raises privacy concerns – the very concerns driving the ban in the first place.</p>
<p>For now, the ban’s enforceability remains to be seen. How will the government of Montana prevent users from using virtual private networks (VPNs) to access TikTok? VPNs encrypt data traffic and allow users to present themselves as being in another location, making it possible for tech-savvy users to bypass bans. Residents could also cross state lines to download the app.</p>
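The weakness described above comes down to what the enforcing service can actually observe: only the IP address a request arrives from. A hypothetical sketch (the prefix below is a reserved documentation range standing in for addresses a service might associate with Montana, not a real allocation):

```python
# Hypothetical sketch: IP-based geofencing checks only the source address.
# 198.51.100.0/24 is a reserved documentation range (RFC 5737), used here
# as a stand-in for prefixes a service might map to Montana.

BLOCKED_PREFIXES = {"198.51.100."}

def is_blocked(client_ip: str) -> bool:
    # An app store or service sees only the IP the request arrives from.
    return any(client_ip.startswith(prefix) for prefix in BLOCKED_PREFIXES)

# A direct connection from a "Montana" address is refused...
assert is_blocked("198.51.100.7")
# ...but the same user routed through an out-of-state VPN exit is not,
# because the service only ever sees the VPN's address.
assert not is_blocked("203.0.113.9")
```

Any location check built on the connecting address inherits this limit: a VPN simply substitutes a different address, and the check has nothing else to go on.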
<p>Montana may become a testing ground for the “TikTok-free America” that some national lawmakers envision. Apart from TikTok, the ban also targets messaging apps including Chinese-owned WeChat and Russian-founded Telegram – highlighting growing apprehensions over data security and privacy.</p>
<p>But it’s unclear if such a ban is an effective solution for lawmakers’ concerns about American users’ privacy and data security. </p>
<p>Even if the ban in Montana is successful, its national impact will be limited. The state has a population of just over one million, whereas the US as a whole has <a href="https://variety.com/2023/digital/asia/tiktok-150-million-us-monthly-users-government-ban-1235560251/">more than 100 million</a> monthly TikTok users. As such, the ban in Montana will likely affect only a few hundred thousand prospective users, at best. </p>
<h2>TikTok’s importance for Gen Z</h2>
<p>While TikTok’s popularity in the US continues to soar, nearly half of all US-based users are the digital-native teens and 20-somethings of <a href="https://www.pewresearch.org/short-reads/2019/01/17/where-millennials-end-and-generation-z-begins/">Generation Z</a>. TikTok is <a href="https://www.forbes.com/sites/forbestechcouncil/2020/07/07/what-the-rise-of-tiktok-says-about-generation-z/?sh=27ca6e4f6549">Gen Z’s playground</a>.</p>
<p>Young people have protested potential bans by flooding the app with videos mocking lawmakers they see as out of touch with modern technology, further magnifying their disdain for such regulation.</p>
<p>Congresswoman Alexandria Ocasio-Cortez <a href="https://www.nytimes.com/2023/03/25/nyregion/aoc-tiktok-ban-video.html">supported young protesters</a>, highlighting the unprecedented nature of banning an app that would stifle free speech while raising questions regarding digital rights in the US.</p>
<p><iframe id="tc-infographic-863" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/863/82b24fcfd0c2ff908869687ef187de0648de6dca/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>TikTok has emerged as a vital platform for Gen Z users to express their political views, entertain themselves and <a href="https://theconversation.com/how-young-lgbtqia-people-used-social-media-to-thrive-during-covid-lockdowns-156130">interact with their peers</a>. Where other platforms might feel saturated with older generations, TikTok provides an environment where young people face lower barriers to meaningful online participation.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/tiktok-is-teaching-the-world-about-autism-but-is-it-empowering-autistic-people-or-pigeonholing-them-192093">TikTok is teaching the world about autism – but is it empowering autistic people or pigeonholing them?</a>
</strong>
</em>
</p>
<hr>
<p>And despite what some may think, it’s not just a quirky app for dance videos. TikTok has become a golden goose for millions of content creators who rely on the app as their stage to showcase their talents, build their brands and connect with fans and customers. Many local small businesses also rely on TikTok to reach potential customers. </p>
<p>With the app now under threat, the future livelihoods of these creators and small businesses are in jeopardy too.</p>
<h2>A ban won’t fix privacy and data security issues</h2>
<p>A successfully implemented TikTok ban may drive users to Silicon Valley’s <a href="https://edition.cnn.com/2023/03/24/tech/tiktok-alternatives/index.html">big tech platforms</a>. But user data held by these companies, including Meta (which owns Facebook and Instagram) and Google, can’t be assumed to be any more secure than it is with TikTok. They <a href="https://theconversation.com/amazon-google-and-facebook-warrant-antitrust-scrutiny-for-many-reasons-not-just-because-theyre-large-118370">also collect significant amounts</a> of user data that can be shared or sold to third-party entities, including those with connections to China or countries with similar data laws.</p>
<p>The underlying issues of data security will persist beyond a TikTok ban. If data security really is the main concern, policymakers should address the problem comprehensively and systematically across social media platforms. </p>
<p>Tackling the root cause is essential. Until that’s done, snapping off the branches – TikTok or otherwise – will do little to keep users’ data safe.</p><img src="https://counter.theconversation.com/content/202732/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Milovan Savic does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Montana has announced plans to ban the app from January 2024, making it a potential testing ground for a ‘TikTok-free’ America.Milovan Savic, Research Fellow, ARC Centre of Excellence for Automated Decision-Making and Society, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1960192023-03-14T20:43:07Z2023-03-14T20:43:07ZConsumer Privacy Protection Act could lead to fines for deceptive designs in apps and websites<figure><img src="https://images.theconversation.com/files/510195/original/file-20230214-24-dapuo7.JPG?ixlib=rb-1.1.0&rect=23%2C31%2C2507%2C1831&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Canada’s proposed Consumer Privacy Protection Act prohibits online consent processes that are deceptive or misleading.
</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><iframe style="width: 100%; height: 100px; border: none; position: relative; z-index: 1;" allowtransparency="" allow="clipboard-read; clipboard-write" src="https://narrations.ad-auris.com/widget/the-conversation-canada/consumer-privacy-protection-act-could-lead-to-fines-for-deceptive-designs-in-apps-and-websites" width="100%" height="400"></iframe>
<p>Canada’s <a href="https://www.parl.ca/DocumentViewer/en/44-1/bill/C-27/first-reading">proposed Consumer Privacy Protection Act (CPPA)</a> prohibits online consent processes that are deceptive or misleading. </p>
<p>Companies may face fines for breaking the act’s rules. This could be trouble for social media platforms, online shopping companies and other services that use deceptive user interface designs in their apps and websites.</p>
<p>The CPPA is a component of <a href="https://www.parl.ca/DocumentViewer/en/44-1/bill/C-27/first-reading">Bill C-27</a>, described by the federal government as <a href="https://ised-isde.canada.ca/site/innovation-better-canada/en/canadas-digital-charter/bill-summary-digital-charter-implementation-act-2020">an attempt to improve Canadian privacy law</a> and ensure responsible use of personal information and artificial intelligence by companies. </p>
<p>The possibility of fines for deceptive or misleading consent processes suggests the government views consent as fundamental to personal information protections. As a result, companies may be held accountable for deceptive user interface designs associated with app and website consent processes.</p>
<p>User interface design means deciding how to present buttons, links, prompts, images, video, text and other visual elements on-screen. Decisions about the shape, colour, size and placement of these elements influence what people see first or second, where they click/tap, whether a purchase is made, a complaint is lodged or consent is given.</p>
<p>Deceptive designs (sometimes problematically called <a href="https://www.deceptive.design">dark pattern designs</a>) are design choices that can mislead, coerce and exploit people for the benefit of for-profit companies. A <a href="https://doi.org/10.1145/3359183">study of about 11,000 shopping sites</a> describes 15 types of deceptive designs, each with a unique approach to manipulation.</p>
<h2>Fines for deceptive design</h2>
<p>Deceptive design is a top information policy issue internationally, and problematic consent processes are a primary focus of current enforcement efforts. In 2022, the Commission Nationale de l'Informatique et des Libertés (CNIL), a data protection authority in France, <a href="https://www.cnil.fr/en/cookies-cnil-fines-google-total-150-million-euros-and-facebook-60-million-euros-non-compliance">fined Google the equivalent of C$215 million and Facebook the equivalent of C$86 million</a> for deceptive design. </p>
<p>CNIL said the companies provided people with a button to accept online cookies “immediately,” but did not provide a similar prompt for refusal. CNIL claimed that requiring multiple clicks to refuse all cookies improperly influenced the consent process.</p>
<p>Action by the U.S. Federal Trade Commission (FTC) <a href="https://www.ftc.gov/news-events/news/press-releases/2022/11/ftc-action-against-vonage-results-100-million-customers-trapped-illegal-dark-patterns-junk-fees-when-trying-cancel-service">led to internet telephone company Vonage having to refund</a> the equivalent of C$133 million to customers for deceptive designs that made it easy to sign up for a service, but very difficult to cancel. FTC action also <a href="https://www.ftc.gov/news-events/news/press-releases/2020/09/childrens-online-learning-program-abcmouse-pay-10-million-settle-ftc-charges-illegal-marketing">led to the company that runs the online learning program ABCMouse</a> having to pay the equivalent of C$13 million for similar designs. </p>
<figure class="align-center ">
<img alt="A sign that says 'Federal Trade Commission Building' sits in front of a square beige bulidings" src="https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/508496/original/file-20230206-17-nkq202.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The U.S. Federal Trade Commission is addressing the use of deceptive designs by companies.</span>
<span class="attribution"><span class="source">(AP Photo/Alex Brandon)</span></span>
</figcaption>
</figure>
<p>The <a href="https://www.pewtrusts.org/en/research-and-analysis/blogs/stateline/2022/03/04/it-turns-out-state-lawmakers-hate-auto-renew-contracts-too">company Noom</a>, which owns an app for tracking food intake and exercise, recently settled the equivalent of a C$83 million class action suit after customers alleged they were unfairly charged subscription fees. </p>
<p>Commenting on deceptive designs, <a href="https://www.ftc.gov/news-events/news/press-releases/2022/09/ftc-report-shows-rise-sophisticated-dark-patterns-designed-trick-trap-consumers">the FTC stated</a> that “more and more companies are using digital dark patterns to trick people into buying products and giving away their personal information…these traps will not be tolerated.” </p>
<h2>The clickwrap</h2>
<p>A deceptive design common to online consent processes is the clickwrap. The clickwrap, or clickthrough agreement, is a set of user interface designs people often encounter when signing up for a new app or website, or when terms of service and privacy policies change.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/gtQ2tNUTF3Q?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A video essay about how clickwrap designs help digital platforms pressure people into accepting terms and conditions. (Jonathan Obar)</span></figcaption>
</figure>
<p>Clickwraps can include an appealing “accept” button and less-noticeable links to policies. As people read from the top of the screen to the bottom, they might notice the colourful accept button first and miss links to policies below the button or elsewhere on screen. </p>
<p><a href="https://doi.org/10.1177/2056305118784770">In a previous study I co-authored about clickwraps</a>, study participants said they saw a prominently displayed accept/join button first, while links to policies were small and “easy to miss.”</p>
<p>A <a href="https://dx.doi.org/10.2139/ssrn.3898254">recent paper I co-authored that has yet to be peer-reviewed</a> suggests the text on clickwrap accept buttons rarely says “agree,” and often says something like “sign up” or “create account” instead. This choice of text may distract people from the consent process taking place, keeping the focus on a quick sign up.</p>
<p>Clickwraps are a problem if the goal is to ensure an engaging online consent process. They raise concerns about for-profit companies moving individuals quickly towards monetized parts of services, instead of encouraging people to question if joining the service is a good idea.</p>
<p>An online consent process is a unique opportunity to engage people in far more than a boring contract.</p>
<p>Information on the future of artificial intelligence (AI), the benefits and drawbacks of data sharing and use, opt-in/out mechanisms, contact information for policymakers and privacy advocates, and digital literacy tools could all be available for review before consent is provided. </p>
<p>Instead, clickwraps make it easy to skip the fine print, as well as the opportunity to understand how service use has implications for the future.</p>
<h2>Implications for AI and the future</h2>
<p>One implication is the connection between deceptive user interface designs and the future of AI development. This is perhaps one reason the Canadian government is prioritizing the issue. </p>
<p>As big data expands with the ubiquity of the internet, vast data sets are now available across the global economy. Some AI developers don’t engage directly with consumers, which raises questions about who is responsible for ensuring data is acquired via lawful consent processes.</p>
<figure class="align-center ">
<img alt="A man wearing a suit and glasses speaks into a microphone from behind a desk. Canadian flags stand in the background." src="https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/514742/original/file-20230310-18-q24z8g.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Privacy Commissioner of Canada Philippe Dufresne speaking at a press conference in Ottawa on Jan. 26, 2023.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/Spencer Colby</span></span>
</figcaption>
</figure>
<p>The <a href="https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/completed-consultations/consultation-ai/reg-fw_202011/">Office of the Privacy Commissioner of Canada emphasizes</a> that the lack of a direct relationship with some AI developers, along with the challenge of understanding how data may be used in the future, further burdens people with having to decide whether clicking “sign up” is wise.</p>
<p>As governments figure out how to ensure meaningful consent is central to AI development, digital service providers must do their part to design user interfaces that are not deceptive or misleading.</p>
<p>If Canada’s Bill C-27 becomes law, will government-imposed monetary penalties move companies away from clickwraps and towards interface designs that facilitate education and understanding? It’s difficult to tell. It may depend on whether the Canadian government follows the lead of policymakers in the U.S. and France to hold companies accountable for deceptive designs.</p>
<p class="fine-print"><em><span>Some of Jonathan Obar's work referenced in this article received funding from the Office of the Privacy Commissioner of Canada, the Social Sciences and Humanities Research Council, and York University.</span></em></p>
Whether or not Bill C-27 moves companies away from deceptive design in apps and websites depends on how, and if, the Canadian government holds companies accountable for their actions.
Jonathan Obar, Associate Professor, Department of Communication and Media Studies, York University, Canada
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/200692 2023-03-08T13:40:55Z
Should you pay for Meta’s and Twitter’s verified identity subscriptions? A social media researcher explains how the choice you face affects everyone else
<figure><img src="https://images.theconversation.com/files/513996/original/file-20230307-172-u720z6.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5582%2C3710&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">If you want to use two-factor authentication via text message on Twitter, you'll have to pay for it.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/twitter-verified-seen-on-mobile-with-a-stock-graph-on-news-photo/1246403941">NurPhoto via Getty Images</a></span></figcaption></figure><p>Social media services have generally been free of charge for users, but with ad revenues slowing down, social media companies are <a href="https://www.wsj.com/articles/would-you-pay-for-social-media-meta-twitter-and-snap-want-to-find-out-856524f8">looking for new revenue streams</a> beyond targeted ads. Twitter now charges for its blue check verification, and both Meta and Twitter charge for identity protection.</p>
<p>Users benefit from “free” services such as social media platforms. According to <a href="https://doi.org/10.1073/pnas.1815663116">one study</a>, U.S. Facebook users say they would have to be paid <a href="https://mitsloan.mit.edu/ideas-made-to-matter/how-much-are-search-engines-worth-to-you">in the range of $40 to $50</a> to leave the social networking service for one month. If you value Facebook highly enough that you’d need to get paid to take a break, why not pay for these new services if you can afford them? </p>
<p>Meta plans to offer <a href="https://www.theverge.com/2023/2/20/23607106/twitter-facebook-instagram-meta-security-subscription">paid customer support and account monitoring</a> on Facebook and Instagram to guard against impersonators for <a href="https://www.theverge.com/2023/2/19/23606268/meta-instagram-facebook-test-paid-verification">US$11.99 a month on the web and $14.99 a month on iOS devices</a>. Twitter’s proposed changes make two-factor authentication via text messaging <a href="https://www.theverge.com/2023/2/20/23607106/twitter-facebook-instagram-meta-security-subscription">a premium feature for paid users</a>. Twitter Blue costs $8 a month on Android devices and $11 a month on iOS devices.</p>
<p>As a researcher who <a href="https://scholar.google.com/citations?user=JpFHYKcAAAAJ">studies social media and artificial intelligence</a>, I see three problems with the rollout of these features. </p>
<h2>The collective action problem</h2>
<p>Information goods, such as those provided by social media platforms, are characterized by the problem of collective action, and information security is no exception. Collective action problems, which economists describe <a href="https://personal.utdallas.edu/%7Eliebowit/palgrave/network.html">as network externalities</a>, result when the actions of one participant in a market affect other participants’ outcomes. </p>
<p>Some people might pay Facebook for improved security, but overall, collective well-being depends on having a very large group of users investing in better security for all. Picture a medieval city under siege from an invader where <a href="https://doi.org/10.1126/science.1130992">each family would be responsible for a stretch of the wall</a>. Collectively, the community is only as strong as the weakest link. Will Twitter and Meta still deliver the promised and paid-for results if not enough users sign up for these services?</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a screenshot with large and small text and a white checkmark inside a 12-point star" src="https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/514057/original/file-20230307-16-6if8n3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Meta is beginning to roll out a paid identity protection service for Facebook and Instagram users.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/this-photo-illustration-taken-in-melbourne-on-february-24-news-photo/1247430814">William West/AFP via Getty Images</a></span>
</figcaption>
</figure>
<p>While large platforms such as Facebook and Twitter could benefit from lock-in, meaning having users who are dependent on or at least heavily invested in them, it’s not clear how many users will pay for these features. This is an area where the platforms’ profit motive conflicts with their overall goal: maintaining a community large enough that people keep using the platform because all of their social or business connections are there. </p>
<h2>Economics of information security</h2>
<p>Charging for identity protection raises the question of how much each person values privacy or security online. Markets for privacy have posed a similar conundrum. For digital products in particular, consumers are not fully informed about how their data is collected, for what purposes and with what consequences. </p>
<p>Scammers can find many ways to breach security and exploit vulnerabilities in large platforms such as Facebook. But valuing security or privacy is complicated because social media users do not know exactly how much Meta or Twitter invests in keeping everyone safe. When users of digital platforms do not understand how platforms safeguard their information, the resulting lack of trust could limit the number of people willing to pay for features such as security and identity verification.</p>
<p>Social media users in particular face <a href="https://doi.org/10.1257/jel.54.2.442">imperfect or asymmetric information</a> about their data, so they do not know how to correctly value features such as security. In the standard economic logic, markets assign prices based on buyers’ willingness to pay and sellers’ lowest acceptable bids, or <a href="https://www.investopedia.com/terms/r/reserve-price.asp">reservation prices</a>. However, digital platforms such as Meta benefit from individuals’ data by virtue of their size – they have such a large amount of personal data. There is no market for individual data rights, even though there have been a few policy proposals such as California governor Gavin Newsom’s <a href="https://www.cnbc.com/2019/02/12/california-gov-newsom-calls-for-new-data-dividend-for-consumers.html">call for a data dividend</a>. </p>
<p>Some cybersecurity experts have already pointed out the <a href="https://www.washingtonpost.com/politics/2023/02/21/paid-security-features-twitter-meta-spark-cybersecurity-concerns/">downsides to monetizing security features</a>. In particular, the rushed timeline, just one month from announcement to implementation of the paid option, creates a real risk that many users will <a href="https://www.theverge.com/2023/2/20/23607106/twitter-facebook-instagram-meta-security-subscription">turn off two-factor authentication altogether</a>. Further, security, user authentication and identity verification <a href="https://time.com/6257711/facebook-instagram-twitter-paid-verification/">are issues that concern everyone</a>, not just content creators or those who can afford to pay. </p>
<p>In the first three months of 2022 alone, nearly one-fifth of teens and adults in the U.S. <a href="https://www2.deloitte.com/us/en/pages/about-deloitte/articles/press-releases/connectivity-and-mobile-trends.html">reported their social media accounts getting hacked</a>. The same survey found that 24% of consumers reported being overwhelmed by devices and subscriptions, indicating significant fatigue and cognitive overload in having to manage their virtual experiences. </p>
<p>It is also the case that social media platforms are not really free. The old adage is <a href="https://quoteinvestigator.com/2017/07/16/product/">if you are not paying, then you are the product</a>. Digital platforms such as Meta and Twitter monetize the enormous tracts of data they have about users through a <a href="https://theconversation.com/why-bad-ads-appear-on-good-websites-a-computer-scientist-explains-178268">complex online advertising-driven ecosystem</a>. The system makes use of very granular individual user data and predictive analytics <a href="https://doi.org/10.1257/jep.23.3.37">to help companies microtarget online ads</a> and <a href="https://doi.org/10.1007/s11151-013-9399-3">track and compare advertising views with outcomes</a>. There are <a href="https://theconversation.com/facebook-begins-to-shift-from-being-a-free-and-open-platform-into-a-responsible-public-utility-101577">hidden costs</a> associated with people’s loss of privacy and control over their personal information, including loss of trust and vulnerability to identity theft. </p>
<h2>Social media and online harms</h2>
<p>The third problem is that these moves to monetize security options increase online harms for vulnerable users who go without identity protection. Not everyone can afford to pay Meta or Twitter to keep their personal information safe. Social bots have become <a href="https://doi.org/10.1007/978-3-030-91779-1_11">increasingly sophisticated</a>. <a href="https://www.cnbc.com/2023/02/23/biggest-benefits-risks-in-meta-twitter-verification-subscriptions.html">Scams increased by almost 288%</a> from 2021 to 2022, according to one report. Scammers and phishers have found it easy enough to <a href="https://www.washingtonpost.com/technology/2023/02/23/facebook-instagram-fee/">gain access to people’s personal information and impersonate others</a>. </p>
<p>For those who are scammed, the process of account recovery is frustrating and time-consuming. Such moves might hurt the most vulnerable, such as people who rely on Meta’s platforms for access to job information, or the elderly and infirm who use social media to learn about what is happening in their communities. Communities that have invested resources in building a shared online space on platforms such as Twitter and Facebook may also be harmed by monetization efforts. </p>
<p>People are tired of juggling numerous subscriptions and of security and privacy concerns that never go away. At the same time, it’s an open question whether enough users will pay for these services to boost collective security. Ultimately, the service a social media platform offers is the opportunity to connect with others. Will users pay for the ability to maintain social connections the way they pay for content, such as entertainment or news? Social media giants may have a difficult path ahead.</p>
<p class="fine-print"><em><span>Anjana Susarla receives funding from the National Institutes of Health.</span></em></p>
Twitter and Meta are looking to make money from protecting users’ identities. This raises questions about collective security, people understanding what they’re paying for and who remains vulnerable.
Anjana Susarla, Professor of Information Systems, Michigan State University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/198979 2023-03-02T19:38:20Z
Protecting privacy online begins with tackling ‘digital resignation’
<figure><img src="https://images.theconversation.com/files/512989/original/file-20230301-26-syl2am.jpg?ixlib=rb-1.1.0&rect=25%2C8%2C5725%2C3819&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Going online often involves surrendering some privacy, and many people are becoming resigned to the fact that their data will be collected and used without their explicit consent.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>From <a href="https://www.cnbc.com/2022/11/26/the-biggest-risks-of-using-fitness-trackers-to-monitor-health.html">smart watches</a> and meditation apps to digital assistants and social media platforms, we interact with technology daily. And some of these technologies have <a href="https://childdatacitizen.com/coerced-digital-participation/">become an essential part of our social and professional lives</a>. </p>
<p>In exchange for access to their digital products and services, many tech companies collect and use our personal information. They use that information to predict and influence our future behaviour. This kind of <a href="https://news.harvard.edu/gazette/story/2019/03/harvard-professor-says-surveillance-capitalism-is-undermining-democracy/">surveillance capitalism</a> can take the form of <a href="https://theconversation.com/the-dark-side-of-alexa-siri-and-other-personal-digital-assistants-126277">recommendation algorithms</a>, targeted advertising and <a href="https://www.mckinsey.com/capabilities/growth-marketing-and-sales/our-insights/the-future-of-personalization-and-how-to-get-ready-for-it">customized experiences</a>. </p>
<p>Tech companies claim these personalized experiences enhance the user experience. However, <a href="https://repository.upenn.edu/cgi/viewcontent.cgi?article=1554&context=asc_papers">the vast majority of consumers are unhappy with these practices</a>, especially after learning how their data is collected.</p>
<h2>‘Digital resignation’</h2>
<p><a href="https://dx.doi.org/10.2139/ssrn.1478214">Public knowledge is lacking</a> when it comes to how data is collected. Research shows that corporations both cultivate feelings of resignation and <a href="https://repository.upenn.edu/cgi/viewcontent.cgi?article=1554&context=asc_papers">exploit this lack of literacy</a> to normalize the practice of maximizing the amount of data collected. </p>
<p>Events like the <a href="https://www.wired.com/story/cambridge-analytica-facebook-privacy-awakening/">Cambridge Analytica</a> scandal and revelations of mass government surveillance by <a href="https://www.reuters.com/article/us-usa-nsa-spying-idUSKBN25T3CK">Edward Snowden</a> shine a light on data collection practices, but they leave people powerless and resigned that their data will be collected and used without their explicit consent. This is called <a href="http://dx.doi.org/10.1177/1461444819833331">“digital resignation”</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A smartphone displaying the facebook logo." src="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 2022, Facebook’s parent company, Meta, agreed to pay $725 million to settle a lawsuit over users’ personal information being shared with Cambridge Analytica.</span>
<span class="attribution"><span class="source">(AP Photo/Michael Dwyer, File)</span></span>
</figcaption>
</figure>
<p>But while there is much discussion surrounding the collection and use of personal data, there is far less discussion about the modus operandi of tech companies. </p>
<p><a href="https://spectrum.library.concordia.ca/id/eprint/990750/">Our research</a> shows that tech companies use a variety of strategies to deflect responsibility for privacy issues, neutralize critics and prevent legislation. These strategies are designed to limit citizens’ abilities to make informed choices. </p>
<p>Policymakers and corporations themselves must acknowledge and correct these strategies. Corporate accountability for privacy issues cannot be achieved by addressing data collection and use alone. </p>
<h2>The pervasiveness of privacy violations</h2>
<p>In their study of harmful industries such as the tobacco and mining sectors, <a href="http://dx.doi.org/10.1086/653091">Peter Benson and Stuart Kirsch</a> identified strategies of denial, deflection and symbolic action used by corporations to deflect criticism and prevent legislation.</p>
<p>Our research shows that these strategies hold true in the tech industry. Facebook has a long history of <a href="https://www.theguardian.com/technology/2019/aug/23/cambridge-analytica-facebook-response-internal-document">denying and deflecting responsibility</a> for privacy issues despite its numerous scandals and criticisms.</p>
<p>Amazon has also been harshly criticized for providing <a href="https://www.theguardian.com/technology/2022/jul/13/amazon-ring-doorbell-videos-police-11-times-without-permission">Ring security camera footage to law enforcement officials without a warrant or customer consent</a>, sparking <a href="https://www.eff.org/deeplinks/2021/02/lapd-requested-ring-footage-black-lives-matter-protests">civil rights concerns</a>. The company has also created <a href="https://www.theverge.com/2022/9/20/23362010/ring-nation-mgm-amazon-mark-burnett-barry-poznick-civil-rights-cancel">a reality show using Ring security camera footage</a>. </p>
<p>Canadian and U.S. federal government employees have <a href="https://www.wsj.com/articles/canada-follows-u-s-europe-with-tiktok-ban-on-government-devices-2273b07f">recently been banned from downloading TikTok</a> onto their devices due to an “unacceptable” risk to privacy. TikTok has launched <a href="https://www.theverge.com/2023/2/2/23583491/tiktok-transparency-center-tour-photos-bytedance">an elaborate spectacle of symbolic action</a> with the opening of its <a href="https://www.youtube.com/watch?v=PxfIGVQTfWQ">Transparency and Accountability Center</a>. This cycle of denial, deflection and symbolic action normalizes privacy violations and fosters cynicism, resignation and disengagement.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A black and silver ring doorbell on a door frame." src="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Amazon has faced criticism for creating a new reality show based on footage captured by Ring doorbells.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>How to stop digital resignation</h2>
<p>Technology permeates every aspect of our daily lives. But informed consent is impossible when the average person is neither motivated nor <a href="https://ndg.asc.upenn.edu/wp-content/uploads/2018/09/Persistent-Misperceptions.pdf">knowledgeable enough</a> to read terms and conditions policies designed to confuse.</p>
<p>The <a href="https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age_en">European Union</a> has recently enacted laws that recognize these harmful market dynamics and has started holding platforms and tech companies <a href="https://www.cnn.com/2022/11/30/tech/twitter-eu-compliance-warning/index.html">accountable</a>. </p>
<p>Québec has recently revised its privacy laws with <a href="https://www.quebec.ca/gouvernement/ministeres-et-organismes/institutions-democratique-acces-information-laicite/acces-documents-protection-renseignements-personnels/pl64-modernisation-de-la-protection-des-renseignements-personnels">Law 25</a>. The law is designed to provide citizens with increased protection and control over their personal information. It gives people the ability to request their personal information and move it to another system, to rectify or delete it (<a href="https://gdpr.eu/right-to-be-forgotten/">the right to be forgotten</a>) as well as the right to be informed when being subjected to automated decision making. </p>
<p>It also requires organizations to appoint a privacy officer and committee, and conduct privacy impact assessments for every project where personal information is involved. Terms and policies must also be communicated clearly and transparently and consent must be explicitly obtained.</p>
<p>At the federal level, the government has tabled <a href="https://ised-isde.canada.ca/site/innovation-better-canada/en/canadas-digital-charter/bill-summary-digital-charter-implementation-act-2020">Bill C-27, the <em>Digital Charter Implementation Act</em></a>, which is currently under review by the House of Commons. It bears many resemblances to Québec’s Law 25 and also includes additional measures to regulate technologies such as artificial intelligence systems.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A laptop showing a terms and conditions document." src="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Online terms and conditions are often too long and difficult for consumers to understand.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Our findings highlight the urgent need for greater privacy literacy and stronger regulations that not only define what is permitted, but also monitor firms and hold accountable those that breach consumer privacy. This would ensure informed consent to data collection and disincentivize violations. We recommend the following: </p>
<p>1) Tech companies must explicitly specify what personal data will be collected and used. Only essential data should be collected and customers should be able to opt out of non-essential data collection. This is similar to the <a href="https://gdpr.eu/cookies/">EU’s General Data Protection Regulation</a> to obtain user consent before using non-essential cookies or <a href="https://support.apple.com/en-ca/HT212025">Apple’s App Tracking Transparency</a> feature which allows users to block apps from tracking them.</p>
<p>2) Privacy regulations must also recognize and address the rampant use of <a href="https://www.vox.com/recode/22351108/dark-patterns-ui-web-design-privacy">dark patterns</a> to influence people’s behaviour, such as coercing them into providing consent. This can include the use of design elements, language or features such as making it difficult to decline non-essential cookies or making the button to provide more personal data more prominent than the opt-out button.</p>
<p>3) Privacy oversight bodies such as the <a href="https://www.priv.gc.ca/en">Office of the Privacy Commissioner of Canada</a> <a href="https://www.cbc.ca/news/canada/nova-scotia/houston-privacy-commissioner-promise-may-be-softening-1.6624079">must be fully independent</a> and authorized to investigate and <a href="https://financialpost.com/news/privacy-watchdogs-lament-lack-powers-tim-hortons-probe">enforce privacy regulations</a>.</p>
<p>4) While privacy laws like Québec’s require organizations to appoint a privacy officer, the role must also be fully independent and given the power to enforce compliance with privacy laws if it is to be effective in improving accountability.</p>
<p>5) Policymakers must be more proactive in updating legislation to account for the rapid advances of digital technology. </p>
<p>6) Finally, penalties for non-compliance often pale in comparison to the profits gained from misusing data and the social harms it causes. For example, the U.S. Federal Trade Commission (FTC) imposed <a href="https://www.ftc.gov/news-events/news/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions-facebook">a $5 billion penalty on Facebook</a> (5.8 per cent of its <a href="https://investor.fb.com/investor-news/press-release-details/2021/Facebook-Reports-Fourth-Quarter-and-Full-Year-2020-Results/default.aspx">2020 annual revenue</a>) for its role in the <a href="https://www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram">Cambridge Analytica scandal</a>.</p>
<p>While this fine is the highest ever given by the FTC, it is not representative of the social and political impacts of the scandal and its influence in <a href="https://www.npr.org/2018/03/20/595338116/what-did-cambridge-analytica-do-during-the-2016-election">key political events</a>. In some cases, it may be more profitable for a company to strategically pay a fine for non-compliance. </p>
<p>To make tech giants more responsible with their users’ data, the cost of breaching data privacy must outweigh the potential profits of exploiting consumer data.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Many people have become resigned to the fact that tech companies collect our private data. But policymakers must do more to limit the amount of personal information corporations can collect.
Meiling Fong, PhD Student, Individualized Program, Concordia University
Zeynep Arsel, Concordia University Chair in Consumption, Markets, and Society, Concordia University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/197393 2023-01-18T13:38:56Z
Dozens of US schools, universities move to ban TikTok
<figure><img src="https://images.theconversation.com/files/504510/original/file-20230113-14-datjvf.jpg?ixlib=rb-1.1.0&rect=0%2C6%2C4608%2C3442&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The TikTok social media app has raised concerns about cybersecurity and online safety.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/illustration-tiktok-a-short-video-platform-suqian-jiangsu-news-photo/1245918786">Future Publishing via Getty Images</a></span></figcaption></figure><p>A growing number of public schools and colleges in the U.S. are moving to ban TikTok – the popular Chinese-owned social media app that allows users to share short videos.</p>
<p>They are following the lead of the <a href="https://www.nbcnews.com/tech/tech-news/tiktok-ban-biden-government-college-state-federal-security-privacy-rcna63724">federal government</a> and <a href="https://news.yahoo.com/tiktok-bans-government-devices-raise-222316798.html">several states</a>, which are banning the social media app because <a href="https://www.nbcnews.com/tech/students-question-tiktok-bans-public-universities-rcna62801">authorities believe foreign governments – specifically China – could use the app</a> to spy on Americans.</p>
<p>The app is made by ByteDance, which is based in China and has <a href="https://www.theguardian.com/technology/2022/nov/07/tiktoks-china-bytedance-data-concerns">ties to the Chinese government</a>. </p>
<p><a href="https://www.nbcnews.com/tech/students-question-tiktok-bans-public-universities-rcna62801">The University of Oklahoma, Auburn University in Alabama</a> and <a href="https://www.cnet.com/news/social-media/tiktok-also-banned-by-some-us-universities/">26 public universities and colleges in Georgia</a> have banned the app from campus Wi-Fi networks. <a href="https://www.bestcolleges.com/news/these-colleges-just-banned-tiktok/">Montana’s governor has asked</a> the state’s university system to ban it. </p>
<p>Some K-12 schools have also blocked the app. Public schools in Virginia’s <a href="https://www.fox5dc.com/news/stafford-county-public-schools-blocking-students-access-to-tiktok">Stafford, Prince William and Loudoun counties</a> have banned TikTok on school-issued devices and schools’ Wi-Fi networks. Louisiana’s state superintendent of education recommended that <a href="https://www.wdsu.com/article/louisiana-superintendent-education-tik-tok-ban/42393440">schools in the state remove the app from public devices</a> and <a href="https://www.edweek.org/technology/should-schools-ban-tiktok-louisiana-ed-chief-urges-districts-to-do-it/2023/01#:%7E:text=He%20implored%20districts%20to%20delete,laptops%2C%20a%20department%20spokesman%20added.">block it</a> on school-issued devices. </p>
<p>As a <a href="https://scholar.google.com/citations?user=g-jALEoAAAAJ&hl=en&oi=ao">researcher</a> who specializes in <a href="https://doi.org/10.1080/1097198X.2019.1603527">cybersecurity</a>, I don’t believe these schools are overreacting. TikTok captures user data in a way that is <a href="https://www.theguardian.com/technology/2022/jul/19/tiktok-has-been-accused-of-aggressive-data-harvesting-is-your-information-at-risk">more aggressive than other apps</a>.</p>
<p>The version of TikTok that is raising all these concerns is not available in China itself. In an effort to protect Chinese students from the harmful effects of social media, the Chinese Communist Party has issued a rule that limits the time students can spend on Douyin, the app’s domestic counterpart, to <a href="https://www.voanews.com/a/fbi-says-it-has-national-security-concerns-about-tiktok/6836340.html">40 minutes a day</a>. And they can view only <a href="https://www.voanews.com/a/fbi-says-it-has-national-security-concerns-about-tiktok/6836340.html">videos with a patriotic theme or educational content</a> such as science experiments and museum exhibits.</p>
<h2>Aggressive tactics to capture and harvest user data</h2>
<p>All <a href="https://www.wdsu.com/article/louisiana-superintendent-education-tik-tok-ban/42393440">major social media platforms</a> <a href="https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/">raise privacy concerns and include security risks</a> for users.</p>
<p>But TikTok does more than the rest. Its default privacy settings allow the app to collect much more information than the app needs to actually function. </p>
<p>Every hour, the app accesses users’ <a href="https://www.theguardian.com/technology/2022/jul/19/tiktok-has-been-accused-of-aggressive-data-harvesting-is-your-information-at-risk">contact lists and calendars</a>. It also <a href="https://www.theguardian.com/technology/2022/jul/19/tiktok-has-been-accused-of-aggressive-data-harvesting-is-your-information-at-risk">collects the location of devices</a> used to access the service and can scan hard drives attached to any of those devices. </p>
<p>If a user changes privacy settings to avoid that scrutiny, the app <a href="https://www.theguardian.com/technology/2022/jul/19/tiktok-has-been-accused-of-aggressive-data-harvesting-is-your-information-at-risk">persistently asks for that permission to be restored</a>. Other social networking apps, like Facebook, don’t ask users to revise their privacy settings if they lock down their information.</p>
<p>How TikTok handles the data it collects from users also raises concerns. Ireland’s data protection regulator, for instance, is <a href="https://www.politico.eu/article/eu-leaders-fire-warning-shots-at-tiktok-over-privacy/">investigating possible illegal transfers</a> of European citizens’ data to Chinese servers and potential violations of rules protecting children’s privacy.</p>
<h2>Cybersecurity vulnerabilities</h2>
<p>As <a href="https://businessplus.ie/tech/social-media-lost-user-data/">with other social media services</a>, researchers have found <a href="https://research.checkpoint.com/2020/tik-or-tok-is-tiktok-secure-enough/">serious vulnerabilities</a> with TikTok.</p>
<p>In 2020, cybersecurity company Check Point found that it could send users messages that looked as if they came from TikTok but actually contained malicious links. When users clicked on those links, <a href="https://futurism.com/major-security-flaws-tiktok">Check Point’s researchers could seize control of their TikTok accounts</a>, get access to private information, delete existing content and even post new material under that user’s account.</p>
<p>Hackers have also taken advantage of <a href="https://www.theregister.com/2022/11/29/tiktok_invisible_challenge_malware/">viral TikTok trends to distribute malicious software</a> that creates additional cybersecurity problems. For instance, a trend called the “Invisible Challenge” encouraged users to use a TikTok filter called “Invisible Body” to film themselves naked – assuring users their followers would only see a blurry image, not anything revealing. </p>
<p>Cybercriminals created TikTok videos that claimed they had made software that would reveal users’ nude bodies by reversing the body-masking filter. But the software they encouraged users to download actually just stole people’s <a href="https://www.bleepingcomputer.com/news/security/tiktok-invisible-body-challenge-exploited-to-push-malware/">social media, credit card and cryptocurrency credentials</a> from elsewhere on their phones, as well as files from victims’ computers.</p>
<h2>National security concerns</h2>
<p>Many U.S. lawmakers have objected to <a href="https://www.npr.org/2022/12/22/1144745813/why-the-proposed-tiktok-ban-is-more-about-politics-than-privacy-according-to-exp">the app’s location tracking services</a>, saying it could allow the Chinese government to monitor <a href="https://www.newsweek.com/tiktok-security-concerns-explained-republican-led-states-look-ban-it-1765790">the movements and locations of U.S. citizens</a> – including members of the military or government officials.</p>
<p>If the Chinese government wants information about the <a href="https://www.statista.com/statistics/1100836/number-of-us-tiktok-users/">more than 90 million TikTok users</a>, it does not need to hack anything.</p>
<p>That’s because China’s <a href="https://www.cnbc.com/2019/03/05/huawei-would-have-to-give-data-to-china-government-if-asked-experts.html">2017 National Intelligence Law</a> <a href="https://usa.kaspersky.com/resource-center/preemptive-safety/is-tiktok-safe">requires Chinese companies</a> to <a href="https://www.theguardian.com/technology/2022/jul/19/tiktok-has-been-accused-of-aggressive-data-harvesting-is-your-information-at-risk">share any data they collect if the government asks</a>.</p>
<p>Technology industry observers have also raised concerns that ByteDance, the company that makes TikTok, may be <a href="https://www.newsweek.com/tiktok-owned-controlled-china-communist-party-ccp-influence-1752415">partially owned by the Chinese government</a>.</p>
<p>These problems take on even more importance in the context of the Chinese government’s alleged efforts to build a <a href="https://www.infosecurity-magazine.com/news/chinas-mss-linked-to-marriott/">huge “data lake” of information about all Americans</a>. China has been linked to several large-scale cyberattacks targeting federal employees and U.S. consumers. These attacks include the <a href="https://edition.cnn.com/2017/08/24/politics/fbi-arrests-chinese-national-in-opm-data-breach/index.html">2015 hack of the U.S. Office of Personnel Management</a>, 2017 attacks on the <a href="https://www.csoonline.com/article/3444488/equifax-data-breach-faq-what-happened-who-was-affected-what-was-the-impact.html">consumer credit reporting agency Equifax</a> and the 2018 attack on hotel group <a href="https://www.infosecurity-magazine.com/news/chinas-mss-linked-to-marriott/">Marriott International</a>. </p>
<h2>Negative effects outweighing positive ones?</h2>
<p><a href="https://www.edweek.org/technology/tiktok-gas-twitter-how-social-media-is-influencing-education/2022/12">Teachers and school administrators have used TikTok</a> in some interesting, and useful, ways – such as connecting with students, building relationships, teaching about the risks of social media and delivering small, quick lessons.</p>
<p>But it is not clear whether those positive effects counterbalance the potential and actual harm. In addition to general concerns about <a href="https://doi.org/10.1177/0894439316660340">the possible risks of social media addictions</a>, some school officials say increased TikTok use has <a href="https://www.fox5dc.com/news/stafford-county-public-schools-blocking-students-access-to-tiktok">distracted students from paying attention</a> to teachers.</p>
<p>Also, the app’s algorithm for recommending videos to watch next has increased students’ risk of <a href="https://www.cnn.com/2022/12/15/tech/tiktok-teens-study-trnd/index.html">suicide and eating disorders</a>. The “One Chip Challenge,” which asks TikTok users to eat a single chip containing <a href="https://shop.paqui.com/products/one-chip-challenge">two of the world’s spiciest chili peppers</a>, sent <a href="https://medicalxpress.com/news/2022-10-tiktok-trend-kids-home-sick.html">some students to the hospital</a> and made others sick.</p>
<p>TikTok videos have also led students to <a href="https://www.krgv.com/news/students-destroy-steal-school-property-for-viral-tiktok-challenge/">engage in vandalism</a>. In response to one viral challenge, some students <a href="https://www.cbsnews.com/losangeles/news/viral-trend-on-tiktok-encourages-students-to-damage-school-property-steal/">stole bathroom sinks and soap dispensers</a> from schools. </p>
<p>With all that potential for harm and damage, it’s not surprising school officials are considering a ban on TikTok.</p>
<p class="fine-print"><em><span>Nir Kshetri does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>School officials are becoming increasingly wary of TikTok amid concerns that the app poses a risk to student safety and privacy and makes the nation vulnerable to spies.</em></p>
<p><em>Nir Kshetri, Professor of Management, University of North Carolina – Greensboro. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Even if TikTok and other apps are collecting your data, what are the actual consequences?</h1>
<p><em>Published 2022-07-20</em></p>
<figure><img src="https://images.theconversation.com/files/475053/original/file-20220720-27-dzfe0b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>By now, most of us are aware social media companies collect vast amounts of our information. By doing this, they can target us with ads and monetise our attention. The latest chapter in the data-privacy debate concerns one of the world’s most popular apps among young people – TikTok. </p>
<p>Yet anecdotally it seems the potential risks aren’t really something young people care about. Some were <a href="https://twitter.com/theprojecttv/status/1548962230741487617">interviewed</a> by The Project this week regarding the risk of their TikTok data being accessed from China. </p>
<p>They said it wouldn’t stop them using the app. “Everyone at the moment has access to everything,” one person said. Another said they didn’t “have much to hide from the Chinese government”. </p>
<p>Are these fair assessments? Or should Australians actually be worried about yet another social media company taking their data? </p>
<h2>What’s happening with TikTok?</h2>
<p>In a 2020 Australian parliamentary hearing on foreign interference through social media, TikTok representatives <a href="https://www.aph.gov.au/Parliamentary_Business/Hansard/Hansard_Display?bid=committees/commsen/1a5e6393-fec4-4222-945b-859e3f8ebd17/&sid=0002">stressed</a>: “TikTok Australia data is stored in the US and Singapore, and the security and privacy of this data are our highest priority.”</p>
<p>But as Australian Strategic Policy Institute (ASPI) analyst Fergus Ryan has <a href="https://www.aspistrategist.org.au/its-time-tiktok-australia-came-clean/">observed</a>, it’s not about where the data are <em>stored</em>, but who has <em>access</em>. </p>
<p>On June 17, BuzzFeed published a <a href="https://www.buzzfeednews.com/article/emilybakerwhite/tiktok-tapes-us-user-data-china-bytedance-access">report</a> based on 80 leaked internal TikTok meetings which seemed to confirm access to US TikTok data by Chinese actors. The report refers to multiple examples of data access by TikTok’s parent company ByteDance, which is based in China. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/concerns-over-tiktok-feeding-user-data-to-beijing-are-back-and-theres-good-evidence-to-support-them-186211">Concerns over TikTok feeding user data to Beijing are back – and there's good evidence to support them</a>
</strong>
</em>
</p>
<hr>
<p>Then in July, TikTok Australia’s director of public policy, Brent Thomas, wrote to the shadow minister for cyber security, James Paterson, regarding China’s access to Australian user data.</p>
<p>Thomas denied having been asked for data from China or having “given data to the Chinese government” – but he also noted access is “based on the need to access data”. So there’s good reason to believe Australian users’ data <em>may</em> be accessed from China.</p>
<h2>Is TikTok worse than other platforms?</h2>
<p>TikTok collects rich consumer information, including personal information and behavioural data from people’s activity on the app. In this respect, it’s no different from other social media companies. </p>
<p>They all need oceans of user data to push ads onto us, and run data analytics behind a shiny facade of cute cats and trendy dances. </p>
<p>However, TikTok’s corporate roots extend to authoritarian China – and not the US, where most of our other social media come from. This carries implications for TikTok users.</p>
<p>Since TikTok moderates content according to Beijing’s foreign policy goals, it is possible the platform could apply similar censorship controls over Australian users. </p>
<p>This would mean users’ feeds are filtered to omit anything that doesn’t fit the Chinese government’s agenda, such as support for Taiwan’s sovereignty. One such technique is “shadowbanning”, in which a user’s posts appear to have been published to the user themselves, but are not visible to anyone else. </p>
<p>It’s worth noting this censorship risk isn’t hypothetical. In 2019, information about Hong Kong protests was reported to have been <a href="https://www.theguardian.com/technology/2019/sep/25/revealed-how-tiktok-censors-videos-that-do-not-please-beijing">censored</a> not only on Douyin, China’s domestic version of TikTok, but also on TikTok itself. </p>
<p>Then in 2020, ASPI <a href="https://www.aspi.org.au/report/tiktok-wechat">found</a> hashtags related to LGBTQ+ issues were suppressed in at least eight languages on TikTok. In response to ASPI’s research, a TikTok spokesperson said the hashtags may be restricted as part of the company’s localisation strategy and due to local laws.</p>
<p>In Thailand, keywords such as #acab, #gayArab and anti-monarchy hashtags were found to be shadowbanned. </p>
<p>Within China, Douyin complies with strict national content regulation. This includes censoring information about the religious movement Falun Gong and the Tiananmen massacre, among other examples. </p>
<p>The legal environment in China forces Chinese internet product and service providers to work with government authorities. If Chinese companies disagree, or are unaware of their obligations, they can face legal and financial penalties and be forcibly shut down. </p>
<p>In 2018, another social media product run by ByteDance’s founder, Yiming Zhang, was forced to close. Zhang fell into political line in a <a href="https://chinamediaproject.org/2018/04/11/tech-shame-in-the-new-era/">public apology</a>, acknowledging the platform had deviated from “public opinion guidance” by not moderating content that went against “socialist core values”. </p>
<p>Individual TikTok users should seriously consider leaving the app until issues of global censorship are clearly addressed.</p>
<h2>But don’t forget, it’s not just TikTok</h2>
<p>Meta products, such as Facebook and Instagram, also measure our interests by the seconds we spend looking at certain posts. They aggregate those behavioural data with our personal information to try to keep us hooked – looking at ads for as long as possible. </p>
<p><a href="https://www.aclu.org/news/privacy-technology/holding-facebook-accountable-for-digital-redlining">Some real cases</a> of targeted advertising on social media have contributed to “digital redlining” – the use of technology to perpetuate social discrimination. </p>
<p>In 2018, Facebook came under fire for showing some employment ads only to men. In 2019, it settled another digital redlining <a href="https://www.theguardian.com/technology/2019/mar/28/facebook-ads-housing-discrimination-charges-us-government-hud">case</a> over discriminatory practices in which housing ads were targeted to certain users on the basis of “race, colour, national origin and religion”. </p>
<p>And in 2021, before the US Capitol breach, military and defence product ads <a href="https://www.buzzfeednews.com/article/ryanmac/facebook-profits-military-gear-ads-capitol-riot">were running</a> alongside conversations about a coup. </p>
<p>Then there are some worst-case scenarios. The 2018 Cambridge Analytica scandal <a href="https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html">revealed</a> how Meta (then Facebook) exposed users’ data to the political consulting firm Cambridge Analytica without their consent. </p>
<p>Cambridge Analytica harvested up to 87 million users’ data from Facebook, derived psychological user profiles and used these to tailor pro-Trump messaging to them. This likely had an influence on the 2016 US presidential election. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A phone shows a TikTok video playing on the screen, with a person mid-dance." src="https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/475064/original/file-20220720-19-dzfe0b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">To what extent are we willing to ignore potential risks with social platforms, in favour of addictive content?</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>With TikTok, the most immediate concern for the average Australian user is content censorship – not direct prosecution. But within China, there are recurring instances of Chinese nationals being <a href="https://www.scmp.com/news/china/politics/article/3176605/crackdown-chinas-moderate-rights-voices-how-tweets-are-now">detained or even jailed</a> for using both Chinese and international social media. </p>
<p>You can see how the consequences of mass data harvesting are not hypothetical. We need to demand more transparency from not just TikTok but all major social platforms regarding how data are used.</p>
<p>Let’s continue the <a href="https://www.afr.com/policy/foreign-affairs/tiktok-s-privacy-fundamentally-incompatible-with-australia-20220713-p5b18l">regulation debate</a> TikTok has accelerated. We should look to update privacy protections and embed transparency into Australia’s national regulatory guidelines – for whatever the next big social media app happens to be.</p>
<p class="fine-print"><em><span>Ausma Bernot does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>It’s pretty common to find people who are apathetic about their data being harvested and funnelled into unknown corners. But that’s usually because they don’t know what’s at stake.</em></p>
<p><em>Ausma Bernot, PhD Candidate, Griffith University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Browser cookies make people more cautious online, study finds</h1>
<p><em>Published 2022-07-05</em></p>
<figure><img src="https://images.theconversation.com/files/471474/original/file-20220628-14476-ro18kw.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C8256%2C5487&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cookie notifications have become a ubiquitous aspect of online life.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/november-2021-lower-saxony-oldenburg-a-person-has-a-mobile-news-photo/1236887868">Mohssen Assanimoghaddam/picture alliance via Getty Images</a></span></figcaption></figure><p>Website cookies are online surveillance tools, and the commercial and government entities that use them would <a href="https://techcrunch.com/2015/08/21/agree-to-disagree/">prefer people not read</a> those notifications too closely. People who do read the notifications carefully will find that they have the option to say no to some or all cookies.</p>
<p>The problem is that, without careful attention, those notifications become an annoyance and a subtle reminder that your online activity can be tracked. </p>
<p>As a researcher who <a href="https://scholar.google.com/citations?user=7cJhUEkAAAAJ&hl=en">studies online surveillance</a>, I’ve found that these notifications can themselves trigger negative emotions and affect what people do online. </p>
<h2>How cookies work</h2>
<p>Browser cookies are not new. <a href="https://qz.com/2000350/the-inventor-of-the-digital-cookie-has-some-regrets/">They were developed in 1994</a> by a Netscape programmer in order to optimize browsing experiences by exchanging users’ data with specific websites. These small text files allowed websites to remember your passwords for easier logins and keep items in your virtual shopping cart for later purchases. </p>
<p>But over the past three decades, cookies have evolved to track users across websites and devices. This is how items in your Amazon shopping cart on your phone can be used to tailor the ads you see on Hulu and Twitter on your laptop. One study found that 35 of 50 popular websites <a href="https://theconversation.com/cookies-i-looked-at-50-well-known-websites-and-most-are-gathering-our-data-illegally-176203">use website cookies illegally</a>. </p>
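The basic mechanics can be sketched with Python’s standard-library <code>http.cookies</code> module. This is only an illustration: the cookie names and values below are hypothetical, not those of any real site.

```python
# A minimal sketch of how a server sets and later reads cookies,
# using only Python's standard library. All names and values are hypothetical.
from http.cookies import SimpleCookie

# The server attaches cookies to a response via Set-Cookie headers.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"            # e.g. keeps you logged in
cookie["session_id"]["httponly"] = True    # not readable by page scripts
cookie["tracker_id"] = "u-42"              # the kind of ID ad networks reuse
cookie["tracker_id"]["samesite"] = "None"  # allows cross-site (third-party) use
cookie["tracker_id"]["secure"] = True      # SameSite=None requires Secure

print(cookie.output())  # the raw Set-Cookie header lines a browser would store

# On every later request, the browser echoes the pairs back in a single
# Cookie header, which the server parses to recognise the same visitor.
echoed = SimpleCookie("session_id=abc123; tracker_id=u-42")
print(echoed["tracker_id"].value)
```

It is the second half of the exchange, the browser silently echoing <code>tracker_id</code> back on every request, that lets a third party correlate a person’s visits across different sites and devices.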
<p>European regulations <a href="https://gdpr.eu/cookies/">require websites to receive your permission</a> before using cookies. You can avoid this type of third-party tracking by carefully reading platforms’ privacy policies and opting out of cookies, but people generally aren’t doing that.</p>
<p><a href="https://doi.org/10.1080/1369118X.2018.1486870">One study</a> found that, on average, internet users spend just 13 seconds reading a website’s terms of service statements before they consent to cookies and other outrageous terms, which in that study included exchanging their first-born child for service on the platform.</p>
<p>These terms-of-service provisions are cumbersome and intended to create friction.</p>
<p><a href="https://www.degruyter.com/document/doi/10.23943/9781400890057/html">Friction</a> is a technique used to slow down internet users, either to maintain governmental control or reduce customer service loads. Autocratic governments that want to maintain control via state surveillance without jeopardizing their public legitimacy frequently use this technique. Friction involves building frustrating experiences into website and app design so that users who are trying to avoid monitoring or censorship become so inconvenienced that they ultimately give up. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/HFyaW50GFOs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Browser cookies explained.</span></figcaption>
</figure>
<h2>How cookies affect you</h2>
<p>My newest research sought to understand how website cookie notifications are used in the U.S. <a href="https://doi.org/10.1080/19331681.2022.2063215">to create friction and influence user behavior</a>.</p>
<p>To do this research, I looked to the concept of mindless compliance, an idea made infamous by Yale psychologist Stanley Milgram. <a href="https://www.simplypsychology.org/milgram.html">Milgram’s experiments</a> – now considered a radical breach of research ethics – asked participants to administer electric shocks to fellow study takers in order to test obedience to authority. </p>
<p>Milgram’s research demonstrated that people often consent to a request by authority without first deliberating on whether it’s the right thing to do. I suspected something similar, though far more routine, was happening with website cookies. </p>
<p>I conducted a large, nationally representative experiment that presented users with a boilerplate browser cookie pop-up message, similar to one you may have encountered on your way to read this article. </p>
<p>I evaluated whether the cookie message triggered an emotional response – either anger or fear, which are both expected responses to online friction. And then I assessed how these cookie notifications influenced internet users’ willingness to express themselves online. </p>
<p>Online expression is central to democratic life, and <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2959611">various types of internet monitoring are known to suppress it</a>. </p>
<p>The results showed that cookie notifications triggered strong feelings of anger and fear, suggesting that website cookies are no longer perceived as the helpful online tool they were designed to be. Instead, they are a hindrance to accessing information and making informed choices about one’s privacy permissions.</p>
<p>And, as suspected, cookie notifications also reduced people’s stated desire to express opinions, search for information and go against the status quo. </p>
<h2>Cookie solutions</h2>
<p>Legislation regulating cookie notifications like the <a href="https://gdpr.eu/">EU’s General Data Protection Regulation </a> and <a href="https://oag.ca.gov/privacy/ccpa">California Consumer Privacy Act</a> were designed with the public in mind. But notification of online tracking is creating an unintentional boomerang effect. </p>
<p>There are three design choices that could help. First, making consent to cookies more mindful, so people are more aware of which data will be collected and how it will be used. This will involve changing the default of website cookies from opt-out to opt-in so that people who want to use cookies to improve their experience can voluntarily do so. </p>
<p>Second, cookie permissions change regularly, and what data is being requested and how it will be used should be front and center.</p>
<p>And third, U.S. internet users should possess the right to be forgotten, or the right to remove online information about themselves that is harmful or not used for its original intent, including the data collected by tracking cookies. This right is granted under the General Data Protection Regulation but does not extend to U.S. internet users.</p>
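The first of those design choices, flipping the default from opt-out to opt-in, amounts to a small change in how a site initialises its consent state. A minimal sketch, in which all class and category names are hypothetical:

```python
# Hypothetical sketch of an opt-in consent default: every non-essential
# cookie category starts disabled and is only enabled by an explicit choice.
from dataclasses import dataclass

@dataclass
class CookieConsent:
    strictly_necessary: bool = True   # the site cannot function without these
    analytics: bool = False           # opt-in default: off until granted
    advertising: bool = False         # opt-in default: off until granted

    def grant(self, category: str) -> None:
        # Only pre-declared, non-essential categories can be toggled on.
        if category in ("analytics", "advertising"):
            setattr(self, category, True)

consent = CookieConsent()
assert not consent.advertising   # no tracking before a deliberate choice
consent.grant("analytics")       # the user mindfully opts in to one category
assert consent.analytics and not consent.advertising
```

Under an opt-out default, the two tracking categories would instead start as <code>True</code>, which is precisely the “mindless compliance” arrangement the research above describes.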
<p>In the meantime, I recommend that people read the terms and conditions of cookie use and accept only what’s necessary.</p>
<p class="fine-print"><em><span>Elizabeth Stoycheff has received funding from WhatsApp and Facebook for other endeavors, but that has no bearing on these research findings.</span></em></p>
<p><em>Cookie notifications remind people that they are being tracked, which affects how people behave online.</em></p>
<p><em>Elizabeth Stoycheff, Associate Professor of Communication, Wayne State University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Insurance firms can skim your online data to price your insurance — and there’s little in the law to stop this</h1>
<p><em>Published 2022-06-19</em></p>
<figure><img src="https://images.theconversation.com/files/469391/original/file-20220617-24-txo2j0.jpeg?ixlib=rb-1.1.0&rect=58%2C69%2C7684%2C5084&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>What if your insurer was tracking your online data to price your car insurance? Seems far-fetched, right? </p>
<p>Yet there is predictive value in the digital traces we leave online. And insurers may use data collection and analytics tools to find our data and use it to price insurance services. </p>
<p>For instance, <a href="https://pubmed.ncbi.nlm.nih.gov/27849366/">some</a> <a href="https://www.researchgate.net/publication/350525424_Smartphone_Operating_System_Preference_Based_On_Different_Personality_Lifestyle_Traits_Of_The_Consumer">studies</a> <a href="https://www.nber.org/papers/w24771#fromrss">have</a> found a correlation between whether an individual uses an Apple or Android phone and their likelihood of exhibiting certain personality traits. </p>
<p>In one example, US insurance broker Jerry analysed the driving behaviour of some 20,000 people to conclude Android users are <a href="https://getjerry.com/studies/sorry-iphone-fans-android-users-are-safer-drivers">safer drivers</a> than iPhone users. What’s stopping insurers from referring to such reports to price their insurance?</p>
<p>Our latest <a href="https://www.sciencedirect.com/science/article/abs/pii/S0267364922000152">research</a> shows Australian consumers have no real control over how data about them, and posted by them, might be collected and used by insurers. </p>
<p>Looking at several examples from customer loyalty schemes and social media, we found insurers can access vast amounts of consumer data under Australia’s <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">weak privacy laws</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A person's hands are visible holding an Apple phone on the left (screen facing forward), and a generic Android on the right." src="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How would you feel if a detail as menial as the brand of your phone was used to price your car insurance?</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Your data is already out there</h2>
<p>Insurers are already using big data to price consumer insurance through personalised pricing, according to evidence gathered by industry regulators in the <a href="https://www.fca.org.uk/publication/feedback/fs16-05.pdf">United Kingdom</a>, <a href="https://register.eiopa.europa.eu/Publications/EIOPA_BigDataAnalytics_ThematicReview_April2019.pdf">European Union</a> and <a href="https://www.dfs.ny.gov/industry_guidance/circular_letters/cl2019_01">United States</a>.</p>
<p>Consumers often “agree” to all kinds of data collection and privacy policies, such as those used in loyalty schemes (who doesn’t like freebies?) and by social media companies. But they have no control over how their data are used once they’re handed over.</p>
<p>There are far-reaching inferences that can be drawn from data collected through loyalty programs and social media platforms – and these may be uncomfortable, or even highly sensitive.</p>
<p>Researchers using data analytics and machine learning have claimed to build models that can guess a person’s sexual orientation from pictures of <a href="https://osf.io/zn79k/">their face</a>, or their suicidal tendencies from <a href="https://www.sciencedirect.com/science/article/pii/S2214782915000160">posts on Twitter</a>.</p>
<p>Think about all the details revealed from a grocery shopping history alone: diet, household size, addictions, health conditions and social background, among others. In the case of social media, a user’s posts, pictures, likes, and links to various groups can be used to draw a precise picture of that individual.</p>
<p>What’s more, Australia has a <a href="https://www.cdr.gov.au/">Consumer Data Right</a>, which already requires banks to share consumers’ banking data (at the consumer’s request) with another bank or app, such as to access a new service or offer. </p>
<p>The regime is actively being expanded to other parts of the economy including the energy sector, with the idea being competitors could use information on energy usage to make competitive offers. </p>
<p>The Consumer Data Right is advertised as <a href="https://www.cdr.gov.au/">empowering</a> for consumers – enabling access to new services and offers, and providing people with choice, convenience and control over their data. </p>
<p>In practice, however, it means insurance firms accredited under the program can require you to share your banking data in exchange for insurance services.</p>
<p>The previous Coalition government also <a href="https://ministers.treasury.gov.au/ministers/jane-hume-2020/media-releases/more-power-compare-and-switch-telco-providers-and-share">proposed “open finance”</a>, which would expand the Consumer Data Right to include access to your insurance and superannuation data. This hasn’t happened yet, but it’s likely the new Albanese government will look into it.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/soft-terms-like-open-and-sharing-dont-tell-the-true-story-of-your-data-95521">Soft terms like 'open' and 'sharing' don't tell the true story of your data</a>
</strong>
</em>
</p>
<hr>
<h2>Why more data in insurers’ hands may be bad news</h2>
<p>There are plenty of reasons to be concerned about insurers collecting and using increasingly detailed data about people for insurance pricing and claims management. </p>
<p>For one, large-scale data collection provides incentives for cyber attacks. Even if data is held in anonymised form, it can be <a href="https://techcrunch.com/2019/07/24/researchers-spotlight-the-lie-of-anonymous-data/">re-identified</a> with the right tools. </p>
<p>Also, insurers may be able to infer (or at least think they can infer) facts about an individual which they want to keep private, such as their sexual orientation, <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=794d21176668">pregnancy</a> status or religious beliefs. </p>
<p>There’s plenty of evidence the outputs of artificial intelligence tools employed in mass data analytics can be inaccurate and discriminatory. Insurers’ decisions may then be based on misleading or untrue data. And these tools are so complex it’s often difficult to work out if, or where, errors or bias are present.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A magnifying glass hovers over a Facebook post's likes" src="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Each day, people post personal information online. And much of it can be easily accessed by others.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Although insurers are meant to pool risk and compensate the unlucky, some might use data to only offer affordable insurance to very low-risk people. Vulnerable consumers may face <a href="https://actuaries.logicaldoc.cloud/download-ticket?ticketId=09c77750-aa90-4ba9-835e-280ae347487b">exclusion</a>. </p>
<p>A more widespread use of data, especially via the Consumer Data Right, will particularly disadvantage those who are unable or unwilling to share data with insurers. These people may be low risk, but if they can’t or won’t prove this, they’ll have to pay more than a fair price for their insurance cover. </p>
<p>They may even pay more than what they would have in a pre-Consumer Data Right world. So insurance may move <em>further</em> from a fair price when more personal data are available to insurance firms. </p>
<h2>We need immediate action</h2>
<p>Our <a href="http://www5.austlii.edu.au/au/journals/SydLawRw/2021/20.html">previous research</a> demonstrated that apart from anti-discrimination laws, there are inadequate constraints on how insurers are allowed to use consumers’ data, such as those taken from online sources. </p>
<p>The more insurers base their assessments on data a consumer didn’t directly provide, the harder it will be for that person to understand how their “riskiness” is being assessed. If an insurer requests your transaction history from the last five years, would you know what they are looking for? Such problems will be exacerbated by the expansion of the Consumer Data Right.</p>
<p>Interestingly, insurance firms themselves might <a href="https://www.nature.com/news/can-we-open-the-black-box-of-ai-1.20731">not know</a> how collected data translates into risk for a specific consumer. If their approach is to simply feed data into a complex and opaque artificial intelligence system, all they’ll know is they’re getting a supposedly “better” risk assessment with more data.</p>
<p>Recent <a href="https://theconversation.com/bunnings-kmart-and-the-good-guys-say-they-use-facial-recognition-for-loss-prevention-an-expert-explains-what-it-might-mean-for-you-185126">reports</a> of retailers collecting shopper data for facial recognition have highlighted how important it is for the Albanese government to urgently reform <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">our privacy laws</a>, and take a close look at other data laws, including proposals to <a href="https://treasury.gov.au/review/statutory-review-consumer-data-right">expand the Consumer Data Right</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/hidden-costs-manipulation-forced-continuity-report-reveals-how-australian-consumers-are-being-duped-online-184450">Hidden costs, manipulation, forced continuity: report reveals how Australian consumers are being duped online</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Zofia Bednarz receives funding from the Australian Research Council Centre of Excellence on Automated Decision-Making and Society. </span></em></p><p class="fine-print"><em><span>Kayleen Manwaring receives funding from the UNSW Allens Hub for Technology, Law & Innovation. </span></em></p><p class="fine-print"><em><span>Kimberlee Weatherall receives funding from the Australian Research Council. She is a Chief Investigator with the ARC Centre of Excellence on Automated Decision-Making and Society, and a Fellow with the Gradient Institute.</span></em></p>There’s little transparency surrounding how insurance firms collect, analyse and use our personal data when they establish insurance costs.Zofia Bednarz, Lecturer in Commercial Law, University of SydneyKayleen Manwaring, Senior Research Fellow, UNSW Allens Hub for Technology, Law & Innovation and Senior Lecturer, School of Private & Commercial Law, UNSW Law & Justice, UNSW SydneyKimberlee Weatherall, Professor of Law, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1821342022-05-02T03:40:34Z2022-05-02T03:40:34ZACCC says consumers need more choices about what online marketplaces are doing with their data<figure><img src="https://images.theconversation.com/files/460704/original/file-20220502-15-3s0and.jpg?ixlib=rb-1.1.0&rect=0%2C8%2C2731%2C1524&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Consumers using online retail marketplaces such as eBay and Amazon “have little effective choice in the amount of data they share”, according to the <a href="https://www.accc.gov.au/publications/serial-publications/digital-platform-services-inquiry-2020-2025/digital-platform-services-inquiry-march-2022-interim-report">latest report</a> of the Australian Competition & Consumer Commission (ACCC) Digital Platform Services Inquiry. </p>
<p>Consumers may benefit from personalisation and recommendations in these marketplaces based on their data, but many are in the dark about how much personal information these companies collect and share for other purposes. </p>
<p><a href="https://www.accc.gov.au/media-release/concerning-issues-for-consumers-and-sellers-on-online-marketplaces">ACCC chair Gina Cass-Gottlieb</a> said:</p>
<blockquote>
<p>We believe consumers should be given more information about, and control over, how online marketplaces collect and use their data. </p>
</blockquote>
<p>The report reiterates the ACCC’s earlier calls for amendments to the Australian Consumer Law to address unfair data terms and practices. It also points out that the government is considering <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">proposals for major changes to privacy law</a>. </p>
<p>However, none of these proposals is likely to come into effect in the near future. In the meantime, we should also consider whether practices such as obtaining information about users from third-party data brokers are fully compliant with existing privacy law. </p>
<h2>Why did the ACCC examine online marketplaces?</h2>
<p>The ACCC examined competition and consumer issues associated with “general online retail marketplaces” as part of its <a href="https://www.accc.gov.au/focus-areas/inquiries-ongoing/digital-platform-services-inquiry-2020-2025">five-year Digital Platform Services Inquiry</a>. </p>
<p>These marketplaces facilitate transactions between third-party sellers and consumers on a common platform. They do not include retailers that don’t operate marketplaces, such as Kmart, or platforms such as Gumtree that carry classified ads but don’t allow transactions.</p>
<p>The ACCC report focuses on the four largest online marketplaces in Australia: Amazon Australia, Catch, eBay Australia and Kogan. In 2020–21, these four carried sales totalling $8.4 billion.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/460716/original/file-20220502-18-4pvx0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Online marketplaces such as Amazon, eBay, Catch and Kogan facilitate transactions between third-party buyers and sellers.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/new-york-usa-november-1-2018-1219079038">Shutterstock</a></span>
</figcaption>
</figure>
<p>According to the report, eBay has the largest sales of these companies. Amazon Australia is the second-largest and the fastest-growing, with an 87% increase in sales over the past two years.</p>
<p>The ACCC examined:</p>
<ul>
<li>the state of competition in the relevant markets</li>
<li>issues facing sellers who depend on selling their products through these marketplaces</li>
<li>consumer issues including concerns about personal information collection, use and sharing.</li>
</ul>
<h2>Consumers don’t want their data used for other purposes</h2>
<p>The ACCC expressed concern that in online marketplaces, “the extent of data collection, use and disclosure … often does not align with consumer preferences”. </p>
<p>The Commission pointed to surveys about <a href="https://www.accc.gov.au/system/files/Consumer%20Policy%20Research%20Centre%20%28CPRC%29%20%2818%20August%202021%29.pdf">Australian consumer attitudes to privacy</a> which indicate:</p>
<ul>
<li>94% did not feel comfortable with how digital platforms including online marketplaces collect their personal information</li>
<li>92% agreed that companies should only collect information they need for providing their product or service</li>
<li>60% considered it very or somewhat unacceptable for their online behaviour to be monitored for targeted ads and offers.</li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-one-simple-rule-change-could-curb-online-retailers-snooping-on-you-166174">How one simple rule change could curb online retailers' snooping on you</a>
</strong>
</em>
</p>
<hr>
<p>However, the four online marketplaces analysed: </p>
<ul>
<li>do not proactively present privacy terms to consumers “throughout the purchasing journey”</li>
<li>may allow advertisers or other third parties to place tracking cookies on users’ devices</li>
<li>do not clearly identify how consumers can opt out of cookies while still using the marketplace.</li>
</ul>
<p>Some of the marketplaces also obtain extra data about individuals from third-party data brokers or advertisers.</p>
<p>The <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3432769">harms from increased tracking and profiling</a> of consumers include decreased privacy; manipulation based on detailed profiling of traits and weaknesses; and discrimination or exclusion from opportunities. </p>
<h2>Limited choices: you can’t just ‘walk out of a store’</h2>
<p>Some might argue that consumers must not actually care that much about privacy if they keep using these companies, but the choice is not so simple. </p>
<p>The ACCC notes the relevant privacy terms are often spread across multiple web pages and offered on a “take it or leave it” basis. </p>
<p>The terms also use “bundled consents”. This means that agreeing to the company using your data to fill your order, for example, may be bundled together with agreeing for the company to use your data for its separate advertising business. </p>
<p>Further, as my research has shown, there is <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">so little competition on privacy</a> between these marketplaces that consumers can’t just find a better offer. The ACCC agrees:</p>
<blockquote>
<p>While consumers in Australia can choose between a number of online marketplaces, the common approaches and practices of the major online marketplaces to data collection and use mean that consumers have little effective choice in the amount of data they share.</p>
</blockquote>
<p>Consumers also seem unable to require these companies to delete their data. The situation is quite different from conventional retail interactions where a consumer can select “unsubscribe” or walk out of a store. </p>
<h2>Does our privacy law currently permit all these practices?</h2>
<p>The ACCC has reiterated its earlier calls to amend the Australian Consumer Law to prohibit unfair practices and make unfair contract terms illegal. (At present unfair contract terms are just void, or unenforceable.)</p>
<p>The report also points out that the government is considering proposals for major changes to privacy law, but <a href="https://theconversation.com/a-new-proposed-privacy-code-promises-tough-rules-and-10-million-penalties-for-tech-giants-170711">these changes</a> are uncertain and may take more than a year to come into effect. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-new-proposed-privacy-code-promises-tough-rules-and-10-million-penalties-for-tech-giants-170711">A new proposed privacy code promises tough rules and $10 million penalties for tech giants</a>
</strong>
</em>
</p>
<hr>
<p>In the meantime, we should look more closely at the practices of these marketplaces under current privacy law. </p>
<p>For example, under the <a href="https://www.legislation.gov.au/Series/C2004A03712">federal Privacy Act</a> the four marketplaces</p>
<blockquote>
<p>must collect personal information about an individual only from the individual unless … it is unreasonable or impracticable to do so.</p>
</blockquote>
<p>However, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">some online marketplaces</a> say they collect information about individual consumers’ interests and demographics from “<a href="https://www.ebay.com.au/help/policies/member-behaviour-policies/user-privacy-notice-privacy-policy?id=4260&mkevt=1&mkcid=1&mkrid=705-53470-19255-0&campid=5338596835&customid=&toolid=10001#section4">data providers</a>” and <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=202075050&ref_=footer_iba">other third parties</a>. </p>
<p>We don’t know the full detail of what’s collected, but demographic information might include our age range, income, or family details. </p>
<p>How is it “unreasonable or impracticable” to obtain information about our demographics and interests directly from us? Consumers could ask online marketplaces this question, and complain to the <a href="https://www.oaic.gov.au/privacy/privacy-complaints">Office of the Australian Information Commissioner</a> if there is no reasonable answer.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.</span></em></p>Consumers should have more control over how online marketplaces such as eBay and Amazon collect and use their data, according to a new ACCC report.Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1775402022-02-21T19:07:44Z2022-02-21T19:07:44ZMandatory logins for ABC iview could open an intimate window onto your life<figure><img src="https://images.theconversation.com/files/447462/original/file-20220221-17-1nx6zk7.jpg?ixlib=rb-1.1.0&rect=5%2C0%2C1272%2C640&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">ABC</span></span></figcaption></figure><p>Last week, the ABC <a href="https://about.abc.net.au/press-releases/%E2%80%AFabc%E2%80%AFiview%E2%80%AFlogin-to-watch/">announced</a> it will begin to track the viewing habits of all users of its <a href="https://iview.abc.net.au">iview streaming platform</a> from March 15. This will be done by making users create an account and log in to watch shows and “benefit from the next stage of personalised services” such as “program recommendations [and] watchlists”.</p>
<p>The change was initially planned for the middle of last year, but was <a href="https://www.innovationaus.com/abc-quietly-delays-iview-login-plans-data-sharing/">delayed after heavy criticism</a> from privacy experts and others over the proposed arrangements for sharing and recording data. One point of contention was the ABC’s plans to share viewer data with Facebook and Google.</p>
<p>The ABC <a href="https://about.abc.net.au/statements/abc%E2%80%AFiview%E2%80%AFlogin-to-watch-faqs/">says</a> “significant work has been undertaken in providing effective privacy controls” during this delay. But nevertheless, <a href="https://www.salingerprivacy.com.au/2022/01/06/the-abcs-of-privacy/">critics maintain</a> the new provisions still involve sharing and using data without full consent. </p>
<p>So how concerned should we be about our privacy here?</p>
<h2>All your data are belong to us</h2>
<p>For years we’ve known organisations such as Google and Facebook are collecting data on every search and social media post we make, and every website we visit. </p>
<p>Often the argument for collecting these data is similar to that used by the ABC: that collecting it provides for more personalised recommendations and a better user experience. However, tech companies also make billions using these data to sell personalised ads (and sometimes by selling the actual data).</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-ugly-truth-tech-companies-are-tracking-and-misusing-our-data-and-theres-little-we-can-do-127444">The ugly truth: tech companies are tracking and misusing our data, and there's little we can do</a>
</strong>
</em>
</p>
<hr>
<p>They’re not the only ones keeping tabs on us. Loyalty cards such as Woolworths Everyday Rewards or Coles Flybuys do the same thing: tracking your purchases, adding them to a database, and mining them for information about your life. </p>
<p>If you buy 10 cans of cat food a fortnight, you probably have two cats. If you suddenly start buying 15, you’ve probably acquired a third. </p>
<p>Nappies, baby formula and baby food reveal how many kids you have, how old they are, and how they’re growing up. The ratio of Tim Tams to bread and milk can give clues as to your level of disposable income. </p>
<p>Despite this, millions of Australians scan these cards every day. It’s hard to know if they’ve fully weighed the pros and cons, or just never really thought about them.</p>
<h2>A healthy fear of your shadow (profile)</h2>
<p>So how much should we care about this? And how much do we? </p>
<p>When I put these questions to my students in an undergraduate class on Information Technology & Society, they mostly respond that if they’re doing nothing wrong then they have no reason to care if major corporations know what they eat for breakfast.</p>
<p>Older “mature age” students tend to feel differently, often raising concerns about what the data are used for, both now and potentially in the future. Older students may have had negative experiences with data, such as having a home loan declined over a credit report, while younger people may not look so far ahead.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=510&fit=crop&dpr=1 600w, https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=510&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=510&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=640&fit=crop&dpr=1 754w, https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=640&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/447473/original/file-20220221-14-8dly68.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=640&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Data recorded today may be used for other purposes in the future.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Indeed, organisations like Electronic Frontiers Australia have argued this type of data collection can be <a href="https://www.efa.org.au/2014/11/06/ethics-big-data/">a slippery slope to profiling and bias</a>, with organisations using this to choose who should receive particular services or assistance. </p>
<p>The ever-growing collection of data comes at the same time as government moves to centralise their databases under the banner of <a href="https://my.gov.au">myGov</a>, tying all government services to Medicare or tax file numbers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-chinas-social-credit-system-coming-to-australia-117095">Is China's social credit system coming to Australia?</a>
</strong>
</em>
</p>
<hr>
<p>We are still a long way from a dystopian situation like China’s social credit system, where all our behaviour feeds into a rating system that determines our access to services and housing, but these moves could make one easier to implement in future. </p>
<h2>How enjoying Q+A might raise tricky questions</h2>
<p>Which brings us back to the ABC and its plan to require every user to create a profile and log into its service. The main question here is the same one to ask when using a Flybuys card or creating a new social media account. </p>
<p>Do the benefits of sharing these data (with the ABC in this case), namely personal recommendations, watch lists and, indeed, the ability to access the service at all, outweigh our concerns about what our data will ultimately be used for?</p>
<p>And when we ask this question, it helps to think in very broad terms. While in this case we’re just talking about viewing history and watch time, it’s not too dissimilar to cat food and nappies when you think about it. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-privacy-paradox-we-claim-we-care-about-our-data-so-why-dont-our-actions-match-143354">The privacy paradox: we claim we care about our data, so why don't our actions match?</a>
</strong>
</em>
</p>
<hr>
<p>Significant amounts of information could be inferred from our viewing habits: everything from our political leanings to our attention span. What that can then be used for is anyone’s guess.</p>
<p>That’s not to say you shouldn’t create an account, but rather that you need to go in with your eyes wide open. Think about what iview means to you, what data might be shared, and how it might be used. And then decide if you really love Bluey all that much after all.</p>
<p class="fine-print"><em><span>Michael Cowling does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The ABC’s decision to force viewers to create accounts to watch shows online raises concerns over privacy.Michael Cowling, Associate Professor – Information & Communication Technology (ICT), CQUniversity AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1707112021-10-27T04:22:07Z2021-10-27T04:22:07ZA new proposed privacy code promises tough rules and $10 million penalties for tech giants<figure><img src="https://images.theconversation.com/files/428675/original/file-20211027-21-chefvu.jpeg?ixlib=rb-1.1.0&rect=5%2C2%2C1991%2C1353&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>This week the federal government <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/">announced</a> proposed legislation to develop an online privacy code (or “OP Code”) setting tougher privacy standards for Facebook, Google, Amazon and many other online platforms. </p>
<p>These companies collect and use vast amounts of consumers’ personal data, much of it without their knowledge or real consent, and the code is intended to guard against privacy harms from these practices.</p>
<p>The higher standards would be backed by increased penalties for interference with privacy under the Privacy Act and greater enforcement powers for the federal privacy commissioner. Serious or repeated breaches of the code could carry penalties of up to A$10 million or 10% of turnover for companies.</p>
<p>However, relevant companies are likely to try to avoid obligations under the OP Code by drawing out the process for drafting and registering the code. They are also likely to try to exclude themselves from the code’s coverage, and argue about the definition of “personal information”.</p>
<p>The current definition of “personal information” under the Privacy Act does not clearly include technical data such as IP addresses and device identifiers. Updating this will be important to ensure the OP Code is effective.</p>
<h2>Which organisations would be covered and why?</h2>
<p>The code is intended to address some clear online privacy dangers, while we await broader changes from the <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/">current broader review of the Privacy Act</a> that would apply across all sectors.</p>
<p>The OP Code would target online platforms that “collect a high volume of personal information or trade in personal information”, including:</p>
<ul>
<li><p>social media networks such as Facebook; dating apps like Bumble; online blogging or forum sites like Reddit; gaming platforms; online messaging and videoconferencing services such as WhatsApp and Zoom</p></li>
<li><p><a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">data brokers</a> that trade in personal information, including Quantium, Acxiom, Experian and Nielsen Corporation</p></li>
<li><p>other large online platforms that collect personal information and have more than 2.5 million annual users in Australia, such as Amazon, Google and Apple.</p></li>
</ul>
<p>The OP Code would impose higher standards for these companies than otherwise apply under the Privacy Act.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<h2>Higher standards for consent – maybe</h2>
<p>The OP Code would set out details about how these organisations must meet obligations under the Privacy Act. This would include higher standards for what constitutes users’ “consent” for how their data are used.</p>
<p>The government’s <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-explanatory-paper.pdf">explanatory paper</a> says the OP Code would require consent to be “voluntary, informed, unambiguous, specific and current”. (Unfortunately, the draft legislation itself doesn’t actually say that, and will require some amendment to achieve this.)</p>
<p>This description draws on the definition of consent in the European Union’s <a href="https://gdpr.eu/what-is-gdpr/">General Data Protection Regulation</a>.</p>
<p>In the EU, for example, <a href="https://gdpr-info.eu/issues/consent/">“unambiguous” consent</a> means a person must take clear, affirmative action – for instance by ticking a box or clicking a button – to consent to a use of their information. </p>
<p>Consent must also be “specific”, so companies cannot, for example, require consumers to consent to unrelated uses (such as market research) when their data is only needed to process a specific purchase.</p>
<h2>Requests to stop using and disclosing personal information</h2>
<p>The ACCC recommended we should have a right to erase our personal data as a means of reducing the power imbalance between consumers and large platforms. In the EU, the “right to be forgotten” by search engines and the like is part of this erasure right. The government has not adopted this recommendation.</p>
<p>However, the OP Code would include an obligation for organisations to comply with a consumer’s reasonable request to stop using and disclosing their personal data. Companies would be allowed to charge a “non-excessive” fee for fulfilling these requests. This is a very weak version of the EU right to be forgotten.</p>
<p>For example, Amazon currently states in its <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=GX7NJQ4ZB8MHFRNJ#GUID-C3396B35-7018-45C5-999A-5989043DA870__SECTION_C877F3A6113249BF905B04840EFB3496">privacy policy</a> that it uses customers’ personal data in its advertising business and discloses the data to its vast Amazon.com corporate group. The proposed OP Code would mean Amazon would have to stop this, at a customer’s request, unless it had reasonable grounds for refusing.</p>
<p>Ideally, the code should also allow consumers to ask a company to stop <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">collecting their personal information from third parties</a>, as companies currently do to build profiles on consumers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-one-simple-rule-change-could-curb-online-retailers-snooping-on-you-166174">How one simple rule change could curb online retailers' snooping on you</a>
</strong>
</em>
</p>
<hr>
<h2>Increased protections for children and vulnerable groups</h2>
<p>The draft bill also includes a vague provision for the OP Code to add protections for kids and other vulnerable people who are not capable of making their own privacy decisions.</p>
<p>A more controversial proposal would require new consents and verification for kids using social media services such as Facebook and WhatsApp. These services would be required to:</p>
<ul>
<li><p>take reasonable steps to verify the age of social media users</p></li>
<li><p>obtain parental consent before collecting, using or disclosing personal information of a child under 16</p></li>
<li><p>ensure its data practices are “fair and reasonable in the circumstances”, with the best interests of the child as the primary consideration.</p></li>
</ul>
<h2>What is ‘personal information’?</h2>
<p>A key tactic companies will likely use to avoid the new rules is to claim that the information they use is not truly “personal”, since the OP Code and the Privacy Act only apply to “personal information”, as defined in the Act. </p>
<p>The companies may claim the data they collect is only connected to our individual device or to an online identifier they’ve allocated to us, rather than our legal name. However, the effect is the same. The data is used to build a more detailed profile on an individual and to have effects on that individual.</p>
<p>Australia needs to update the definition of “personal information” to clarify it includes data such as IP addresses, device identifiers, location data, and any other online identifiers that may be used to identify an individual or to interact with them on an individual basis. Data should only be de-identified if no individual is identifiable from that data. </p>
<h2>Increased penalties and upgraded enforcement</h2>
<p>The government has pledged to give tougher powers to the privacy commissioner, and to hit companies with tougher penalties for breaching their obligations once the code comes into effect.</p>
<p>The maximum civil penalty for a serious and/or repeated interference with privacy will be increased to match the equivalent penalties in the Australian Consumer Law. </p>
<p>For individuals, the maximum penalty will increase to more than A$500,000. For corporations, the maximum will be the greater of A$10 million, or three times the value of the benefit received from the breach, or (if this value cannot be determined) 10% of the company’s annual turnover.</p>
<p>The privacy commissioner could also issue infringement notices for failing to provide relevant information to an investigation. The maximum penalty will be A$2,644 for individuals or A$13,320 for companies.</p>
<p>Such civil penalty provisions will make it unnecessary for the Commissioner to resort to prosecution of a criminal offence, or to civil litigation, in these cases. </p>
<h2>Don’t hold your breath</h2>
<p>Once legislation is passed, it will take around 12 months for the code to be developed and registered.</p>
<p>The tech giants will have plenty of opportunity to delay this process. Companies are likely to challenge the content of the code, and whether they should even be covered by it at all.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p><p class="fine-print"><em><span>Graham Greenleaf is a board member of the NGO, the Australian Privacy Foundation.</span></em></p>A proposed online privacy code would give consumers more control over how tech companies collect and use their dataKatharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyGraham Greenleaf, Professor of Law and Information Systems, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1661742021-08-16T19:53:06Z2021-08-16T19:53:06ZHow one simple rule change could curb online retailers’ snooping on you<figure><img src="https://images.theconversation.com/files/416255/original/file-20210816-21-6808b0.jpg?ixlib=rb-1.1.0&rect=46%2C0%2C5184%2C3453&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Rupixen.com/Unsplash</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>I spent last week studying the 26,000 words of privacy terms published by eBay and Amazon, trying to extract some straight answers, and comparing them to the privacy terms of other online marketplaces such as Kogan and Catch (my full summary is <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">here</a>).</p>
<p>There’s bad news and good news.</p>
<p>The bad news is that none of the privacy terms analysed are good. Based on their published policies, there is no major online marketplace operating in Australia that sets a commendable standard for respecting consumers’ data privacy.</p>
<p>All the policies contain vague, confusing terms and give consumers no real choice about how their data are collected, used and disclosed when they shop on these websites. Online retailers that operate in both Australia and the European Union give their customers in the EU better privacy terms and defaults than us, because the EU has stronger privacy laws.</p>
<p>The Australian Competition and Consumer Commission (ACCC) is currently collecting submissions as part of an inquiry into online marketplaces in Australia. You can have your say <a href="https://consultation.accc.gov.au/mergers-and-adjudication/consumer-questionnaire-general-online-retail/">here</a> by August 19.</p>
<p>The good news is that, as a first step, there is a clear and simple “anti-snooping” rule we could introduce to cut out one unfair and unnecessary, but very common, data practice.</p>
<p>Deep in the fine print of the privacy terms of all the above-named websites, you’ll find an unsettling term.</p>
<p>It says these retailers can obtain extra data about you from other companies, for example, <a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">data brokers</a>, advertising companies, or suppliers from whom you have previously purchased.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<p>eBay, for example, can take the data about you from a data broker and combine it with the data eBay already has about you, to form a detailed profile of your interests, purchases, behaviour and characteristics.</p>
<p>The problem is the online marketplaces give you no choice in this. There’s no privacy setting that lets you opt out of this data collection, and you can’t escape by switching to another major marketplace, because they all do it.</p>
<p>An online bookseller doesn’t need to collect data about your fast-food preferences to sell you a book. It wants these extra data for its own advertising and business purposes.</p>
<figure class="align-center ">
<img alt="Empty Amazon packaging" src="https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=664&fit=crop&dpr=1 600w, https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=664&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=664&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=834&fit=crop&dpr=1 754w, https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=834&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/416259/original/file-20210816-13-1vkjfla.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=834&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Online shopping leaves a digital paper trail as well as empty boxes.</span>
<span class="attribution"><span class="source">STRF/STAR MAX/IPx/AP</span></span>
</figcaption>
</figure>
<p>You might well be comfortable giving retailers information about yourself, so as to receive targeted ads and aid the retailer’s other business purposes. But this preference should not be assumed. If you want retailers to collect data about you from third parties, it should be done only on your explicit instructions, rather than automatically for everyone.</p>
<p>The “bundling” of these uses of a consumer’s data is <a href="https://www.oaic.gov.au/updates/news-and-media/flight-centre-found-to-have-interfered-with-privacy/">potentially unlawful</a> even under our existing privacy laws, but this needs to be made clear. </p>
<h2>Time for an ‘anti-snooping’ rule</h2>
<p>Here’s my suggestion, which forms the basis of my own submission to the ACCC inquiry. </p>
<p>Online retailers should be barred from collecting data about a consumer from another company, unless the consumer has clearly and actively requested this.</p>
<p>For example, this could involve clicking on a check-box next to a plainly worded instruction such as:</p>
<blockquote>
<p>Please obtain information about my interests, needs, behaviours and/or characteristics from the following data brokers, advertising companies and/or other suppliers.</p>
</blockquote>
<p>The third parties should be specifically named. And the default setting should be that third-party data are not collected without the customer’s express request.</p>
<p>This rule would be consistent with what we know from <a href="https://cprc.org.au/app/uploads/2018/07/Consumer-Data-and-the-Digital-Economy_smallest-file-size.pdf">consumer surveys</a>: most Australian consumers are not comfortable with companies unnecessarily sharing their personal information.</p>
<p>There could be reasonable exceptions to this rule, such as for fraud detection, address verification or credit checks. But data obtained for these purposes should not be used for marketing, advertising or generalised “market research”.</p>
<h2>Can’t we already opt out of targeted ads?</h2>
<p>Online marketplaces do claim to allow choices about “personalised advertising” or marketing communications. Unfortunately, these are worth little in terms of privacy protection.</p>
<p>Amazon says you can opt out of seeing targeted advertising. It does not say you can opt out of all data collection for advertising and marketing purposes.</p>
<p>Similarly, eBay lets you opt out of being shown targeted ads. But the later passages of its <a href="https://www.ebay.com.au/help/policies/p-behaviour-policies/ebay-cookie-notice?id=4267&mkevt=1&mkcid=1&mkrid=705-53470-19255-0&campid=5338596835&customid=&toolid=10001">Cookie Notice</a> state:</p>
<blockquote>
<p>your data may still be collected as described in our User Privacy Notice.</p>
</blockquote>
<p>This gives eBay the right to continue to collect data about you from data brokers, and to share them with a range of third parties.</p>
<p>Many retailers and large digital platforms operating in Australia justify their collection of consumer data from third parties on the basis you’ve already given your implied consent to the third parties disclosing it.</p>
<p>That is, there’s some obscure term buried in the thousands of words of privacy policies that supposedly apply to you, which says that <a href="https://www.bunnings.com.au/policies/privacy-policy">Bunnings</a>, for instance, can share data about you with various “related companies”.</p>
<p>Of course, Bunnings didn’t highlight this term, let alone give you a choice in the matter, when you ordered your hedge cutter last year. It only included a “Policies” link at the foot of its website; the term was on another web page, buried in the detail of its Privacy Policy.</p>
<p>Such terms should ideally be eradicated entirely. But in the meantime, we can turn the tap off on this unfair flow of data, by stipulating that online retailers cannot obtain such data about you from a third party without your express, active and unequivocal request.</p>
<h2>Who should be bound by an ‘anti-snooping’ rule?</h2>
<p>While the focus of this article is on online marketplaces covered by the ACCC inquiry, many other companies have similar third-party data collection terms, including <a href="https://www.woolworths.com.au/shop/discover/about-us/privacy-policy">Woolworths</a>, <a href="https://www.coles.com.au/privacy#coles-group">Coles</a>, major banks, and digital platforms such as Google and Facebook.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/heres-how-tech-giants-profit-from-invading-our-privacy-and-how-we-can-start-taking-it-back-120078">Here's how tech giants profit from invading our privacy, and how we can start taking it back</a>
</strong>
</em>
</p>
<hr>
<p>While some argue users of “free” services like Google and Facebook should expect some surveillance as part of the deal, this should not extend to asking other companies about you without your active consent.</p>
<p>The anti-snooping rule should clearly apply to any website selling a product or service.</p>
<p>With lockdowns barring many of us from visiting physical shops, we should be able to make purchases online without being unwittingly roped into a company’s advertising side hustle.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p>There is no major online marketplace operating in Australia that sets a commendable standard for respecting consumers’ data privacy. Letting customers opt out of data tracking would be a good start.Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1621722021-06-20T20:18:58Z2021-06-20T20:18:58ZIs your phone really listening to your conversations? Well, turns out it doesn’t have to<figure><img src="https://images.theconversation.com/files/407172/original/file-20210618-27-os1quw.jpeg?ixlib=rb-1.1.0&rect=209%2C7%2C4782%2C2986&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Have you ever chatted with a friend about buying a certain item and been targeted with an ad for that same item the next day? If so, you may have wondered whether your smartphone was “listening” to you. </p>
<p>But is it really? Well, it’s no coincidence the item you’d been interested in was the same one you were targeted with. </p>
<p>But that doesn’t mean your device is actually listening to your conversations — it doesn’t need to. There’s a good chance you’re already giving it all the information it needs. </p>
<h2>Can phones hear?</h2>
<p>Most of us regularly <a href="https://www.emeraldgrouppublishing.com/archived/learning/management_thinking/articles/cookies.htm">disclose our</a> information to a wide range of websites and apps. We do this when we grant them certain permissions, or allow “cookies” to track our online activities.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/94-of-australians-do-not-read-all-privacy-policies-that-apply-to-them-and-thats-rational-behaviour-96353">94% of Australians do not read all privacy policies that apply to them – and that’s rational behaviour</a>
</strong>
</em>
</p>
<hr>
<p>So-called “first-party cookies” allow websites to “remember” certain details about our interaction with the site. For instance, login cookies let you save your login details so you don’t have to re-enter them each time.</p>
<p>Third-party cookies, however, are created by domains that are external to the site you’re visiting. The third party will often be a marketing company in a partnership with the first-party website or app. </p>
<p>The latter will host the marketer’s ads and grant it access to data it collects from you (which you will have given it permission to do — perhaps by clicking on some innocuous-looking popup).</p>
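<p>The distinction between the two kinds of cookie comes down to which domain sets them. A minimal sketch using Python’s standard <code>http.cookies</code> module illustrates this; the domains and identifiers here are invented for illustration, not taken from any real site:</p>

```python
from http.cookies import SimpleCookie

# First-party cookie: set by the site you are actually visiting,
# e.g. to remember your login session between visits.
first_party = SimpleCookie()
first_party["session_id"] = "abc123"
first_party["session_id"]["domain"] = "shop.example"  # hypothetical retailer
first_party["session_id"]["httponly"] = True

# Third-party cookie: set by an ad network's domain embedded in the page,
# so it can recognise the same browser across many unrelated sites.
third_party = SimpleCookie()
third_party["tracker_id"] = "user-9042"
third_party["tracker_id"]["domain"] = "ads.example-marketer.com"  # hypothetical ad network
third_party["tracker_id"]["samesite"] = "None"  # allows sending on cross-site requests

print(first_party["session_id"]["domain"])   # shop.example
print(third_party["tracker_id"]["domain"])   # ads.example-marketer.com
```

<p>Because the third-party cookie’s domain stays the same no matter which site embeds the ad, the marketer can link your visits across all of them — which is exactly the profile-building described above.</p>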
<p>As such, the advertiser can build a picture of your life: your routines, wants and needs. These companies constantly seek to gauge the popularity of their products and how this varies based on factors such as a customer’s age, gender, height, weight, job and hobbies. </p>
<p>By classifying and clustering this information, advertisers improve their recommendation algorithms, using something called <a href="https://link.springer.com/article/10.1007/s40747-020-00212-w">recommender systems</a> <a href="https://arxiv.org/pdf/2009.06861.pdf">to target</a> the right customers with the right ads.</p>
<h2>Computers work behind the scenes</h2>
<p>There are several machine-learning techniques in artificial intelligence (AI) that help systems filter and analyse your data, such as data clustering, classification, association and <a href="https://bdtechtalks.com/2019/05/28/what-is-reinforcement-learning/">reinforcement learning</a> (RL). </p>
<p>An RL agent can <a href="https://bdtechtalks.com/2021/02/22/reinforcement-learning-ad-optimization/">train itself</a> based on feedback gained from user interactions, akin to how a young child will learn to repeat an action if it leads to a reward.</p>
<p>By viewing or pressing “like” on a social media post, you send a reward signal to an RL agent confirming you’re attracted to the post — or perhaps interested in the person who posted it. Either way, a message is sent to the RL agent about your personal interests and preferences.</p>
<p>If you start actively liking posts about “mindfulness” on a social platform, its system will learn to send you advertisements for companies that can offer related products and content. </p>
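<p>As a rough illustration of the reward loop described above — a sketch only, not any platform’s actual system, with all topic names and numbers invented — each “like” can be treated as a reward of 1 for a topic, and an incremental average keeps a running estimate of interest:</p>

```python
from collections import defaultdict

scores = defaultdict(float)  # estimated interest per topic
counts = defaultdict(int)    # how many signals seen per topic

def record_signal(topic, reward):
    """Incremental-average update: reward 1 = liked the post, 0 = ignored it."""
    counts[topic] += 1
    scores[topic] += (reward - scores[topic]) / counts[topic]

# The user likes several mindfulness posts and ignores one about cars.
for _ in range(3):
    record_signal("mindfulness", 1)
record_signal("cars", 0)

# The topic with the highest estimated interest is what ads would follow.
best_topic = max(scores, key=scores.get)
print(best_topic)  # mindfulness
```

<p>Real recommender systems weigh far more signals than this, but the principle is the same: your interactions are the training data.</p>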
<p>Ad recommendations may be based on other data, too, including but not limited to:</p>
<ul>
<li><p>other ads you clicked on through the platform</p></li>
<li><p>personal details you provided the platform (such as your age, email address, gender, location and which devices you access the platform on)</p></li>
<li><p>information shared with the platform by other advertisers or marketing partners that already have you as a customer</p></li>
<li><p>specific pages or groups you have joined or “liked” on the platform.</p></li>
</ul>
<p>In fact, AI algorithms can help marketers take huge pools of data and use them to construct your entire social network, ranking people around you based on how much you “care about” (interact with) them. </p>
<p>They can then start to target you with ads based on not only your own data, but on data collected from your friends and family members using the same platforms as you. </p>
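<p>The ranking step can be sketched crudely as a weighted sum over interaction counts — an invented example, with made-up names, weights and numbers, not any platform’s real formula:</p>

```python
# Hypothetical interaction counts between you and three contacts.
interactions = {
    "alice": {"likes": 14, "comments": 3, "messages": 22},
    "bob":   {"likes": 2,  "comments": 0, "messages": 1},
    "carol": {"likes": 8,  "comments": 6, "messages": 5},
}

def affinity(events):
    # Weight deeper interactions (messages, comments) above passive likes.
    return events["likes"] + 3 * events["comments"] + 5 * events["messages"]

# Contacts ranked by how much you "care about" (interact with) them.
ranked = sorted(interactions, key=lambda name: affinity(interactions[name]),
                reverse=True)
print(ranked)  # ['alice', 'carol', 'bob']
```

<p>Once contacts are ranked this way, signals from the people at the top of your list can be used to target you, too.</p>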
<p>For example, Facebook might be able to recommend to you something your friend recently bought. It didn’t need to “listen” to a conversation between you and your friend to do this.</p>
<h2>Exercising your right to privacy is a choice</h2>
<p>While app providers are <em>supposed</em> to provide clear terms and conditions to users about how they collect, store and use data, nowadays it’s on users to be careful about which permissions they give to the apps and sites they use. </p>
<p>When in doubt, give permissions on an as-needed basis. It makes sense to give WhatsApp access to your camera and microphone, as it can’t provide some of its services without this. But not all apps and services will ask for only what is necessary. </p>
<p>Perhaps you don’t mind receiving targeted ads based on your data, and may find it appealing. <a href="https://hbr.org/2020/10/when-do-we-trust-ais-recommendations-more-than-peoples">Research</a> has shown people with a more “utilitarian” (or practical) worldview actually prefer recommendations from AI to those from humans. </p>
<p>That said, it’s possible AI recommendations can constrain people’s choices and <a href="https://theconversation.com/ai-is-killing-choice-and-chance-which-means-changing-what-it-means-to-be-human-151826">minimise serendipity</a> in the long term. By presenting consumers with algorithmically curated choices of what to watch, read and stream, companies may be implicitly keeping our tastes and lifestyle within a narrower frame.</p>
<h2>Don’t want to be predicted? Don’t be predictable</h2>
<p>There are some simple tips you can follow to limit the amount of data you share online. First, you should review your phone’s app permissions regularly. </p>
<p>Also, think twice before an app or website asks you for certain permissions, or to allow cookies. Wherever possible, avoid using your social media accounts to connect or log in to other sites and services. In most cases there will be an option to sign up via email, which could even be a <a href="https://helpdeskgeek.com/free-tools-review/5-best-free-disposable-email-accounts/">burner email</a>.</p>
<p>Once you do start the sign-in process, remember you only have to share as much information as is needed. And if you’re sensitive about privacy, perhaps consider installing a virtual private network (VPN) on your device. This will mask your IP address and encrypt your online activities.</p>
<h2>Try it yourself</h2>
<p>If you still think your phone is listening to you, there’s a simple experiment you can try.</p>
<p>Go to your phone’s settings and restrict access to your microphone for all your apps. Pick a product you know you haven’t searched for on any of your devices and talk about it out loud at some length with another person. </p>
<p>Make sure you repeat this process a few times. If you still don’t get any targeted ads within the next few days, this suggests your phone isn’t really “listening” to you. </p>
<p>It has other ways of finding out what’s on your mind.</p>
<p class="fine-print"><em><span>Dana Rezazadegan is affiliated with Swinburne University of Technology. She is Superstar of STEM at Science and Technology Australia and Honorary fellow at Macquarie University.</span></em></p>Have you ever been targeted with ads that are scarily specific to you, and wondered how the app or website could have known?Dana Rezazadegan, Lecturer, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1609752021-05-18T03:17:20Z2021-05-18T03:17:20ZACIC thinks there are no legitimate uses of encryption. They’re wrong, and here’s why it matters<figure><img src="https://images.theconversation.com/files/400954/original/file-20210517-23-8relag.jpg?ixlib=rb-1.1.0&rect=5%2C5%2C3988%2C2946&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> </figcaption></figure><p>Australia’s parliament is considering legislation to give <a href="https://www.homeaffairs.gov.au/about-us/our-portfolios/national-security/lawful-access-telecommunications/surveillance-legislation-amendment-identify-and-disrupt-bill-2020">new powers</a> to the Australian Criminal Intelligence Commission (ACIC) and the Australian Federal Police. These powers will allow them to modify online data, monitor network activity, and take over online accounts in some circumstances. </p>
<p>Last week, in a <a href="https://www.aph.gov.au/DocumentStore.ashx?id=0cfd0e34-ae76-42e4-9438-d8218c70b760&subId=706935">submission</a> to parliament regarding the proposed powers, ACIC made an inaccurate and concerning claim about privacy and information security. ACIC claimed “there is no legitimate reason for a law-abiding member of the community to own or use an encrypted communication platform”.</p>
<p>Encrypted communication platforms, including WhatsApp, Signal, Facetime and iMessage, are in common use, allowing users to send messages that can only be read by the intended recipients. There are many legitimate reasons law-abiding people may use them. And surveillance systems, no matter how well-intentioned, may have negative effects and be used for different purposes or by different people than those they were designed for. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/when-is-not-a-backdoor-just-a-backdoor-australias-struggle-with-encryption-79421">When is 'not a backdoor' just a backdoor? Australia's struggle with encryption</a>
</strong>
</em>
</p>
<hr>
<h2>How surveillance can go wrong</h2>
<p>Surveillance systems often produce unintended effects.</p>
<p>In 1849, the authorities at Tasmania’s Port Arthur penal colony built the <a href="https://portarthur.org.au/separate-prison/">Separate Prison</a>, intended as a humane and enlightened method of imprisonment. Based on the ideas of Jeremy Bentham’s <a href="https://en.wikipedia.org/wiki/Panopticon">Panopticon</a>, the design emphasised constant surveillance and psychological control rather than corporal punishment. However, many inmates suffered serious psychological problems resulting from the lack of normal communication with others. </p>
<p>From 2006 onwards, Facebook developed a privacy-invading apparatus intended to facilitate making money through targeted advertising. Facebook’s system has since been abused by <a href="https://en.wikipedia.org/wiki/Facebook%E2%80%93Cambridge_Analytica_data_scandal">Cambridge Analytica</a> and others for <a href="https://www.theguardian.com/technology/2021/apr/12/facebook-loophole-state-backed-manipulation">political manipulation</a>, with disastrous consequences for some democracies.</p>
<p>In 2018, Australia’s parliament passed the <a href="https://www.legislation.gov.au/Details/C2018A00148">Telecommunications and Other Legislation Amendment (Assistance and Access) Act</a>, with the ostensible purpose of helping police to catch terrorists, paedophiles and other serious criminals. The act gave the Australian Federal Police powers to “add, copy, delete or alter” material on computers. These powers were used the following year to <a href="https://www.abc.net.au/news/2019-06-05/abc-raided-by-australian-federal-police-afghan-files-stories/11181162">raid the Australian Broadcasting Corporation</a> in connection with a story on alleged war crimes in Afghanistan. </p>
<p>These examples demonstrate two facts about security and surveillance. First, surveillance may be used by people of any moral character. Second, a surveillance mechanism may be used by different people, or may achieve a completely different effect, from its original design.</p>
<p>We therefore need to consider what avoiding, undermining or even outlawing the use of encrypted platforms would mean for law-abiding members of the community.</p>
<h2>Encryption limits the power of security agencies</h2>
<p>There are already laws that decide who is allowed to listen to communications taking place over a telecommunications network. While such communications are generally protected, law enforcement and national security agencies can be authorised to intercept them. </p>
<p>However, where communications are encrypted, agencies will not automatically be able to retrieve the content of the conversations they intercept. The <a href="https://www.legislation.gov.au/Details/C2018A00148">Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018</a> was passed to enable agencies to get assistance to try to maintain their ability to get access to the (unencrypted) content of communications. For example, they can ask that one or more forms of electronic protection be removed. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facebooks-push-for-end-to-end-encryption-is-good-news-for-user-privacy-as-well-as-terrorists-and-paedophiles-128782">Facebook's push for end-to-end encryption is good news for user privacy, as well as terrorists and paedophiles</a>
</strong>
</em>
</p>
<hr>
<p>There are also federal, state and territory laws that can require people to assist law enforcement and national security agencies in accessing (unencrypted) data. Numerous proposals would further clarify these laws, extend state powers, or even prevent the use of encryption in certain circumstances. </p>
<h2>More surveillance power is not always better</h2>
<p>While people may hold different views on particular proposals about state powers and encryption, there are some things on which we should all be able to agree. </p>
<p>First, facts matter. If the ACIC is wrong about lawful uses of encryption, its assertion should be withdrawn or discounted. </p>
<p>Second, people need both security and privacy. In fact, privacy can facilitate security (the more people know about you, the easier it is to trick you, track you and/or harm you). </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/you-may-be-sick-of-worrying-about-online-privacy-but-surveillance-apathy-is-also-a-problem-86474">You may be sick of worrying about online privacy, but 'surveillance apathy' is also a problem</a>
</strong>
</em>
</p>
<hr>
<p>Third, law enforcement and national security agencies need some surveillance powers to do their jobs. Most of the time, this contributes to the social good of public safety. </p>
<p>Fourth, more is not necessarily better when it comes to surveillance powers. We must ask what purpose the powers serve, whether they are reasonably necessary for achieving that purpose, whether they are likely to achieve the purpose, what negative consequences might result, and whether the powers are proportionate.</p>
<h2>Lawful use of encrypted communication is common</h2>
<p>We can only develop good policy in this area if we have the facts on lawful uses of encryption. </p>
<p>There are many good reasons for law-abiding citizens to use end-to-end encrypted communication platforms. Parents may send photos or videos of their children to trusted friends or relatives, but prefer not to share them with third parties. The explosion of telehealth during the COVID-19 pandemic has also made clear that many patients do not want their consultations with their doctors shared with an intermediary such as Facebook or Google (or Huawei or WeChat). </p>
<p>Even the New South Wales iVote online voting system — hardly a paragon of security, given that it <a href="https://www.itnews.com.au/news/nsw-electoral-commission-confirms-ivote-contains-critical-scytl-crypto-defect-520460">contained a defect that potentially allowed vote manipulation</a> — advertises the use of end-to-end encryption to protect the privacy of votes in transit. The necessity of privacy to protect a citizen’s right to vote without coercion is one of the oldest examples of a legal privacy requirement.</p>
<h2>Undermining encryption will hurt legitimate users</h2>
<p>As law-abiding citizens do have legitimate reasons to rely on end-to-end encryption, we should develop laws and policies around government surveillance accordingly. Any legislation that undermines information security across the board will have an impact on lawful users as well as criminals.</p>
<p>There will likely be significant disagreement in the community about where to go from there. But we have to get the facts right first. </p>
<p>We should not consider legislation to deliberately undermine the communications security of all individuals without acknowledging the potential harm this could cause to law-abiding citizens.</p><img src="https://counter.theconversation.com/content/160975/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Lyria Bennett Moses has previously done collaborative funded research with the ACIC, but it is unrelated to this piece.</span></em></p><p class="fine-print"><em><span>Gernot Heiser and Vanessa Teague do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>People have plenty of legitimate reasons to use encrypted communications platforms such as WhatsApp or Signal for their own security and privacy.Gernot Heiser, Scientia Professor and John Lions Chair, UNSW SydneyLyria Bennett Moses, Director of the Allens Hub for Technology, Law and Innovation, UNSW SydneyVanessa Teague, Adjunct associate professor (ANU) and CEO, Thinking Cybersecurity, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1599162021-04-28T06:49:55Z2021-04-28T06:49:55ZApple’s new ‘app tracking transparency’ has angered Facebook. How does it work, what’s all the fuss about, and should you use it?<figure><img src="https://images.theconversation.com/files/397507/original/file-20210428-13-k6hkhm.jpg?ixlib=rb-1.1.0&rect=19%2C0%2C2212%2C1473&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Amr Alfiky/AP</span></span></figcaption></figure><p>Apple users across the globe are adopting the <a href="https://techcrunch.com/2021/04/26/ios-14-5-goes-live-with-watch-unlocking-tracking-transparency-and-kissing-emojis/">latest operating system update</a>, called iOS 14.5, featuring the now-obligatory <a href="https://blog.emojipedia.org/first-look-217-new-emojis-in-ios-14-5/">new batch of emojis</a>. </p>
<p>But there’s another change that’s arguably less fun but much more significant for many users: the introduction of “app tracking transparency”. </p>
<p>This feature promises to usher in a new era of user-oriented privacy, and not everyone is happy — most notably Facebook, which relies on tracking web users’ browsing habits to sell targeted advertising. Some commentators have described it as the beginnings of a new <a href="https://www.cnet.com/news/facebook-vs-apple-heres-what-you-need-to-know-about-their-privacy-feud/">privacy feud</a> between the two tech behemoths.</p>
<h2>So, what is app tracking transparency?</h2>
<p>App tracking transparency is a continuation of Apple’s push to be recognised as the <a href="https://www.apple.com/newsroom/2021/01/data-privacy-day-at-apple-improving-transparency-and-empowering-users/">platform of privacy</a>. The new feature allows apps to display a pop-up notification that explains what data the app wants to collect, and what it proposes to do with it.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Ihw_Al4RNno?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Privacy | App Tracking Transparency | Apple.</span></figcaption>
</figure>
<p>There is nothing users need to do to gain access to the new feature, other than install the latest iOS update, which happens automatically on most devices. Once upgraded, apps that use tracking functions will <a href="https://www.forbes.com/sites/kateoflahertyuk/2021/04/24/ios-145-how-this-outstanding-new-feature-will-change-your-iphone-forever/">display a request to opt in or out</a> of this functionality.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="iPhone screenshot showing new App Tracking Transparency functionality" src="https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=840&fit=crop&dpr=1 600w, https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=840&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=840&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1056&fit=crop&dpr=1 754w, https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1056&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/397486/original/file-20210428-19-wfqoup.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1056&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A new App Tracking Transparency feature across iOS, iPadOS, and tvOS will require apps to get the user’s permission before tracking their data across apps or websites owned by other companies.</span>
<span class="attribution"><span class="source">Apple newsroom</span></span>
</figcaption>
</figure>
<h2>How does it work?</h2>
<p>As Apple has <a href="https://developer.apple.com/documentation/apptrackingtransparency">explained</a>, the app tracking transparency feature is a new “application programming interface”, or API — a suite of programming commands used by developers to interact with the operating system. </p>
<p>The API gives software developers a few pre-canned functions that allow them to do things like “request tracking authorisation” or use the tracking manager to “check the authorisation status” of individual apps. </p>
<p>In more straightforward terms, this gives app developers a uniform way of requesting these tracking permissions from the device user. It also means the operating system has a centralised location for storing and checking what permissions have been granted to which apps.</p>
<p>What is missing from the fine print is that there is no technical mechanism to prevent the tracking of a user. The app tracking transparency framework is merely a pop-up box.</p>
<p>It is also interesting to note the specific wording of the pop-up: “ask app not to track”. If the application is using legitimate “device advertising identifiers”, answering no will result in this <a href="https://developer.apple.com/app-store/user-privacy-and-data-use/">identifier being set to zero</a>. This will reduce the tracking capabilities of apps that honour Apple’s tracking policies.</p>
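<p>Apple documents that a denied tracking request yields an all-zero advertising identifier. The sketch below illustrates that behaviour in miniature (the function name and example identifier are invented for illustration; this is not Apple’s API, which is Swift’s ATTrackingManager):</p>

```python
ZERO_IDFA = "00000000-0000-0000-0000-000000000000"

def advertising_identifier(tracking_authorised: bool, real_idfa: str) -> str:
    # When the user answers "ask app not to track", apps that honour
    # Apple's policy receive the zeroed identifier instead of the real one,
    # so every opted-out user looks identical to the advertiser.
    return real_idfa if tracking_authorised else ZERO_IDFA

# Opted out: the app gets the same useless all-zero value as everyone else.
idfa = advertising_identifier(False, "6D92078A-8246-4BA4-AE5B-76104861E7DC")
assert idfa == ZERO_IDFA
```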
<p>However, if an app is really determined to track you, there are many techniques that could allow them to make surreptitious user-specific identifiers, which may be <a href="https://www.eff.org/wp/behind-the-one-way-mirror">difficult for Apple to detect or prevent</a>. </p>
<p>For example, while an app might not use Apple’s “device advertising identifier”, it would be easy for the app to generate a little bit of “random data”. This data could then be passed between sites under the guise of normal operations such as retrieving an image with the data embedded in the filename. While this would contravene Apple’s developer rules, detecting this type of secret data could be very difficult.</p>
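<p>The covert channel described above can be sketched concretely. This is a hypothetical illustration (the domain, the <code>img_</code> prefix and the token length are all invented); the point is that such a request is hard to distinguish from an ordinary asset fetch:</p>

```python
import secrets
from urllib.parse import urlparse

# The app mints its own random identifier, ignoring the zeroed
# advertising identifier entirely.
covert_id = secrets.token_hex(8)

# This looks like a normal image fetch, but the filename *is* the identifier.
image_url = f"https://example.com/assets/img_{covert_id}.png"

# The server trivially recovers the identifier from its request logs.
filename = urlparse(image_url).path.rsplit("/", 1)[-1]
recovered = filename[len("img_"):-len(".png")]
assert recovered == covert_id
```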
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/your-smartphone-apps-are-tracking-your-every-move-4-essential-reads-108586">Your smartphone apps are tracking your every move – 4 essential reads</a>
</strong>
</em>
</p>
<hr>
<p>Apple seems prepared to crack down hard on developers who don’t play by the rules. The most recent additions to Apple’s <a href="https://developer.apple.com/app-store/review/guidelines/">App Store guidelines</a> explicitly tell developers:</p>
<blockquote>
<p>You must receive explicit permission from users via the App Tracking Transparency APIs to track their activity.</p>
</blockquote>
<p>It’s unlikely major app developers will want to fall foul of this policy — a ban from the App Store would be costly. But it’s hard to imagine Apple sanctioning a really big player like Facebook or TikTok without some serious behind-the-scenes negotiation.</p>
<h2>Why is Facebook objecting?</h2>
<p>Facebook is fuelled by web users’ data. Inevitably, anything that gets in the way of its gargantuan revenue-generating network is seen as a threat. In 2020, Facebook’s revenue from advertising exceeded <a href="https://investor.fb.com/investor-news/press-release-details/2021/Facebook-Reports-Fourth-Quarter-and-Full-Year-2020-Results/default.aspx">US$84 billion</a> – a 21% rise on 2019.</p>
<p>The issues are deep-rooted and reflect the two tech giants’ very different business models. Apple’s business model is the sale of laptops, computers, phones and watches – with a significant proportion of its income derived from the vast ecosystem of apps and in-app purchases used on these devices. Apple’s app revenue was reported at <a href="https://www.cnbc.com/2021/01/08/apples-app-store-had-gross-sales-around-64-billion-in-2020.html">US$64 billion in 2020</a>.</p>
<p>With a vested interest in ensuring its customers are loyal and happy with its devices, Apple is well positioned to deliver privacy without harming profits.</p>
<h2>Should I use it?</h2>
<p>Ultimately, it is a choice for the consumer. Many apps and services are offered ostensibly for free to users. App developers often cover their costs through subscription models, in-app purchases or in-app advertising. If enough users decide to embrace privacy controls, developers will either change their funding model (perhaps moving to paid apps) or attempt to find other ways to track users to maintain advertising-derived revenue.</p>
<p>If you don’t want your data to be collected (and potentially sold to unnamed third parties), this feature offers one way to restrict the amount of your data that is trafficked in this way.</p>
<p>But it’s also important to note that tracking of users and devices is a valuable tool for advertising optimisation by building a comprehensive picture of each individual. This increases the relevance of each advert while also reducing advertising costs (by only targeting users who are likely to be interested). Users also arguably benefit, as they see more (relevant) adverts that are contextualised for their interests.</p>
<p>It may slow down the rate at which we receive personalised ads in apps and websites, but this change won’t be an end to intrusive digital advertising. In essence, this is the price we pay for “free” access to these services.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facebook-data-breach-what-happened-and-why-its-hard-to-know-if-your-data-was-leaked-158417">Facebook data breach: what happened and why it's hard to know if your data was leaked</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/159916/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Apple’s latest iPhone operating system lets you opt out of having your online habits tracked by the apps you use. That’s a big part of Facebook’s business model, but don’t expect a privacy revolution.Paul Haskell-Dowland, Associate Dean (Computing and Security), Edith Cowan UniversityNikolai Hampton, School of Science, Edith Cowan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1591382021-04-19T05:59:25Z2021-04-19T05:59:25ZACCC ‘world first’: Australia’s Federal Court found Google misled users about personal location data<figure><img src="https://images.theconversation.com/files/395647/original/file-20210419-17-19eklxj.jpg?ixlib=rb-1.1.0&rect=2%2C0%2C1914%2C1273&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Henry Perks / Unsplash</span></span></figcaption></figure><p>The Federal Court has <a href="https://www.accc.gov.au/media-release/google-misled-consumers-about-the-collection-and-use-of-location-data">found</a> Google misled some users about personal location data collected through Android devices for two years, from January 2017 to December 2018. </p>
<p>The Australian Competition & Consumer Commission (ACCC) says this decision is a “world first” in relation to Google’s location privacy settings. The ACCC now intends to seek various orders against Google. These will include monetary penalties under the Australian Consumer Law (ACL), which could be up to A$10 million or 10% of Google’s local turnover. </p>
<p>Other companies too should be warned that representations in their privacy policies and privacy settings could lead to similar liability under the ACL.</p>
<p>But this won’t be a complete solution to the problem of many companies <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3432769">concealing what they do with data</a>, including the way they share consumers’ personal information.</p>
<h2>How did Google mislead consumers about their location history?</h2>
<p>The Federal Court found Google’s previous location history settings would have led some reasonable consumers to believe they could prevent their location data being saved to their Google account. In fact, selecting “Don’t save my Location History in my Google Account” alone could not achieve this outcome.</p>
<p>Users needed to change an additional, separate setting to stop location data from being saved to their Google account. In particular, they needed to navigate to “Web & App Activity” and select “Don’t save my Web & App Activity to my Google Account”, even if they had already selected the “Don’t save” option under “Location History”. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-ugly-truth-tech-companies-are-tracking-and-misusing-our-data-and-theres-little-we-can-do-127444">The ugly truth: tech companies are tracking and misusing our data, and there's little we can do</a>
</strong>
</em>
</p>
<hr>
<p>ACCC Chair Rod Sims responded to the Federal Court’s findings, saying:</p>
<blockquote>
<p>This is an important victory for consumers, especially anyone concerned about their privacy online, as the Court’s decision sends a strong message to Google and others that big businesses must not mislead their customers.</p>
</blockquote>
<p>Google has since changed the way these settings are presented to consumers, but is still liable for the conduct the court found was likely to mislead some reasonable consumers for two years in 2017 and 2018. </p>
<h2>ACCC has misleading privacy policies in its sights</h2>
<p>This is the second recent case in which the ACCC has succeeded in establishing misleading conduct in a company’s representations about its use of consumer data. </p>
<p>In 2020, the medical appointment booking app HealthEngine admitted it had disclosed more than 135,000 patients’ non-clinical personal information to insurance brokers without the informed consent of those patients. HealthEngine <a href="https://www.accc.gov.au/media-release/healthengine-to-pay-29-million-for-misleading-reviews-and-patient-referrals">paid fines of A$2.9 million</a>, including approximately <a href="http://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/FCA/2020/1203.html?context=1;query=%22HealthEngine%20Pty%20Ltd%22;mask_path=">A$1.4 million</a> relating to this misleading conduct.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-safe-are-your-data-when-you-book-a-covid-vaccine-157869">How safe are your data when you book a COVID vaccine?</a>
</strong>
</em>
</p>
<hr>
<p>The ACCC has two similar cases in the wings, including another <a href="https://www.accc.gov.au/media-release/correction-accc-alleges-google-misled-consumers-about-expanded-use-of-personal-data">case</a> regarding Google’s privacy-related notifications and a case about Facebook’s representations about <a href="https://www.accc.gov.au/media-release/accc-alleges-facebook-misled-consumers-when-promoting-app-to-protect-users-data">a supposedly privacy-enhancing app called Onavo</a>. </p>
<p>In bringing proceedings against companies for misleading conduct in their privacy policies, the ACCC is following the <a href="https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/privacy-security-enforcement">US Federal Trade Commission</a> which has sued many US companies for misleading privacy policies. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/395651/original/file-20210419-17-1hsvk2m.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The ACCC has more cases in the wings about data privacy.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Will this solve the problem of confusing and unfair privacy policies?</h2>
<p>The ACCC’s success against Google and HealthEngine in these cases sends an important message to companies: they must not mislead consumers when they publish privacy policies and privacy settings. And they may receive significant fines if they do. </p>
<p>However, this will not be enough to stop companies from setting privacy-degrading terms for their users, if they spell such conditions out in the fine print. Such terms are currently commonplace, even though consumers are <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">increasingly concerned</a> about their privacy and want more privacy options. </p>
<p>Consider the US experience. The US Federal Trade Commission brought action against the <a href="https://www.ftc.gov/news-events/press-releases/2014/04/ftc-approves-final-order-settling-charges-against-flashlight-app">creators of a flashlight app</a> for publishing a privacy policy which didn’t reveal the app was tracking and sharing users’ location information with third parties. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/we-need-a-code-to-protect-our-online-privacy-and-wipe-out-dark-patterns-in-digital-design-145622">We need a code to protect our online privacy and wipe out 'dark patterns' in digital design</a>
</strong>
</em>
</p>
<hr>
<p>However, in the agreement settling this claim, the <a href="https://www.washingtonpost.com/news/the-switch/wp/2013/12/09/heres-why-the-ftc-couldnt-fine-a-flashlight-app-for-allegedly-sharing-user-location-data/">solution</a> was for the creators to rewrite the privacy policy to disclose that users’ location and device ID data are shared with third parties. The question of whether this practice was legitimate or proportionate was not considered. </p>
<p>Major changes to Australian privacy laws will also be required before companies will be prevented from pervasively tracking consumers who do not wish to be tracked. The <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">current review of the federal Privacy Act</a> could be the beginning of a process to obtain fairer privacy practices for consumers, but any reforms from this review will be a long time coming. </p>
<hr>
<p><em>This is an edited version of an <a href="https://newsroom.unsw.edu.au/news/business-law/world-first-federal-court-rules-google-has-misled-users-personal-location-data">article</a> that originally appeared on <a href="https://newsroom.unsw.edu.au/">UNSW Newsroom</a>.</em></p><img src="https://counter.theconversation.com/content/159138/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p>Companies are allowed to track users as much as they like — as long as they spell it out in the fine print. But a ground-breaking Australian legal judgement should give them pause.Katharine Kemp, Senior Lecturer, Faculty of Law, UNSW, and Academic Lead, UNSW Grand Challenge on Trust, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1592062021-04-19T05:19:38Z2021-04-19T05:19:38ZPrivacy erosion by design: why the Federal Court should throw the book at Google over location data tracking<figure><img src="https://images.theconversation.com/files/395637/original/file-20210419-13-1glet3p.jpg?ixlib=rb-1.1.0&rect=8%2C0%2C5982%2C3988&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The Australian Competition and Consumer Commission has had a significant <a href="https://www.accc.gov.au/media-release/google-misled-consumers-about-the-collection-and-use-of-location-data.">win</a> against Google. The Federal Court found Google misled some Android users about how to disable <a href="https://www.judgments.fedcourt.gov.au/judgments/Judgments/fca/single/2021/2021fca0367">personal location tracking</a>.</p>
<p>Will this decision actually change the behaviour of the big tech companies? The answer will depend on the size of the penalty awarded in response to the misconduct. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/accc-world-first-australias-federal-court-found-google-misled-users-about-personal-location-data-159138">ACCC 'world first': Australia's Federal Court found Google misled users about personal location data</a>
</strong>
</em>
</p>
<hr>
<p>In theory, the penalty is <a href="https://www.legislation.gov.au/Details/C2021C00151/Html/Volume_3#_Toc67406308">A$1.1 million per contravention</a>. There is a contravention each time a reasonable person in the relevant class is misled. So the total award could, in theory, amount to many millions of dollars. </p>
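<p>To see how the theoretical total scales, consider a few hypothetical contravention counts (purely for illustration; the actual number of contraventions, and the penalty per contravention applied, are for the court to determine):</p>

```python
PENALTY_PER_CONTRAVENTION = 1_100_000  # A$1.1 million theoretical maximum

# Each misled "reasonable person" is a separate contravention, so the
# theoretical maximum grows linearly with the number of people misled.
for contraventions in (10, 100, 1_000):
    total = PENALTY_PER_CONTRAVENTION * contraventions
    print(f"{contraventions:>5} contraventions -> up to A${total:,}")
```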
<p>But the actual penalty will depend on how the court characterises the misconduct. We believe Google’s behaviour should not be treated as a simple accident, and the Federal Court should issue a heavy fine to deter Google and other companies from behaving this way in future. </p>
<h2>Misleading conduct and privacy settings</h2>
<p>The case arose from the representations made by Google to users of Android phones in 2018 about how it obtained personal location data.</p>
<p>The Federal Court held Google had <a href="https://www.judgments.fedcourt.gov.au/judgments/Judgments/fca/single/2021/2021fca0367">misled</a> some consumers by representing that “having Web & App Activity turned ‘on’ would not allow Google to obtain, retain and use personal data about the user’s location”. </p>
<p>In other words, some consumers were misled into thinking they could control Google’s location data collection practices by switching “off” Location History, whereas Web & App Activity also needed to be disabled to provide this protection.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-accc-is-suing-google-for-misleading-millions-but-calling-it-out-is-easier-than-fixing-it-143447">The ACCC is suing Google for misleading millions. But calling it out is easier than fixing it</a>
</strong>
</em>
</p>
<hr>
<p>The ACCC also argued consumers reading Google’s privacy statement would be misled into thinking personal data was collected for their own benefit rather than Google’s. However, the court dismissed this argument on the grounds that reasonable users wanting to turn the Location History “off” </p>
<blockquote>
<p>would have assumed that Google was obtaining as much commercial advantage as it could from use of the user’s personal location data. </p>
</blockquote>
<p>This is surprising and might deserve further attention from regulators concerned to protect consumers from corporations “data harvesting” for profit.</p>
<h2>How much should Google pay?</h2>
<p>The penalty and other enforcement orders against Google will be made at a <a href="https://www.accc.gov.au/media-release/google-misled-consumers-about-the-collection-and-use-of-location-data">later date</a>. </p>
<p>The aim of the penalty is to deter Google specifically, and other firms like Google, from engaging in misleading conduct again. If penalties are too low they may be treated by wrongdoing firms as merely a “<a href="https://www.accc.gov.au/media-release/full-federal-court-orders-6-million-penalty-for-nurofen-specific-pain-products">cost of doing business</a>”. </p>
<p>However, in circumstances where there is a high degree of <a href="https://www.accc.gov.au/media-release/full-court-dismisses-volkswagen-125m-penalty-appeal">corporate culpability</a>, the Federal Court has shown willingness to award higher amounts than in the past. This has occurred even where the regulator has not sought higher penalties. In the recent <a href="https://www.judgments.fedcourt.gov.au/judgments/Judgments/fca/full/2021/2021fcafc0049">Volkswagen Aktiengesellschaft v ACCC</a> judgement, the full Federal Court confirmed an award of A$125 million against Volkswagen for making false representations about compliance with Australian diesel emissions standards.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=407&fit=crop&dpr=1 600w, https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=407&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=407&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=512&fit=crop&dpr=1 754w, https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=512&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/395639/original/file-20210419-21-1kki6os.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=512&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The Federal Court found Google’s information about location data tracking was misleading.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>In setting Google’s penalty, a court will consider factors such as the nature and extent of the misleading conduct and any loss to consumers. The court will also <a href="https://jade.io/j/?a=outline&id=509081">take into account</a> whether the wrongdoer was involved in “deliberate, covert or reckless conduct, as opposed to negligence or carelessness”. </p>
<p>At this point, Google may well argue that only some consumers were misled, that it was possible for consumers to be informed if they read more about Google’s privacy policies, that it was only one slip-up, and that its contravention of the law was unintentional. These might seem to reduce the seriousness or at least the moral culpability of the offence. </p>
<p>But we argue they should not unduly cap the penalty awarded. Google’s conduct may not appear as “<a href="https://www.judgments.fedcourt.gov.au/judgments/Judgments/fca/full/2021/2021fcafc0049">egregious and deliberately deceptive</a>” as the Volkswagen case. </p>
<p>But equally Google is a massively profitable company that makes its money precisely from obtaining, sorting and using its users’ personal data. We think therefore the court should look at the number of Android users potentially affected by the misleading conduct and Google’s responsibility for its own choice architecture, and work from there.</p>
<h2>Only some consumers?</h2>
<p>The Federal Court acknowledged not all consumers would be misled by Google’s representations. The court accepted many consumers would simply accept the privacy terms without reviewing them, an outcome consistent with the so-called <a href="https://theconversation.com/the-privacy-paradox-we-claim-we-care-about-our-data-so-why-dont-our-actions-match-143354">privacy paradox</a>. Others would review the terms and click through to more information about the options for limiting Google’s use of personal data to discover the scope of what was collected under the “Web & App Activity” default.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-privacy-paradox-we-claim-we-care-about-our-data-so-why-dont-our-actions-match-143354">The privacy paradox: we claim we care about our data, so why don't our actions match?</a>
</strong>
</em>
</p>
<hr>
<p>This might sound like the court was condoning consumers’ carelessness. In fact the court made use of insights from economists about the <a href="https://theconversation.com/inducing-choice-paralysis-how-retailers-bury-customers-in-an-avalanche-of-options-116078">behavioural biases</a> of consumers in making decisions. </p>
<p>Consumers have limited time to read legal terms and limited ability to understand the future risks arising from those terms. Thus, if consumers are concerned about privacy they might try to limit data collection by selecting various options, but are unlikely to be able to read and understand privacy legalese like a trained lawyer or with the background understanding of a data scientist. </p>
<p>If one option is labelled “Location History”, it is entirely rational for everyday consumers to assume turning it off limits location data collection by Google.</p>
<p>The number of consumers misled by Google’s representations will be difficult to assess. But even if a small proportion of Android users were misled, that will be a very large number of people. </p>
<p>There was evidence before the Federal Court that, after press reports of the tracking problem, the number of consumers switching off the “Web” option increased by 500%. Moreover, Google makes considerable profit from the large amounts of personal data it gathers and retains, and profit is important when it comes to deterrence.</p>
<h2>Google’s choice architecture</h2>
<p>It has also been revealed that some employees at Google were not aware of the problem until an <a href="https://apnews.com/article/828aefab64d4411bac257a07c1af0ecb#:%7E:text=SAN%20FRANCISCO%20(AP)%20%E2%80%94%20Google,explicitly%20tell%20it%20not%20to.&text=Computer%2Dscience%20researchers%20at%20Princeton,findings%20at%20the%20AP's%20request">exposé</a> in the press. An urgent meeting was held, referred to internally as the “Oh Shit” meeting.</p>
<p>The individual Google employees at the “Oh Shit” meeting may not have been aware of the details of the system. But that is not the point. </p>
<p>It is the <a href="https://www.uwa.edu.au/news/Article/2021/February/Crown-Collingwood-and-the-corporate-conscience">company’s</a> fault that is in question. And a company’s culpability is not just determined by what some executive or senior employee knew or didn’t know about its processes. Google’s corporate mindset is manifested or revealed in the systems it designs and puts in place.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/inducing-choice-paralysis-how-retailers-bury-customers-in-an-avalanche-of-options-116078">Inducing choice paralysis: how retailers bury customers in an avalanche of options</a>
</strong>
</em>
</p>
<hr>
<p>Google designed the information system that faced consumers trying to manage their privacy settings. This kind of system design is sometimes referred to as “<a href="https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf">choice architecture</a>”. </p>
<p>Here the choices offered to consumers steered them away from opting out of Google collecting, retaining and using personal location data. </p>
<p>The “Other Options” (for privacy) information failed to refer to the fact that location tracking was carried out via other processes beyond the one labelled “Location History”. Plus, the default option for “Web & App Activity” (which included location tracking) was set as “on”.</p>
<p>This privacy-eroding system arose via the design of the “choice architecture”. It therefore warrants a serious penalty.</p><img src="https://counter.theconversation.com/content/159206/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jeannie Marie Paterson receives funding from the Australian Research Council for research conducted with Professor Elise Bant on 'Developing a Coherent Law of Misleading Conduct' DP 180100932.</span></em></p><p class="fine-print"><em><span>Elise Bant receives funding from the Australian Research Council for research conducted with Professor Jeannie Marie Paterson on 'Developing a Rational Law of Misleading Conduct' DP180100932 and for her Future Fellowship research on 'Unravelling Corporate Fraud' FT190100475.</span></em></p>To deter Google and other big tech companies from misleading users about data collection, the Federal Court should impose heavy fines.Jeannie Marie Paterson, Professor of Law, The University of MelbourneElise Bant, Professor of Law, The University of Western AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1495352020-11-11T02:53:50Z2020-11-11T02:53:50Z83% of Australians want tougher privacy laws. Now’s your chance to tell the government what you want<figure><img src="https://images.theconversation.com/files/368451/original/file-20201110-24-98dvaw.jpg?ixlib=rb-1.1.0&rect=17%2C0%2C5973%2C3988&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Federal Attorney-General Christian Porter has <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">called for submissions</a> to the long-awaited review of the federal Privacy Act 1988.</p>
<p>This is the first wide-ranging review of privacy laws since the Australian Law Reform Commission produced a <a href="https://www.alrc.gov.au/publication/for-your-information-australian-privacy-law-and-practice-alrc-report-108/">landmark report</a> in 2008.</p>
<p>Australia has in the past often hesitated to adopt a strong privacy framework. The new review, however, provides an opportunity to improve data protection rules to an internationally competitive standard. </p>
<p>Here are some of the ideas proposed — and what’s at stake if we get this wrong.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-privacy-invasion-to-be-a-legal-wrong-31288">It's time for privacy invasion to be a legal wrong</a>
</strong>
</em>
</p>
<hr>
<h2>Australians care deeply about data privacy</h2>
<p>Personal information has never had a more central role in our society and economy, and the government has a strong mandate to update Australia’s framework for the protection of personal information. </p>
<p>In the Australian Privacy Commissioner’s 2020 survey, <a href="https://www.oaic.gov.au/assets/engage-with-us/research/acaps-2020/Australian-Community-Attitudes-to-Privacy-Survey-2020.pdf">83% of Australians said they’d like the government to do more</a> to protect the privacy of their data. </p>
<p>The intense debate about the COVIDSafe app <a href="https://auspublaw.org/2020/05/covidsafe-and-identity-governance-beyond-privacy/">earlier this year</a> also shows Australians care deeply about their private information, even in a time of crisis. </p>
<p>Privacy laws and enforcement can hardly keep up with the ever-increasing digitalisation of our lives. Data-driven innovation provides valuable services that many of us use and enjoy. However, the government’s <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review--issues-paper-october-2020.pdf">issues paper</a> notes: </p>
<blockquote>
<p>As Australians spend more of their time online, and new technologies emerge, such as artificial intelligence, more personal information about individuals is being captured and processed, raising questions as to whether Australian privacy law is fit for purpose.</p>
</blockquote>
<p>The pandemic has accelerated the existing trend towards digitalisation and created a range of <a href="https://iapp.org/resources/article/white-paper-privacy-risks-to-individuals-in-the-wake-of-covid-19/">new privacy issues</a> including working or studying at home, and the use of personal data in contact tracing.</p>
<p>Australians are <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">rightly concerned</a> they are losing control over their personal data. </p>
<p>So there’s no question the government’s review is sorely needed. </p>
<h2>Issues of concern for the new privacy review</h2>
<p>The government’s review follows the Australian Competition and Consumer Commission’s <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">Digital Platforms Inquiry</a>, which found that some data practices of digital platforms are unfair and undermine consumer trust. We rely heavily on digital platforms such as Google and Facebook for information, entertainment and engagement with the world around us. </p>
<p>Our interactions with these platforms leave countless digital traces that allow us to be <a href="https://theconversation.com/heres-how-tech-giants-profit-from-invading-our-privacy-and-how-we-can-start-taking-it-back-120078">profiled and tracked</a> for profit. The Australian Competition and Consumer Commission (ACCC) <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">found</a> that the digital platforms make it hard for consumers to resist these practices and to make free and informed decisions regarding the collection, use and disclosure of their personal data. </p>
<p>The government has <a href="https://www.communications.gov.au/departmental-news/government-response-accc-digital-platforms-inquiry">committed</a> to implement most of the ACCC’s <a href="https://theconversation.com/the-federal-governments-response-to-the-acccs-digital-platforms-inquiry-is-a-let-down-128775">recommendations for stronger privacy laws</a> to give us greater consumer control.</p>
<p>However, the reforms must go further. The review also provides an opportunity to address some long-standing weaknesses of Australia’s privacy regime.</p>
<p>The government’s <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review--issues-paper-october-2020.pdf">issues paper</a>, released to inform the review, identified several areas of particular concern. These include:</p>
<ul>
<li><p>the scope of application of the Privacy Act, in particular the definition of “personal information” and current private sector exemptions</p></li>
<li><p>whether the Privacy Act provides an effective framework for promoting good privacy practices</p></li>
<li><p>whether individuals should have a direct right to sue for a breach of privacy obligations under the Privacy Act</p></li>
<li><p>whether a statutory tort for serious invasions of privacy should be introduced into Australian law, allowing Australians to go to court if their privacy is invaded</p></li>
<li><p>whether the enforcement powers of the Privacy Commissioner should be strengthened.</p></li>
</ul>
<p>While most recent attention relates to improving consumer choice and control over their personal data, the review also brings back onto the agenda some never-implemented recommendations from the Australian Law Reform Commission’s 2008 review. </p>
<p>These include introducing a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3705881">statutory tort for serious invasions of privacy</a>, and extending the coverage of the Privacy Act.</p>
<h2>Exemptions for small business and political parties should be reviewed</h2>
<p>The Privacy Act currently contains several exemptions that limit its scope. The two most contentious exemptions have the effect that political parties and most business organisations need not comply with the general data protection standards under the Act.</p>
<p>The small business exemption is intended to reduce red tape for small operators. However, <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">largely unknown</a> to the Australian public, it means the vast majority of Australian businesses are not legally obliged to comply with standards for fair and safe handling of personal information.</p>
<p>Procedures for compulsory venue check-ins under COVID health regulations are just one recent illustration of why this is a problem. Some people have raised <a href="https://www.abc.net.au/news/2020-10-31/covid-19-check-in-data-using-qr-codes-raises-privacy-concerns/12823432">concerns</a> that customers’ contact-tracing data, in particular collected via QR codes, may be exploited by marketing companies for targeted advertising.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman uses a QR code at a restaurant" src="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=245&fit=crop&dpr=1 600w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=245&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=245&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=308&fit=crop&dpr=1 754w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=308&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=308&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Under current privacy laws, cafe and restaurant operators are exempt from complying with certain privacy obligations.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Under current privacy laws, cafe and restaurant operators are generally exempt from complying with privacy obligations to undertake due diligence checks on third-party providers used to collect customers’ data.</p>
<p>The political exemption is another area in need of reform. As the <a href="https://www.theguardian.com/technology/2020/sep/14/facebook-suffers-blow-in-australia-legal-fight-over-cambridge-analytica">Facebook/Cambridge Analytica scandal</a> showed, political campaigning is becoming <a href="https://tacticaltech.org/#/projects/data-politics">increasingly tech-driven</a>.</p>
<p>However, Australian political parties are exempt from complying with the Privacy Act and anti-spam legislation. This means voters cannot effectively protect themselves against data harvesting for political purposes and micro-targeting in election campaigns <a href="https://theconversation.com/how-political-parties-legally-harvest-your-data-and-use-it-to-bombard-you-with-election-spam-148803">through unsolicited text messages</a>. </p>
<p>There is a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717561">good case for arguing</a> political parties and candidates should be subject to the same rules as other organisations. It’s what most Australians would like and, in fact, <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">wrongly believe is already in place</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-political-parties-legally-harvest-your-data-and-use-it-to-bombard-you-with-election-spam-148803">How political parties legally harvest your data and use it to bombard you with election spam</a>
</strong>
</em>
</p>
<hr>
<h2>Trust drives innovation</h2>
<p>Trust in digital technologies is undermined when data practices come across as opaque, <a href="https://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1098&context=yjolt">creepy</a> or unsafe. </p>
<p>There is increasing recognition that data protection <a href="https://www.researchgate.net/publication/314636773_Privacy_and_Innovation_From_Disruption_to_Opportunities">drives innovation</a> and adoption of modern applications, rather than impedes it. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman looks at her phone in the twilight." src="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Trust in digital technologies is undermined when data practices come across as opaque, creepy, or unsafe.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The <a href="https://www.covidsafe.gov.au/">COVIDSafe app</a> is a good example.
When that app was debated, the government accepted that <a href="https://www.oaic.gov.au/privacy/guidance-and-advice/privacy-obligations-regarding-covidsafe-and-covid-app-data/">robust privacy protections</a> were necessary to achieve a strong uptake by the community. </p>
<p>We would all benefit if the government saw that this same principle applies to other areas of society where our precious data is collected.</p>
<hr>
<p><em>Information on how to make a submission to the federal government review of the Privacy Act 1988 can be found <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">here</a>.</em></p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/people-want-data-privacy-but-dont-always-know-what-theyre-getting-143782">People want data privacy but don't always know what they're getting</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/149535/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Normann Witzleb consults to the Office of the Australian Information Commissioner. He has received research funding from the International Association of Privacy Professionals.</span></em></p>Australia has hesitated in the past to adopt a strong privacy framework. A new government review provides an opportunity to improve data protection rules to an internationally competitive standard.Normann Witzleb, Associate Professor in Law, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1481492020-11-09T15:39:59Z2020-11-09T15:39:59ZA circular economy could end waste – at the cost of our privacy<figure><img src="https://images.theconversation.com/files/368299/original/file-20201109-19-9lyyrp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/this-photo-roundabout-called-circus-taken-1254731197">Elise Robert/Shutterstock</a></span></figcaption></figure><p>In a <a href="https://theconversation.com/what-a-sustainable-circular-economy-would-look-like-133808">circular economy</a>, we wouldn’t have any waste or pollution. Instead, everything we make and all our byproducts would be reused or repurposed, helping to end the over-exploitation of our finite resources and damage to our environment and climate.</p>
<p>To make this process profitable, the manufacturers of complex items such as vehicles are likely to use advanced internet-based data systems that can track components and products throughout their life cycles, from source to final use. That means that the companies best suited to helping introduce a circular economic model are the big tech firms that already use similar online data technology, such as Microsoft, Amazon and Google.</p>
<p>And that will inevitably create concerns about the implications for data privacy and security. Further sacrificing our privacy might be the price we have to pay to achieve a waste-free economic model. Or to put it another way, data protection may ultimately become a barrier to a circular economy.</p>
<p>Most research on the circular economy doesn’t examine it from a data-driven perspective. To maximise the value and life of a vehicle, manufacturers need to track its location, ownership and state of disrepair at all times. This is now possible with the use of miniaturised and digital technologies that are behind what is known as the <a href="https://theconversation.com/a-fourth-industrial-revolution-is-powering-the-rise-of-smart-manufacturing-57753">fourth industrial revolution</a>.</p>
<p>Tiny sensors can monitor the status and performance of a vehicle and its components, and GPS chips can track its location, from as soon as it leaves the production line until it is disposed of. This data can be gathered by connecting the object to what is known as the “Internet of Things”, and stored throughout the object’s lifecycle in the cloud. By analysing this data en masse, artificial intelligence can then predict when maintenance or replacement is required, as well as prescribing how to recycle the object.</p>
<p>These tracking capabilities already exist at the supply chain level to make the manufacturing process as cost-effective as possible, and are increasingly used to enable self-driving capabilities. In the future, this technology will also be used to assess the vehicle’s state of disrepair and schedule preventative maintenance and, eventually, to arrange its disposal and recycling.</p>
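<p>The lifecycle-tracking idea described above can be sketched in miniature: a component reports telemetry over its life, and a rule flags it for maintenance or recycling. The field names and thresholds below are illustrative only; real systems would use trained predictive models rather than fixed thresholds.</p>

```python
# Minimal sketch of component lifecycle tracking: sensors report wear
# and usage, GPS reports location, and a simple rule decides the next
# step in the component's life cycle. All values are hypothetical.

from dataclasses import dataclass

@dataclass
class ComponentTelemetry:
    component_id: str
    hours_in_service: float
    wear_index: float            # 0.0 (new) .. 1.0 (end of life)
    location: tuple              # (lat, lon) from a GPS chip

def next_action(t: ComponentTelemetry) -> str:
    """Decide whether the component is fine, due for maintenance, or spent."""
    if t.wear_index >= 0.9:
        return "recycle"
    if t.wear_index >= 0.6 or t.hours_in_service > 5000:
        return "schedule maintenance"
    return "ok"

assert next_action(ComponentTelemetry("brake-01", 6200, 0.4, (52.4, -1.5))) == "schedule maintenance"
assert next_action(ComponentTelemetry("brake-02", 100, 0.95, (52.4, -1.5))) == "recycle"
```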
<p>The leading providers of these technologies are the large tech firms, which are more efficient at accruing, managing and analysing large amounts of data than traditional industrial companies. This has led to partnerships between the two sectors. </p>
<figure class="align-center ">
<img alt="Cars driving on busy road with symbols representing their connection to the internet" src="https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/368301/original/file-20201109-21-680vnm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The circular economy could mean more products have their location tracked throughout their lifetimes.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/sensing-system-wireless-communication-network-vehicle-1018721329">Metamorworks/Shutterstock</a></span>
</figcaption>
</figure>
<p>For example, Volkswagen and Amazon Web Services have recently launched an <a href="https://www.volkswagenag.com/en/news/2019/03/volkswagen-and-amazon-web-services-to-develop-industrial-cloud.html">“industrial cloud”</a> initiative to link up all Volkswagen’s factories and those of its global supply chain and enable them to easily share data. This allows the firm to track all the components that make up a vehicle throughout the entire production process.</p>
<p>Microsoft and BMW have launched the <a href="https://news.microsoft.com/2019/04/02/microsoft-and-the-bmw-group-launch-the-open-manufacturing-platform/">Open Manufacturing Platform</a> that similarly enables BMW to integrate data from across its supply chain, monitor components as they are manufactured and assembled into vehicles, thereby enhancing the efficiency of the process. Microsoft has also developed <a href="https://news.microsoft.com/2019/03/20/renault-nissan-mitsubishi-launches-alliance-intelligent-cloud-on-microsoft-azure/">partnerships</a> with Volkswagen and the Renault-Nissan-Mitsubishi Alliance.</p>
<p>This creates the opportunity for the global automotive sector to produce a genuine circular economy <a href="https://www.nielsen.com/us/en/insights/article/2019/technology-ecosystems-explained/">ecosystem</a>, with clear benefits for the environment in terms of reduced resource and energy use and waste production. This will also help end the over-exploitation of valuable resources. </p>
<p>But using self-driving and component-tracking technology for the secondary purpose of making a circular economy model profitable, which involves monitoring how vehicles are used after they’ve been purchased, raises questions about data ownership, security and privacy, as well as monopoly power.</p>
<h2>Problems with data tracking</h2>
<p>Tech companies would have the capability to track vehicle use and consumer behaviour on a regular basis to even more depth and sophistication than they already do. Internet-enabled cars are also highly prone to attack from hackers, which means this data could also be vulnerable.</p>
<p>Not all consumers are fully aware of how such technology is capable of invading their everyday lives. Tracking real-time mobility may be one step too far in some countries, and for a section of the population that isn’t willing to trade their privacy for environmental benefits.</p>
<p><a href="https://www.vox.com/recode/2020/10/6/21505027/congress-big-tech-antitrust-report-facebook-google-amazon-apple-mark-zuckerberg-jeff-bezos-tim-cook">Regulators and</a> <a href="https://theconversation.com/why-technology-puts-human-rights-at-risk-92087">human rights groups</a> are also concerned about the level of power that lies in the hands of the leading tech companies due to their monopoly ownership of consumer data and their ability to monetise it. A circular economy model would only increase this level of power since the technology companies’ cloud platforms would be used to capture, store, manage and analyse the data.</p>
<p>There are several ways that manufacturers and tech firms could alleviate some of these concerns. For example, they could protect the data against use for advertising, collect only metadata (data about the data) rather than any personal information, and set out clear policies to manage and control the data. This might include opt-out clauses for consumers who don’t want to have their products monitored for environmental purposes. Without these things, manufacturers might find it much harder to put their circular economy plans into action.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Manufacturers will increasingly track the products they sell to make recycling them more profitable.Nigel Walton, Assistant Professor, School of Strategy and Leadership, Coventry UniversityAnitha Chinnaswamy, Assistant Professor of Computing, Coventry UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1437822020-10-21T12:19:30Z2020-10-21T12:19:30ZPeople want data privacy but don’t always know what they’re getting<figure><img src="https://images.theconversation.com/files/364311/original/file-20201019-15-1x4nk31.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C7301%2C4104&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Differential privacy lets organizations collect people's data while protecting their privacy, but it's not foolproof.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/social-networking-royalty-free-image/941218518?adppopup=true">imaginima/E+ via Getty Images</a></span></figcaption></figure><p>The Trump administration’s move to <a href="https://theconversation.com/the-us-has-lots-to-lose-and-little-to-gain-by-banning-tiktok-and-wechat-144478">ban the popular video app TikTok</a> has stoked fears about the Chinese government collecting personal information of people who use the app. These fears underscore growing <a href="https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/">concerns Americans have about digital privacy</a> generally.</p>
<p>Debates around privacy might seem simple: Something is private or it’s not. However, the technology that provides digital privacy is anything but simple. </p>
<p>Our data privacy research shows that people’s hesitancy to share their data stems in part from not knowing who would have access to it and how organizations that collect data keep it private. We’ve also found that when people are aware of data privacy technologies, they might not get what they expect.</p>
<h2>Differential privacy explained</h2>
<p>While there are many ways to provide privacy for people who share their data, <a href="https://theconversation.com/explainer-what-is-differential-privacy-and-how-can-it-protect-your-data-90686">differential privacy</a> has recently emerged as a leading technique and is <a href="https://www.brookings.edu/techstream/using-differential-privacy-to-harness-big-data-and-preserve-privacy/">being rapidly adopted</a>.</p>
<p>Imagine your local tourism committee wanted to find out the most popular places in your area. A simple solution would be to collect lists of all the locations you have visited from your mobile device, combine it with similar lists for everyone else in your area, and count how often each location was visited. While efficient, collecting people’s sensitive data in this way can have dire consequences. Even if the data is stripped of names, it may <a href="https://www.nytimes.com/2019/07/23/health/data-privacy-protection.html">still be possible for a data analyst or a hacker to identify and stalk individuals</a>.</p>
<p>Differential privacy can be used to protect everyone’s personal data while gleaning useful information from it. Differential privacy disguises individuals’ information by randomly changing the lists of places they have visited, possibly by removing some locations and adding others. These introduced errors make it virtually impossible to compare people’s information and use the process of elimination to determine someone’s identity. Importantly, these random changes are small enough to ensure that the summary statistics – in this case, the most popular places – are accurate. </p>
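To make this concrete, here is a minimal Python sketch of randomized response, the classic technique this style of local differential privacy builds on. It is an illustration under simplified assumptions, not any company’s actual implementation; the names `randomize` and `estimate_visits` are invented for this example:

```python
import random

def randomize(visited: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth, otherwise flip it.

    Any individual report is deniable: a reported "True" may be a random flip.
    """
    return visited if random.random() < p_truth else not visited

def estimate_visits(reports: list, p_truth: float = 0.75) -> float:
    """Undo the bias the random flips introduce to recover an accurate total.

    E[observed] = true * p_truth + (n - true) * (1 - p_truth); solve for true.
    """
    observed = sum(reports)
    n = len(reports)
    return (observed - n * (1 - p_truth)) / (2 * p_truth - 1)
```

Even though no single report can be trusted, the bias-corrected aggregate stays close to the true count once many reports are combined, which is exactly the property the tourism-committee example relies on.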
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/pT19VwBAqKA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The U.S. Census Bureau is using differential privacy to protect your data in the 2020 census.</span></figcaption>
</figure>
<p>In practice, differential privacy isn’t perfect. The randomization process must be calibrated carefully. Too much randomness will make the summary statistics inaccurate. Too little will leave people vulnerable to being identified. Also, if the randomization takes place after everyone’s unaltered data has been collected, as is common in some versions of differential privacy, <a href="https://www.nist.gov/blogs/cybersecurity-insights/threat-models-differential-privacy">hackers may still be able to get at the original data</a>.</p>
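The calibration tradeoff can be sketched with the Laplace mechanism, a standard way of adding noise in differential privacy. This is a simplified illustration with invented function names; the parameter epsilon controls the tradeoff, with smaller values giving stronger privacy but noisier statistics:

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count plus noise of scale 1/epsilon (a count has sensitivity 1)."""
    return true_count + laplace_sample(1.0 / epsilon)
```

A small epsilon drowns the true count in noise (strong privacy, poor accuracy); a large epsilon barely perturbs it (accurate, but weakly protected). Choosing epsilon is precisely the careful calibration described above.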
<p>When differential privacy was <a href="https://doi.org/10.1007/11681878_14">developed in 2006</a>, it was mostly regarded as a theoretically interesting tool. In 2014, Google became the first company to start publicly using differential privacy for <a href="https://doi.org/10.1145/2660267.2660348">data collection</a>.</p>
<p>Since then, new systems using differential privacy have been deployed by Microsoft, Google and the U.S. Census Bureau. Apple uses it to <a href="https://machinelearning.apple.com/research/learning-with-privacy-at-scale">power machine learning algorithms</a> without needing to see your data, and Uber turned to it to make sure their internal data analysts <a href="https://medium.com/uber-security-privacy/differential-privacy-open-source-7892c82c42b6">can’t abuse their power</a>. Differential privacy is often <a href="https://www.adexchanger.com/privacy/why-every-ad-tech-company-must-understand-differential-privacy/">hailed as the solution to the online advertising industry’s privacy issues</a> by allowing advertisers to learn how people respond to their ads without tracking individuals. </p>
<h2>Reasonable expectations?</h2>
<p>But it’s not clear that people who are weighing whether to share their data have clear expectations about, or understand, differential privacy.</p>
<p>In July, we, as researchers at <a href="https://scholar.google.com/citations?user=gbq5qjYAAAAJ&hl=en">Boston University</a>, the <a href="https://scholar.google.com/citations?user=2mYxmokAAAAJ&hl=en">Georgia Institute of Technology</a> and <a href="https://scholar.google.com/citations?user=MKZBcasAAAAJ&hl=en">Microsoft Research and the Max Planck Institute</a>, surveyed 675 Americans to evaluate whether people are willing to trust differentially private systems with their data. </p>
<p>We created descriptions of differential privacy based on those used by companies, media outlets and academics. These definitions ranged from nuanced descriptions that focused on what differential privacy could allow a company to do or the risks it protects against, to descriptions that focused on trust in the many companies now using it, to descriptions that simply stated that differential privacy is “<a href="https://www.census.gov/about/policies/privacy/statistical_safeguards/disclosure-avoidance-2020-census.html">the new gold standard in data privacy protection</a>,” as the Census Bureau has described it.</p>
<p>Americans we surveyed were about twice as likely to report that they would be willing to share their data if they were told, using one of these definitions, that their data would be protected with differential privacy. The specific way that differential privacy was described, however, did not affect people’s inclination to share. The mere guarantee of privacy seems to be sufficient to alter people’s expectations about who can access their data and whether it would be secure in the event of a hack. In turn, those expectations drive people’s willingness to share information. </p>
<p>Troublingly, people’s expectations of how protected their data will be with differential privacy are not always correct. For example, many differential privacy systems do nothing to protect user data from lawful searches by law enforcement, yet 20% of respondents expected this protection. </p>
<p>The confusion is likely due to the way that companies, media outlets and even academics describe differential privacy. Most explanations focus on what differential privacy does or what it can be used for, but do little to highlight what differential privacy can and can’t protect against. This leaves people to draw their own conclusions about what protections differential privacy provides.</p>
<h2>Building trust</h2>
<p>To help people make informed choices about their data, they need information that accurately sets their expectations about privacy. It’s not enough to tell people that a system meets a “gold standard” of some types of privacy without telling them what that means. Users shouldn’t need a degree in mathematics to make an informed choice.</p>
<p>[<em>Deep knowledge, daily.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=deepknowledge">Sign up for The Conversation’s newsletter</a>.]</p>
<p>Identifying the best ways to clearly explain the protections provided by differential privacy will require further research to identify which expectations are most important to people who are considering sharing their data. One possibility is using techniques like <a href="https://cups.cs.cmu.edu/privacyLabel/">privacy nutrition labels</a>. </p>
<p>Helping people align their expectations with reality will also require companies using differential privacy as part of their data collecting activities to fully and accurately explain what is and isn’t being kept private and from whom.</p>
<p class="fine-print"><em><span>Gabriel Kaptchuk receives funding from the National Science Foundation and has been a consultant for Microsoft Research and Bolt Labs. </span></em></p><p class="fine-print"><em><span>Dr. Elissa M. Redmiles receives funding from Microsoft, Facebook, and the Max Planck Institute for Software Systems. She is affiliated with Microsoft, Facebook, and Human Computing Associates.</span></em></p><p class="fine-print"><em><span>Rachel Cummings has worked or consulted for Apple, Microsoft Research, and the U.S. Census Bureau. She has received funding from Mozilla, Facebook, and Google.</span></em></p>Differential privacy lets people to share data anonymously, but people need to know more about it to make informed decisions.Gabriel Kaptchuk, Researcher Assistant Professor in Computer Science, Boston UniversityElissa M. Redmiles, Faculty member & Research Group Leader, Max Planck Institute for Software SystemsRachel Cummings, Assistant Professor of Industrial and Systems Engineering, Georgia Institute of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1473512020-10-06T05:28:06Z2020-10-06T05:28:06ZNetflix’s The Social Dilemma highlights the problem with social media, but what’s the solution?<figure><img src="https://images.theconversation.com/files/361801/original/file-20201006-18-1m19l67.png?ixlib=rb-1.1.0&rect=64%2C7%2C2453%2C1105&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Netflix/Screenshot</span></span></figcaption></figure><p>Facebook has <a href="https://www.businessinsider.com.au/facebook-says-netflix-documentary-the-social-dilemma-sensationalist-2020-10?r=US&IR=T">responded</a> to Netflix documentary The Social Dilemma, saying it “buries the substance in sensationalism”.</p>
<p>The show is currently in Netflix Australia’s top ten list and has been popular around the globe. Some <a href="https://www.independent.co.uk/arts-entertainment/films/features/social-dilemma-netflix-film-media-facebook-twitter-algorithm-addiction-conspiracy-b454736.html">media pundits</a> suggest it’s “the most important documentary of our times”. </p>
<p>The Social Dilemma focuses on how big social media companies manipulate users by using algorithms that encourage addiction to their platforms. It also shows, fairly accurately, how platforms harvest personal data to target users with ads – and have so far gone largely unregulated. </p>
<p>But what are we meant to do about it? While the Netflix feature educates viewers about the problems social networks present to both our privacy and agency, it falls short of providing a tangible solution.</p>
<h2>A misleading response</h2>
<p>In a statement responding to the documentary, Facebook <a href="https://about.fb.com/wp-content/uploads/2020/10/What-The-Social-Dilemma-Gets-Wrong.pdf">denied</a> most of the claims made by former Facebook and other big tech company employees interviewed in The Social Dilemma. </p>
<p>It took issue with the allegation that users’ data is harvested to sell ads and that this data (or the behavioural predictions drawn from it) represents the “product” sold to advertisers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/if-its-free-online-you-are-the-product-95182">If it’s free online, you are the product</a>
</strong>
</em>
</p>
<hr>
<p>“Facebook is an ads-supported platform, which means that selling ads allows us to offer everyone else the ability to connect for free,” Facebook says.</p>
<p>However, this is a bit like saying chicken feed is free for battery hens. Harvesting users’ data and selling it to advertisers, even if the data is not “<a href="https://about.fb.com/wp-content/uploads/2020/10/What-The-Social-Dilemma-Gets-Wrong.pdf">personally identifiable</a>”, is undeniably Facebook’s business model.</p>
<h2>The Social Dilemma doesn’t go far enough</h2>
<p>That said, The Social Dilemma sometimes resorts to simplistic metaphors to illustrate the harms of social media. </p>
<p>For example, a fictional character is given an “executive team” of people operating behind the scenes to maximise their interaction with a social media platform. This is supposed to be a metaphor for algorithms, but is a little creepy in its implications.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A character from The Social Dilemma looks at his phone." src="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=303&fit=crop&dpr=1 600w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=303&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=303&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=381&fit=crop&dpr=1 754w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=381&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=381&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Social Dilemma uses dramatisations (which aren’t necessarily accurate) to explore how social media algorithms are designed to be addictive.</span>
<span class="attribution"><span class="source">IMDB</span></span>
</figcaption>
</figure>
<p><a href="https://www.cnbc.com/2020/09/18/netflixs-the-social-dilemma-results-in-people-deleting-facebook-instagram.html">News reports</a> allege large numbers of people have <a href="https://www.theage.com.au/national/victoria/it-makes-you-want-to-throw-your-phone-in-the-bin-the-film-turning-teens-off-social-media-20200926-p55zhi.html">disconnected</a> or are taking “breaks” from social media after watching The Social Dilemma. </p>
<p>But although one of the interviewees, <a href="https://www.smithsonianmag.com/innovation/what-turned-jaron-lanier-against-the-web-165260940/">Jaron Lanier</a>, has written a book called “Ten Arguments for Deleting Your Social Media Accounts Right Now”, the documentary does not explicitly call for this. No immediately useful answers are given.</p>
<p>Filmmaker Jeff Orlowski seems to frame <a href="https://theconversation.com/ethical-design-is-the-answer-to-some-of-social-medias-problems-89531">“ethical” platform design</a> as the antidote. While this is an important consideration, it’s not a complete answer. And this framing is one of several issues in The Social Dilemma’s approach.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=425&fit=crop&dpr=1 600w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=425&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=425&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=534&fit=crop&dpr=1 754w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=534&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=534&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ethical design considers the moral consequences of the design choices in a platform. It is design made with the intent to ‘do good’.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The program also relies uncritically on interviews with former tech executives, who apparently never realised the consequences of manipulating users for monetary gain. It propagates the Silicon Valley fantasy that they were just innocent geniuses wanting to improve the world (despite ample <a href="https://medium.com/@rossformaine/i-was-googles-head-of-international-relations-here-s-why-i-left-49313d23065">evidence</a> to the <a href="https://www.wired.com/story/cambridge-analytica-50m-facebook-users-data/">contrary</a>). </p>
<p>As tech policy expert Maria Farrell suggests, these retired “<a href="https://conversationalist.org/2020/03/05/the-prodigal-techbro/">prodigal tech bros</a>”, who are now safely insulated from consequences, are presented as the moral authority. Meanwhile, the digital rights and privacy activists who have worked for decades to hold them to account are largely omitted from view. </p>
<h2>Behavioural change</h2>
<p>Given the documentary doesn’t really tell us how to fight the tide, what can you, as the viewer, do? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-month-at-sea-with-no-technology-taught-me-how-to-steal-my-life-back-from-my-phone-127501">A month at sea with no technology taught me how to steal my life back from my phone</a>
</strong>
</em>
</p>
<hr>
<p>Firstly, you can take The Social Dilemma as a cue to become more aware of how much of your data is given up on a daily basis – and you can change your behaviours accordingly. One way is to change your social media privacy settings to restrict (as much as possible) the data networks can gather from you. </p>
<p>This will require going into the “settings” on every social platform you have, to restrict both the audience you share content with and the number of third parties the platform shares your behavioural data with. </p>
<p>In Facebook, you can actually <a href="https://theconversation.com/how-to-stop-haemorrhaging-data-on-facebook-94511">switch off “platform apps” entirely</a>. This restricts access by partner or third-party applications. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-stop-haemorrhaging-data-on-facebook-94511">How to stop haemorrhaging data on Facebook</a>
</strong>
</em>
</p>
<hr>
<p>Unfortunately, even if you do restrict your privacy settings on platforms (particularly Facebook), they can still collect and use your “platform” data. This includes content you read, “like”, click and hover over.</p>
<p>So, you may want to opt for limiting the time you spend on these platforms. This is not always practical, given how <a href="https://www.vox.com/culture/2018/3/22/17146776/delete-facebook-how-to-quit-difficult">important they are in our lives</a>. But if you want to do so, there are dedicated tools for this in some mobile operating systems. </p>
<p>Apple’s iOS, for example, has implemented “screen time” tools aimed at minimising time spent on apps such as Facebook. Some have argued, though, this can <a href="https://www.theatlantic.com/technology/archive/2019/09/why-apple-screen-time-mostly-makes-things-worse/597397/">make things worse</a> by making the user feel bad, while still easily side-stepping the limitation.</p>
<p>As a user, the best you can do is tighten your privacy settings, limit the time you spend on platforms and carefully consider whether you need each one. </p>
<h2>Legislative reform</h2>
<p>In the long run, stemming the flow of personal data to digital platforms will also need legislative change. While legislation can’t fix everything, it can encourage systemic change. </p>
<p>In Australia, we need stronger data privacy protections, preferably in the form of blanket legislative protection such as the General Data Protection Regulation <a href="https://eur-lex.europa.eu/content/news/general-data-protection-regulation-GDPR-applies-from-25-May-2018.html">implemented in Europe</a> in 2018. </p>
<p>The GDPR was designed to bring social media platforms to heel and is geared towards providing individuals more control over their personal data. Australians don’t yet have similar comprehensive protections, but regulators have been making inroads. </p>
<p>Last year, the Australian Competition and Consumer Commission finalised its <a href="https://www.accc.gov.au/speech/the-acccs-digital-platforms-inquiry-and-the-need-for-competition-consumer-protection-and-regulatory-responses">Digital Platforms Inquiry</a> investigating a range of issues relating to tech platforms, including data collection and privacy.</p>
<p>It made a number of recommendations that will hopefully result in legislative change. These focus on improving and bolstering the definitions of “consent” for consumers, including explicit understanding of when and how their data is being tracked online. </p>
<p>If what we’re facing is indeed a “social dilemma”, it’s going to take more than the remorseful words of a few Silicon Valley tech-bros to solve it.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The documentary educates viewers about the problems social networks present to both our privacy and agency online. But it doesn’t really tell us how to fight the tide.Belinda Barnet, Senior Lecturer in Media and Communications, Swinburne University of TechnologyDiana Bossio, Lecturer, Media and Communications, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.