tag:theconversation.com,2011:/id/topics/privacy-law-3067/articlesPrivacy law – The Conversation2024-03-04T13:41:26Ztag:theconversation.com,2011:article/2248812024-03-04T13:41:26Z2024-03-04T13:41:26ZDoes the royal family have a right to privacy? What the law says<figure><img src="https://images.theconversation.com/files/579223/original/file-20240301-20-qpf6ze.jpg?ixlib=rb-1.1.0&rect=97%2C53%2C3149%2C2107&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/auckland-new-zealand-april-11-duchess-186607307">Shaun Jeffers/Shutterstock</a></span></figcaption></figure><p>From court cases to conspiracy theories, the royal family’s right to privacy is, somewhat ironically, nearly always in the spotlight. The latest focus is Kate Middleton, Princess of Wales, whose whereabouts have been the subject of <a href="https://www.townandcountrymag.com/society/tradition/a60008117/kate-middleton-health-speculation-conspiracy-theories-online/">online speculation</a> after it was announced she was undergoing abdominal surgery and would be away from public duties until after Easter.</p>
<p>This comes just weeks after King Charles <a href="https://www.bbc.co.uk/news/uk-68208157">revealed that he is undergoing treatment for cancer</a>, and a legal settlement between Prince Harry and Mirror Group Newspapers over <a href="https://www.bbc.co.uk/news/uk-68249009">illegal phone hacking</a>.</p>
<p>Interest in the personal lives of the royals and other celebrities <a href="https://www.tandfonline.com/doi/full/10.1080/1461670X.2016.1150193">is a constant</a>, and has driven newspaper sales and online clicks for decades. You only need to consider the media frenzy that followed Princess Diana to <a href="https://www.tandfonline.com/doi/full/10.1080/17512786.2013.833678">see this</a>, and its potentially devastating consequences. </p>
<p>From a legal perspective, the British courts have ruled that everyone – the royal family included – is entitled to a right to privacy. The Human Rights Act incorporates into British law the rights set out by the European Convention on Human Rights. This includes article 8, which focuses on the right to privacy.</p>
<p>In the years after the Human Rights Act came into force, courts ruled on a string of cases from celebrities claiming that the press invaded their privacy. Courts had to balance article 8 of the convention against article 10, the right to freedom of expression. </p>
<p>Rulings repeatedly stated that, despite being in and sometimes seeking the limelight, celebrities should still be afforded a right to privacy. Some disagree with this position, such as prominent journalist <a href="https://www.independent.co.uk/news/uk/home-news/prince-harry-hacking-piers-morgan-b2336442.html">Piers Morgan, who has criticised</a> the Duke and Duchess of Sussex for asking for privacy when they have also released a Netflix documentary, given a broadcast interview with Oprah Winfrey and published a memoir.</p>
<p>But the courts have made the position clear, as in the case concerning Catherine Zeta-Jones and Michael Douglas after Hello! Magazine published unauthorised photographs from their wedding. The <a href="https://eprints.whiterose.ac.uk/190559/3/Final%20Edited%20Version%20-%20Celebrity%20Privacy%20and%20Celebrity%20Journalism-%20Has%20anything%20changed%20since%20the%20Leveson%20Inquiry_.pdf">court stated</a> that: “To hold that those who have sought any publicity lose all protection would be to repeal article 8’s application to very many of those who are likely to need it.” </p>
<p>There is no universal definition of privacy, but scholars have identified key concepts encompassing what privacy can entail. In my own research, I have argued that the <a href="https://eprints.whiterose.ac.uk/190559/3/Final%20Edited%20Version%20-%20Celebrity%20Privacy%20and%20Celebrity%20Journalism-%20Has%20anything%20changed%20since%20the%20Leveson%20Inquiry_.pdf">notion of choice</a> is one of these. Privacy allows us to control the spread of information about ourselves and disclose information to whom we want. </p>
<h2>Privacy and the public interest</h2>
<p>There are exceptions to these protections if the person involved had no reasonable expectation of privacy, or if it was in the public interest for this information to be revealed. There is no solid, legal definition of the “public interest”, so this is decided on a case-by-case basis.</p>
<p><a href="https://www.tandfonline.com/doi/full/10.1080/17577632.2021.1889866">In the past</a>, the public interest defence has been applied because a public figure or official has acted hypocritically and the courts have stated there is a right for a publisher to set the record straight.</p>
<p>When it comes to medical records and information concerning health, case law and journalistic <a href="https://www.ipso.co.uk/editors-code-of-practice/">editorial codes of conduct</a> are clear that this information is afforded the utmost protection.</p>
<p>Model Naomi Campbell was pictured leaving a Narcotics Anonymous meeting and these images were published by the Daily Mirror. The court found that there had been a public interest in revealing the fact she was attending these meetings, as she had previously denied substance abuse. </p>
<p>The House of Lords accepted that there was a public interest in the press “setting the record straight”. Nonetheless, the publication of additional, confidential details, and the photographs of her leaving the meeting were a <a href="https://www.theguardian.com/media/2004/may/06/mirror.pressandpublishing1">step too far</a>. The House of Lords highlighted the importance of being able to keep medical records and information private.</p>
<h2>Royal health</h2>
<p>When it comes to the royals, the history of <a href="https://www.townandcountrymag.com/society/tradition/a23798094/lindo-wing-st-marys-hospital-facts-photos/">publicity</a> around royal births, with royal parents often posing with their newborn outside the hospital, has set a precedent for what the public can expect to learn about the royals’ medical information. When royals choose to go against this tradition, it can frustrate both royal-watchers and publishers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/who-owns-the-royal-body-public-interest-in-royal-health-reveals-anxieties-about-our-rulers-221534">Who owns the royal body? Public interest in royal health reveals anxieties about our rulers</a>
</strong>
</em>
</p>
<hr>
<p>King Charles made the choice to speak openly about his enlarged prostate to “assist public understanding”. As Prostate Cancer UK noted, this has worked: the charity saw a <a href="https://www.independent.co.uk/news/uk/home-news/king-charles-cancer-statement-treatment-b2494190.html">500% increase in people visiting their website</a>. However, he has chosen not to divulge information about his cancer diagnosis beyond the fact that he is receiving treatment. This is his right.</p>
<p>While revealing further information might stop speculation and rumours about his health, it is not the king’s duty to divulge private, medical information. However, if his health begins to impact his ability to act as monarch, the situation could change. </p>
<p>It might be that the press finds more information about his health without his knowledge, but unless they have a genuine public interest in publishing this information, privacy should prevail. </p>
<p>You would no doubt want your private medical information kept secret, not shared around your workplace and speculated on unless it was absolutely necessary. It is thanks to these laws and court precedent that you don’t have to worry about this. The royal family, regardless of their position, should expect the same standard.</p><img src="https://counter.theconversation.com/content/224881/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Gemma Horton does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Even those who seek out the spotlight have a legal right to privacy.Gemma Horton, Impact Fellow for Centre for Freedom of the Media, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1989792023-03-02T19:38:20Z2023-03-02T19:38:20ZProtecting privacy online begins with tackling ‘digital resignation’<figure><img src="https://images.theconversation.com/files/512989/original/file-20230301-26-syl2am.jpg?ixlib=rb-1.1.0&rect=25%2C8%2C5725%2C3819&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Going online often involves surrendering some privacy, and many people are becoming resigned to the fact that their data will be collected and used without their explicit consent.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>From <a href="https://www.cnbc.com/2022/11/26/the-biggest-risks-of-using-fitness-trackers-to-monitor-health.html">smart watches</a> and meditation apps to digital assistants and social media platforms, we interact with technology daily. And some of these technologies have <a href="https://childdatacitizen.com/coerced-digital-participation/">become an essential part of our social and professional lives</a>. </p>
<p>In exchange for access to their digital products and services, many tech companies collect and use our personal information. They use that information to predict and influence our future behaviour. This kind of <a href="https://news.harvard.edu/gazette/story/2019/03/harvard-professor-says-surveillance-capitalism-is-undermining-democracy/">surveillance capitalism</a> can take the form of <a href="https://theconversation.com/the-dark-side-of-alexa-siri-and-other-personal-digital-assistants-126277">recommendation algorithms</a>, targeted advertising and <a href="https://www.mckinsey.com/capabilities/growth-marketing-and-sales/our-insights/the-future-of-personalization-and-how-to-get-ready-for-it">customized experiences</a>. </p>
<p>Tech companies claim these personalized experiences benefit users. However, <a href="https://repository.upenn.edu/cgi/viewcontent.cgi?article=1554&context=asc_papers">the vast majority of consumers are unhappy with these practices</a>, especially after learning how their data is collected.</p>
<h2>‘Digital resignation’</h2>
<p><a href="https://dx.doi.org/10.2139/ssrn.1478214">Public knowledge is lacking</a> when it comes to how data is collected. Research shows that corporations both cultivate feelings of resignation and <a href="https://repository.upenn.edu/cgi/viewcontent.cgi?article=1554&context=asc_papers">exploit this lack of literacy</a> to normalize the practice of maximizing the amount of data collected. </p>
<p>Events like the <a href="https://www.wired.com/story/cambridge-analytica-facebook-privacy-awakening/">Cambridge Analytica</a> scandal and revelations of mass government surveillance by <a href="https://www.reuters.com/article/us-usa-nsa-spying-idUSKBN25T3CK">Edward Snowden</a> shine a light on data collection practices, but they leave people powerless and resigned that their data will be collected and used without their explicit consent. This is called <a href="http://dx.doi.org/10.1177/1461444819833331">“digital resignation”</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A smartphone displaying the facebook logo." src="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512979/original/file-20230301-22-br1873.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 2022, Facebook’s parent company, Meta, agreed to pay $725 million to settle a lawsuit over users’ personal information being shared with Cambridge Analytica.</span>
<span class="attribution"><span class="source">(AP Photo/Michael Dwyer, File)</span></span>
</figcaption>
</figure>
<p>But while there is much discussion surrounding the collection and use of personal data, there is far less discussion about the modus operandi of tech companies. </p>
<p><a href="https://spectrum.library.concordia.ca/id/eprint/990750/">Our research</a> shows that tech companies use a variety of strategies to deflect responsibility for privacy issues, neutralize critics and prevent legislation. These strategies are designed to limit citizens’ abilities to make informed choices. </p>
<p>Policymakers and corporations themselves must acknowledge and correct these strategies. Corporate accountability for privacy issues cannot be achieved by addressing data collection and use alone. </p>
<h2>The pervasiveness of privacy violations</h2>
<p>In their study of harmful industries such as the tobacco and mining sectors, <a href="http://dx.doi.org/10.1086/653091">Peter Benson and Stuart Kirsch</a> identified strategies of denial, deflection and symbolic action used by corporations to deflect criticism and prevent legislation.</p>
<p>Our research shows that these strategies hold true in the tech industry. Facebook has a long history of <a href="https://www.theguardian.com/technology/2019/aug/23/cambridge-analytica-facebook-response-internal-document">denying and deflecting responsibility</a> for privacy issues despite its numerous scandals and criticisms.</p>
<p>Amazon has also been harshly criticized for providing <a href="https://www.theguardian.com/technology/2022/jul/13/amazon-ring-doorbell-videos-police-11-times-without-permission">Ring security camera footage to law enforcement officials without a warrant or customer consent</a>, sparking <a href="https://www.eff.org/deeplinks/2021/02/lapd-requested-ring-footage-black-lives-matter-protests">civil rights concerns</a>. The company has also created <a href="https://www.theverge.com/2022/9/20/23362010/ring-nation-mgm-amazon-mark-burnett-barry-poznick-civil-rights-cancel">a reality show using Ring security camera footage</a>. </p>
<p>Canadian and U.S. federal government employees have <a href="https://www.wsj.com/articles/canada-follows-u-s-europe-with-tiktok-ban-on-government-devices-2273b07f">recently been banned from downloading TikTok</a> onto their devices due to an “unacceptable” risk to privacy. TikTok has launched <a href="https://www.theverge.com/2023/2/2/23583491/tiktok-transparency-center-tour-photos-bytedance">an elaborate spectacle of symbolic action</a> with the opening of its <a href="https://www.youtube.com/watch?v=PxfIGVQTfWQ">Transparency and Accountability Center</a>. This cycle of denial, deflection and symbolic action normalizes privacy violations and fosters cynicism, resignation and disengagement.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A black and silver ring doorbell on a door frame." src="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512973/original/file-20230301-424-zveqs2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Amazon has faced criticism for creating a new reality show based on footage captured by Ring doorbells.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>How to stop digital resignation</h2>
<p>Technology permeates every aspect of our daily lives. But informed consent is impossible when the average person is neither motivated nor <a href="https://ndg.asc.upenn.edu/wp-content/uploads/2018/09/Persistent-Misperceptions.pdf">knowledgeable enough</a> to read terms and conditions policies designed to confuse.</p>
<p>The <a href="https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age_en">European Union</a> has recently enacted laws that recognize these harmful market dynamics, and has started holding platforms and tech companies <a href="https://www.cnn.com/2022/11/30/tech/twitter-eu-compliance-warning/index.html">accountable</a>. </p>
<p>Québec has recently revised its privacy laws with <a href="https://www.quebec.ca/gouvernement/ministeres-et-organismes/institutions-democratique-acces-information-laicite/acces-documents-protection-renseignements-personnels/pl64-modernisation-de-la-protection-des-renseignements-personnels">Law 25</a>. The law is designed to provide citizens with increased protection and control over their personal information. It gives people the ability to request their personal information and move it to another system, to rectify or delete it (<a href="https://gdpr.eu/right-to-be-forgotten/">the right to be forgotten</a>) as well as the right to be informed when being subjected to automated decision making. </p>
<p>It also requires organizations to appoint a privacy officer and committee, and conduct privacy impact assessments for every project where personal information is involved. Terms and policies must also be communicated clearly and transparently and consent must be explicitly obtained.</p>
<p>At the federal level, the government has tabled <a href="https://ised-isde.canada.ca/site/innovation-better-canada/en/canadas-digital-charter/bill-summary-digital-charter-implementation-act-2020">Bill C-27, the <em>Digital Charter Implementation Act</em></a>, which is currently under review by the House of Commons. It bears many resemblances to Québec’s Law 25 and also includes additional measures to regulate technologies such as artificial intelligence systems.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A laptop showing a terms and conditions document." src="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512971/original/file-20230301-20-41o1s8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Online terms and conditions are often too long and difficult for consumers to understand.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Our findings highlight the urgent need for greater privacy literacy and stronger regulations that not only define what is permitted, but also monitor firms and hold them accountable when they breach consumer privacy. This would ensure informed consent to data collection and disincentivize violations. We recommend the following: </p>
<p>1) Tech companies must explicitly specify what personal data will be collected and used. Only essential data should be collected and customers should be able to opt out of non-essential data collection. This is similar to the <a href="https://gdpr.eu/cookies/">EU’s General Data Protection Regulation</a> to obtain user consent before using non-essential cookies or <a href="https://support.apple.com/en-ca/HT212025">Apple’s App Tracking Transparency</a> feature which allows users to block apps from tracking them.</p>
<p>2) Privacy regulations must also recognize and address the rampant use of <a href="https://www.vox.com/recode/22351108/dark-patterns-ui-web-design-privacy">dark patterns</a> to influence people’s behaviour, such as coercing them into providing consent. This can include the use of design elements, language or features such as making it difficult to decline non-essential cookies or making the button to provide more personal data more prominent than the opt-out button.</p>
<p>3) Privacy oversight bodies such as the <a href="https://www.priv.gc.ca/en">Office of the Privacy Commissioner of Canada</a> <a href="https://www.cbc.ca/news/canada/nova-scotia/houston-privacy-commissioner-promise-may-be-softening-1.6624079">must be fully independent</a> and authorized to investigate and <a href="https://financialpost.com/news/privacy-watchdogs-lament-lack-powers-tim-hortons-probe">enforce privacy regulations</a>.</p>
<p>4) While privacy laws like Québec’s require organizations to appoint a privacy officer, the role must also be fully independent and given the power to enforce compliance with privacy laws if it is to be effective in improving accountability.</p>
<p>5) Policymakers must be more proactive in updating legislation to account for the rapid advances of digital technology. </p>
<p>6) Finally, penalties for non-compliance often pale in comparison to the profits gained from, and the social harms caused by, the misuse of data. For example, the U.S. Federal Trade Commission (FTC) imposed <a href="https://www.ftc.gov/news-events/news/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions-facebook">a $5 billion penalty on Facebook</a> (5.8 per cent of its <a href="https://investor.fb.com/investor-news/press-release-details/2021/Facebook-Reports-Fourth-Quarter-and-Full-Year-2020-Results/default.aspx">2020 annual revenue</a>) for its role in the <a href="https://www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram">Cambridge Analytica scandal</a>.</p>
<p>While this fine is the highest ever given by the FTC, it is not representative of the social and political impacts of the scandal and its influence in <a href="https://www.npr.org/2018/03/20/595338116/what-did-cambridge-analytica-do-during-the-2016-election">key political events</a>. In some cases, it may be more profitable for a company to strategically pay a fine for non-compliance. </p>
<p>To make tech giants more responsible with their users’ data, the cost of breaching data privacy must outweigh the potential profits of exploiting consumer data.</p><img src="https://counter.theconversation.com/content/198979/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Many people have become resigned to the fact that tech companies collect our private data. But policymakers must do more to limit the amount of personal information corporations can collect.Meiling Fong, PhD Student, Individualized Program, Concordia UniversityZeynep Arsel, Concordia University Chair in Consumption, Markets, and Society, Concordia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1992282023-02-21T13:25:09Z2023-02-21T13:25:09ZFlorida will no longer ask high school athletes about their menstrual cycles, but many states still do – here are 3 reasons why that’s problematic<figure><img src="https://images.theconversation.com/files/510662/original/file-20230216-20-2sy4zs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">If female athletes have to answer menstruation-related questions in order to play team sports, that could be a form of sex-based discrimination. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/arvada-and-columbine-face-off-at-jeffco-stadium-in-lakewood-news-photo/1431460237">AAron Ontiveroz/MediaNews Group/The Denver Post via Getty Images</a></span></figcaption></figure><p><a href="https://www.forbes.com/sites/brucelee/2022/10/05/florida-high-schools-are-asking-female-athletes-5-questions-about-their-menstrual-periods/">Concerns are being raised</a> across the U.S. about whether schools have a right to compel female athletes to provide information about their menstrual cycles.</p>
<p>The <a href="https://fhsaa.com/index.aspx">Florida High School Athletic Association</a> Board of Directors <a href="https://www.palmbeachpost.com/story/news/education/2023/02/07/florida-legislators-call-on-fhsaa-to-scrap-menstrual-history-questions/69882335007/">rejected a proposal</a> in February 2023 that would have required high school girls to answer <a href="https://fhsaa.com/documents/2023/1/19//SMAC_PPE_Draft_1_17_2023.pdf?id=3887">four questions about their menstrual cycles</a> in order to play on school sports teams. The questions had previously been optional.</p>
<p>The four questions were: Have you had a menstrual cycle? How old were you when you had your first menstrual period? When was your most recent menstrual period? How many periods have you had in the past 12 months? </p>
<p>The answers, along with the rest of students’ medical history, would have been entered into an online platform and stored on a third-party database called <a href="https://www.aktivate.com/">Aktivate</a>. <a href="https://www.nytimes.com/2023/02/09/us/florida-student-athlete-periods.html">School personnel</a> would have had access to this information.</p>
<p>While Florida decided to scrap the questions from their student forms, many states currently ask similar questions of their female athletes prior to participation in their sport.</p>
<p>As researchers who are experts in <a href="https://scholar.google.com/citations?user=dYfhb9sAAAAJ&hl=en">Title IX</a>, sports and health care equity, and <a href="https://law.umn.edu/profiles/david-schultz">constitutional law</a>, we have identified three reasons why schools and states tracking female athletes’ menstrual history may conflict with federal laws.</p>
<p><iframe id="M8CbI" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/M8CbI/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>1. It may violate federal anti-discrimination law</h2>
<p><a href="https://www2.ed.gov/about/offices/list/ocr/docs/tix_dis.html">Title IX</a>, a federal law passed in 1972, prohibits federally funded schools from discriminating against students based on sex, sexual orientation or gender identity. The goal of the law is to <a href="https://www.ed.gov/news/press-releases/us-department-education-releases-proposed-changes-title-ix-regulations-invites-public-comment">end sex discrimination, sex-based harassment and sexual violence</a> in education.</p>
<p>While Title IX applies to all school settings, it is <a href="https://doi.org/10.1123/wspaj.2022-0053">often most associated with athletics</a>. </p>
<p>Requiring female student-athletes to submit menstrual cycle data to their schools could be a form of <a href="https://www2.ed.gov/policy/rights/guid/ocr/sexoverview.html">sex discrimination</a> and therefore violate <a href="https://www.nfhs.org/articles/nine-ways-title-ix-protects-high-school-students/">Title IX</a>. It is potentially discriminatory because girls are the only students at risk of being denied the opportunity to play sports if they choose not to provide schools with details about their menstrual cycles.</p>
<p>In a <a href="https://scholarworks.law.ubalt.edu/cgi/viewcontent.cgi?article=2114&context=all_fac">2020 Harvard Journal of Law and Gender study</a>, three scholars argue that schools should create educational settings free of “unnecessary anxiety about the biological process of menstruation.”</p>
<p>“Because menstruation is a biological process linked to female sex,” they write, “educational deprivations connected with schools’ treatment of menstruation should be understood as a violation of Title IX’s core proposition.” </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Screenshot of a medical form with questions about menstrual history" src="https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=383&fit=crop&dpr=1 600w, https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=383&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=383&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=481&fit=crop&dpr=1 754w, https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=481&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/510680/original/file-20230216-18-n8pjoe.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=481&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Questions about students’ menstrual history were removed from the Florida High School Athletic Association’s physical evaluation form.</span>
<span class="attribution"><a class="source" href="https://fhsaa.com/documents/2023/1/19//SMAC_PPE_Draft_1_17_2023.pdf?id=3887">Florida High School Athletic Association</a></span>
</figcaption>
</figure>
<h2>2. It threatens constitutional rights</h2>
<p>Tracking female athletes’ menstrual history may be downright unconstitutional. </p>
<p>Forcing only females to disclose private medical information may violate the <a href="https://constitutioncenter.org/the-constitution/amendments/amendment-xiv/clauses/702">equal protection clause</a> of the <a href="https://constitution.congress.gov/browse/essay/amdt14-S1-8-8-1/ALDE_00000830/">14th Amendment</a> of the U.S. Constitution, which prohibits sex-based discrimination.</p>
<p>Also, <a href="https://www.npwomenshealthcare.com/privacy-rights-in-state-constitutions-may-protect-their-abortion-access/">11 states</a> have a “right to privacy” written into their state constitutions. For example, the <a href="https://www.flsenate.gov/laws/constitution#A1S23">Florida Constitution</a> states that “all natural persons, female and male alike, are equal before the law and have inalienable rights,” including “the right to be let alone and free from governmental intrusion into the person’s private life.”</p>
<p>While other states do not explicitly provide a right to privacy in their constitutions, legal precedent has determined that this right is <a href="https://www.dataguidance.com/jurisdiction/arkansas">implicit in the U.S. Constitution</a>.</p>
<p>And finally, federal laws that protect <a href="https://www.cdc.gov/phlp/publications/topic/hipaa.html">medical</a> and <a href="https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html">educational</a> records do not have <a href="https://rems.ed.gov/docs/2019%20HIPAA%20FERPA%20Joint%20Guidance.pdf">standards</a> for maintaining medical records that are shared with schools and stored on third-party databases. This lack of standards may result in privacy breaches.</p>
<h2>3. It could be used against transgender students</h2>
<p>The recent passage of several anti-LGBTQ+ policies in Florida made the Florida High School Athletic Association’s attempts to track and digitally store menstrual data particularly worrisome to trans rights advocates.</p>
<p>In June 2021, Gov. Ron DeSantis <a href="https://www.flsenate.gov/Session/Bill/2021/1028">signed a bill</a> <a href="https://www.npr.org/2021/06/02/1002405412/on-the-first-day-of-pride-month-florida-signed-a-transgender-athlete-bill-into-l">prohibiting trans girls from playing on girls athletic teams</a>. </p>
<p>In March 2022, DeSantis signed the Parental Rights in Education bill, better known as the <a href="https://www.npr.org/2022/03/28/1089221657/dont-say-gay-florida-desantis">“Don’t Say Gay” bill</a>. It prohibits classroom instruction on sexual orientation and gender identity in K-3 public school classrooms. </p>
<p>And just one week after the proposed mandate was struck down, a <a href="https://www.tampabay.com/news/florida-politics/2023/02/16/fhsaa-desantis-board-private-homeschool-prayer-announcements-menstrual/">Florida House committee advanced a bill</a> that would place the Governor’s office in control of the Florida High School Athletic Association.</p>
<p>As more states try to <a href="https://apnews.com/article/ron-desantis-health-business-florida-government-and-politics-78e417a184718de8b9e71ff32efbc77f">ban trans youth from receiving gender-affirming medical care</a> – including hormone therapy, surgical procedures and other treatments – menstrual tracking in athletes could serve as another mechanism to harm and criminalize transgender youth. </p>
<p>Tracking menstrual cycles could “out” trans youth if they are required to disclose information about their menstrual cycle – whether that is the presence or absence of a cycle. A school that is responsible for outing trans kids violates both <a href="https://www.aclu.org/news/lgbtq-rights/trans-students-should-be-treated-with-dignity-not-outed-by-their-schools">constitutional rights</a> and <a href="https://www.knowyourix.org/college-resources/title-ix-protections-lgbtq-students/">Title IX policy</a>, and risks endangering the outed students’ welfare. </p>
<h2>Protecting period privacy</h2>
<p>While the proposed Florida mandate was rejected, we have found that most states do in fact collect data on high school athletes’ menstrual cycles. </p>
<p>Based on our collection of sports pre-participation forms, only four states – Mississippi, New Hampshire, New York and Oklahoma – as well as Washington, D.C., do not currently ask any questions about menstrual history on the sports pre-participation medical forms provided by their state athletic association. </p>
<p>Following the vote on the Florida proposal, <a href="https://www.news-press.com/story/news/education/2023/02/09/congress-introduces-menstrual-questions-legislation-aimed-at-florida/69889797007/">three House Democrats introduced legislation</a> called the Privacy in Education Regarding Individuals’ Own Data Act, or <a href="https://www.washingtonexaminer.com/news/house/schiff-omar-bill-menstruation-desantis">PERIOD Act</a>. It would prohibit schools from collecting menstrual information altogether. </p>
<p>If this legislation is adopted, the estimated <a href="https://www.nfhs.org/media/5989280/2021-22_participation_survey.pdf">3 million American high school girls</a> who play sports in a state that still asks about menstrual history will no longer have to share this information.</p><img src="https://counter.theconversation.com/content/199228/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>When schools ask student-athletes about their menstrual cycles, they may be infringing on anti-discrimination and privacy laws.Lindsey Darvin, Assistant Professor of Sport Management, Syracuse UniversityDavid Schultz, Professor of Political Science, Hamline University Tia Spagnuolo, Doctoral Student in Community Research and Action, Binghamton University, State University of New YorkLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1992832023-02-08T01:06:45Z2023-02-08T01:06:45ZChatGPT is a data privacy nightmare. If you’ve ever posted online, you ought to be concerned<figure><img src="https://images.theconversation.com/files/508567/original/file-20230207-13-uu7jfn.jpeg?ixlib=rb-1.1.0&rect=35%2C0%2C5955%2C3988&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>ChatGPT has taken the world by storm. Within two months of its release it reached 100 million <a href="https://news.yahoo.com/chatgpt-100-million-users-january-130619073.html">active users</a>, making it the fastest-growing consumer <a href="https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/">application ever launched</a>. Users are attracted to the tool’s <a href="https://oneusefulthing.substack.com/p/chatgtp-is-my-co-founder">advanced capabilities</a> – and concerned by its potential to cause disruption in <a href="https://theconversation.com/chatgpt-students-could-use-ai-to-cheat-but-its-a-chance-to-rethink-assessment-altogether-198019">various sectors</a>. </p>
<p>A much less discussed implication is the privacy risks ChatGPT poses to each and every one of us. Just yesterday, <a href="https://blog.google/technology/ai/bard-google-ai-search-updates/">Google unveiled</a> its own conversational AI called Bard, and others will surely follow. Technology companies working on AI have well and truly entered an arms race. </p>
<p>The problem is it’s fuelled by our personal data.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/everyones-having-a-field-day-with-chatgpt-but-nobody-knows-how-it-actually-works-196378">Everyone's having a field day with ChatGPT – but nobody knows how it actually works</a>
</strong>
</em>
</p>
<hr>
<h2>300 billion words. How many are yours?</h2>
<p>ChatGPT is underpinned by a large language model that requires massive amounts of data to function and improve. The more data the model is trained on, the better it gets at detecting patterns, anticipating what will come next and generating plausible text. </p>
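<p>The pattern-detection idea can be illustrated with a deliberately tiny sketch – a toy bigram model, nothing like the scale or architecture of ChatGPT’s actual system. It simply counts which word follows which in its training text, then predicts the most frequent successor; scale the same statistical idea up to billions of words and you get plausible generated text:</p>

```python
from collections import Counter, defaultdict

# Toy illustration only (not OpenAI's method): a bigram model
# counts which word follows which in a tiny training corpus.
corpus = "the cat sat on the mat the cat ate".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict(word: str) -> str:
    """Return the most frequent word observed after `word`."""
    return successors[word].most_common(1)[0][0]

# "cat" follows "the" twice in the corpus, "mat" only once,
# so the model anticipates "cat" as the likeliest next word.
assert predict("the") == "cat"
```

<p>The more text such a model consumes, the richer its statistics become – which is precisely why training data is so valuable, and why its provenance matters.</p>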
<p>OpenAI, the company behind ChatGPT, fed the tool some <a href="https://www.sciencefocus.com/future-technology/gpt-3/">300 billion words</a> systematically scraped from the internet: books, articles, websites and posts – including personal information obtained without consent.</p>
<p>If you’ve ever written a blog post or product review, or commented on an article online, there’s a good chance this information was consumed by ChatGPT.</p>
<h2>So why is that an issue?</h2>
<p>The data collection used to train ChatGPT is problematic for several reasons.</p>
<p>First, none of us were asked whether OpenAI could use our data. This is a clear violation of privacy, especially when data are sensitive and can be used to identify us, our family members, or our location. </p>
<p>Even when data are publicly available their use can breach what we call <a href="https://digitalcommons.law.uw.edu/wlr/vol79/iss1/10/">contextual integrity</a>. This is a fundamental principle in legal discussions of privacy. It requires that individuals’ information is not revealed outside of the context in which it was originally produced.</p>
<p>Also, OpenAI offers no procedures for individuals to check whether the company stores their personal information, or to request it be deleted. This is a guaranteed right in accordance with the European General Data Protection Regulation (<a href="https://gdpr-info.eu/art-17-gdpr/">GDPR</a>) – although it’s still under debate whether ChatGPT is compliant <a href="https://blog.avast.com/chatgpt-data-use-legal">with GDPR requirements</a>.</p>
<p>This “right to be forgotten” is particularly important in cases where the information is inaccurate or misleading, which seems to be a <a href="https://www.fastcompany.com/90833017/openai-chatgpt-accuracy-gpt-4">regular occurrence</a> with ChatGPT. </p>
<p>Moreover, the scraped data ChatGPT was trained on can be proprietary or copyrighted. For instance, when I prompted it, the tool produced the first few passages from Joseph Heller’s book Catch-22 – a copyrighted text.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=263&fit=crop&dpr=1 600w, https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=263&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=263&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=330&fit=crop&dpr=1 754w, https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=330&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/509380/original/file-20230210-22-ylxsyx.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=330&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">ChatGPT doesn’t necessarily consider copyright protection when generating outputs.</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Finally, OpenAI did not pay for the data it scraped from the internet. The individuals, website owners and companies that produced it were not compensated. This is particularly noteworthy considering OpenAI was recently <a href="https://www.nasdaq.com/articles/microsofts-%2410-billion-investment-in-openai%3A-how-it-could-impact-the-ai-industry-and-stock">valued at US$29 billion</a>, more than double its <a href="https://www.forbes.com/sites/nicholasreimann/2023/01/05/chatgpt-creator-openai-discussing-offer-valuing-company-at-29-billion-report-says/?sh=f2ca73b11e04">value in 2021</a>. </p>
<p>OpenAI has also just <a href="https://openai.com/blog/chatgpt-plus/">announced ChatGPT Plus</a>, a paid subscription plan that will offer customers ongoing access to the tool, faster response times and priority access to new features. This plan will contribute to expected <a href="https://www.reuters.com/business/chatgpt-owner-openai-projects-1-billion-revenue-by-2024-sources-2022-12-15/">revenue of $1 billion by 2024</a>. </p>
<p>None of this would have been possible without data – our data – collected and used without our permission. </p>
<h2>A flimsy privacy policy</h2>
<p>Another privacy risk involves the data provided to ChatGPT in the form of user prompts. When we ask the tool to answer questions or perform tasks, we may inadvertently hand over <a href="https://www.forbes.com/sites/lanceeliot/2023/01/27/generative-ai-chatgpt-can-disturbingly-gobble-up-your-private-and-confidential-data-forewarns-ai-ethics-and-ai-law/?sh=5d7dd7ce7fdb">sensitive information</a> and put it in the public domain. </p>
<p>For instance, an attorney may prompt the tool to review a draft divorce agreement, or a programmer may ask it to check a piece of code. The agreement and code, in addition to the outputted essays, are now part of ChatGPT’s database. This means they can be used to further train the tool, and be included in responses to other people’s prompts.</p>
<p>Beyond this, OpenAI gathers a broad scope of other user information. According to the company’s <a href="https://openai.com/privacy/">privacy policy</a>, it collects users’ IP address, browser type and settings, and data on users’ interactions with the site – including the type of content users engage with, features they use and actions they take. </p>
<p>It also collects information about users’ browsing activities over time and across websites. Alarmingly, OpenAI states it may <a href="https://openai.com/privacy/">share users’ personal information</a> with unspecified third parties, without informing them, to meet their business objectives.</p>
<h2>Time to rein it in?</h2>
<p>Some experts believe ChatGPT is <a href="https://hbr.org/2022/12/chatgpt-is-a-tipping-point-for-ai">a tipping point for AI</a> – a realisation of technological development that can revolutionise the way we work, learn, write and even think. Its potential benefits notwithstanding, we must remember OpenAI is a private, for-profit company whose interests and commercial imperatives do not necessarily align with greater societal needs. </p>
<p>The privacy risks that come attached to ChatGPT should sound a warning. And as consumers of a growing number of AI technologies, we should be extremely careful about what information we share with such tools. </p>
<p><em>The Conversation reached out to OpenAI for comment, but they didn’t respond by deadline.</em></p>
<hr>
<p><em>Correction: in regards to ChatGPT’s potential to generate copyrighted texts, this article previously referenced Peter Carey’s novel True History of the Kelly Gang, with a ChatGPT screenshot that was not an actual excerpt from the book. This has been changed to an accurate example referencing Joseph Heller’s book Catch-22.</em></p><img src="https://counter.theconversation.com/content/199283/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Uri Gal does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>ChatGPT is fuelled by our intimate online histories. It’s trained on 300 billion words, yet users have no way of knowing which of their data it contains.Uri Gal, Professor in Business Information Systems, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1907582022-09-20T20:19:45Z2022-09-20T20:19:45ZThis law makes it illegal for companies to collect third-party data to profile you. But they do anyway<figure><img src="https://images.theconversation.com/files/485463/original/file-20220920-875-n1syu1.jpeg?ixlib=rb-1.1.0&rect=57%2C24%2C5406%2C3612&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Unsplash</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>A little-known provision of the Privacy Act makes it illegal for many companies in Australia to buy or exchange consumers’ personal data for profiling or targeting purposes. It’s almost never enforced. In a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4224653">research paper</a> published today, I argue that needs to change. </p>
<p>“Data enrichment” is the intrusive practice of companies going behind our backs to “fill in the gaps” of the information we provide. </p>
<p>When you purchase a product or service from a company, fill out an online form, or sign up for a newsletter, you might provide only the necessary data such as your name, email, delivery address and/or payment information.</p>
<p>That company may then turn to other retailers or <a href="https://www.oracle.com/au/cx/advertising/data-enrichment-measurement/#data-enrichment">data brokers</a> to purchase or exchange extra data about you. This could include your age, family, health, habits and more. </p>
<p>This allows them to build a more detailed individual profile on you, which helps them predict your behaviour and more precisely target you with ads. </p>
<p>For almost ten years, there has been a law in Australia that makes this kind of data enrichment illegal if a company can “reasonably and practicably” request that information directly from the consumer. And at least <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/consultation/view_respondent?_b_index=60&uuId=926016195">one major data broker</a> has asked the government to “remove” this law. </p>
<p>The burning question is: why is there not a single published case of this law being enforced against companies “enriching” customer data for profiling and targeting purposes? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<h2>Data collection ‘only from the individual’</h2>
<p>The relevant law is Australian Privacy Principle 3.6 and is part of the federal <a href="https://www.legislation.gov.au/Details/C2022C00199">Privacy Act</a>. It applies to most organisations that operate businesses with annual revenues higher than A$3 million, as well as to smaller businesses that trade in data. </p>
<p>The law says such organisations:</p>
<blockquote>
<p>must collect personal information about an individual only from the individual […] unless it is unreasonable or impracticable to do so.</p>
</blockquote>
<p>This “direct collection rule” protects individuals’ privacy by allowing them some control over information collected about them, and avoiding a combination of data sources that could reveal sensitive information about their vulnerabilities. </p>
<p>But this rule has received almost no attention. There’s only one published determination of the federal privacy regulator on it, and that was against the <a href="https://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/AICmr/2020/69.html">Australian Defence Force</a> in a different context.</p>
<p>According to Australian Privacy Principle 3.6, it’s only legal for an organisation to collect personal information from a third party if it would be “unreasonable or impracticable” to collect that information from the individual alone. </p>
<p>This exception was intended to apply to <a href="https://www.oaic.gov.au/privacy/australian-privacy-principles-guidelines/chapter-3-app-3-collection-of-solicited-personal-information#collecting-directly-from-the-individual">limited situations</a>, such as when:</p>
<ul>
<li>the individual is being investigated for some wrongdoing</li>
<li>the individual’s address needs to be updated for delivery of legal or official documents. </li>
</ul>
<p>The exception shouldn’t apply simply because a company wants to collect extra information for profiling and targeting, but realises the customer would probably refuse to provide it.</p>
<h2>Who’s bypassing customers for third-party data?</h2>
<p>Aside from data brokers, companies also exchange information with each other about their respective customers to get extra information on customers’ lives. This is often referred to as “data matching” or “data partnerships”.</p>
<p>Companies tend to be very vague about who they share information with, and who they get information from. So we don’t know for certain who’s buying data-enrichment services from data brokers, or “matching” customer data. </p>
<p>Major companies such as <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=202075050&ref_=footer_iba">Amazon Australia</a>, <a href="https://www.ebay.com.au/help/policies/member-behaviour-policies/user-privacy-notice-privacy-policy?id=4260&mkevt=1&mkcid=1&mkrid=705-53470-19255-0&campid=5337590774&customid=&toolid=10001#section4">eBay Australia</a>, <a href="https://www.facebook.com/privacy/policy/?subpage=1.subpage.4-InformationFromPartnersVendors">Meta</a> (Facebook), <a href="https://www.viacomcbsprivacy.com/en/policy">10Play Viacom</a> and <a href="https://twitter.com/en/privacy#twitter-privacy-1">Twitter</a> include terms in the fine print of their privacy policies that state they collect personal information from third parties, including demographic details and/or interests.</p>
<p><a href="https://policies.google.com/privacy?hl=en-US#infocollect">Google</a>, <a href="https://preferences.news.com.au/privacy">News Corp</a>, <a href="https://www.sevenwestmedia.com.au/privacy-policies/privacy">Seven</a>, <a href="https://login.nine.com.au/privacy?client_id=smh">Nine</a> and others also say they collect personal information from third parties, but are more vague about the nature of that information.</p>
<p>These privacy policies don’t explain why it would be unreasonable or impracticable to collect that information directly from customers. </p>
<h2>Consumer ‘consent’ is not an exception</h2>
<p>Some companies may try to justify going behind customers’ backs to collect data because there’s an obscure term in their privacy policy that mentions they collect personal information from third parties. Or because the company <em>disclosing</em> the data has a privacy policy term about sharing data with “trusted data partners”.</p>
<p>But even if this amounts to consumer “consent” under the relatively weak standards for consent in our current privacy law, this is not an exception to the direct collection rule. </p>
<p>The law allows a “consent” exception for government agencies under a separate part of the direct collection rule, but <em>not</em> for private organisations. </p>
<h2>Data enrichment involves personal information</h2>
<p>Many companies with third-party data collection terms in their privacy policies acknowledge this is personal information. But some may argue the collected data isn’t “personal information” under the Privacy Act, so the direct collection rule doesn’t apply.</p>
<p>Companies often exchange information about an individual without using the individual’s legal name or email. Instead they may use a unique advertising identifier for that individual, or <a href="https://help.abc.net.au/hc/en-us/articles/4402890310671">“hash” the email address</a> to turn it into a unique string of numbers and letters. </p>
<p>They essentially allocate a “code name” to the consumer. So the companies can exchange information that can be linked to the individual, yet say this information wasn’t connected to their actual name or email. </p>
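<p>The matching works because a cryptographic hash is deterministic: the same normalised email address always produces the same string, so two companies hashing their customer lists independently can still line up records for the same person. A minimal sketch – the function name and normalisation steps here are illustrative assumptions, not any particular broker’s implementation:</p>

```python
import hashlib

def hash_email(email: str) -> str:
    # Normalise first, so "Jane@Example.com " and "jane@example.com"
    # yield the same identifier (hypothetical normalisation choices).
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Two companies hashing the same customer's email independently
# arrive at the identical "code name" -- which is why hashed
# records remain matchable, and arguably personal information.
company_a = hash_email("Jane.Citizen@example.com")
company_b = hash_email(" jane.citizen@example.com")
assert company_a == company_b
```

<p>No legal name or plain email changes hands, yet the shared identifier links the exchanged data straight back to one individual.</p>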
<p>However, this information should still be treated as personal information because it can be linked back to the individual when combined with other <a href="https://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/FCAFC/2017/4.html">information about them</a>. </p>
<h2>At least one major data broker is against it</h2>
<p>Data broker <a href="https://www.experian.com.au/business/solutions/audience-targeting/digital-solutions-sell-side/digital-audiences-ss">Experian Australia</a> has asked the government to “remove” Australian Privacy Principle 3.6 “altogether”. In its <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/consultation/view_respondent?_b_index=60&uuId=926016195">submission</a> to the Privacy Act Review in January, Experian argued:</p>
<blockquote>
<p>It is outdated and does not fit well with modern data uses.</p>
</blockquote>
<p>Others who profit from data enrichment or data matching would probably agree, but prefer to let sleeping dogs lie.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A screenshot shows six different categories of consumer data offered by Experian." src="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=369&fit=crop&dpr=1 600w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=369&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=369&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=463&fit=crop&dpr=1 754w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=463&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=463&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">On its website, Experian claims to offer a ‘combination of demographic, geographic, financial and market research data - both online and offline’.</span>
<span class="attribution"><span class="source">Screenshot/Experian</span></span>
</figcaption>
</figure>
<p>Experian argued the law favours large companies with direct access to lots of customers and opportunities to pool data collected from across their own corporate group. It said companies with access to fewer consumers and less data would be disadvantaged if they can’t purchase data from brokers. </p>
<p>But the fact that some digital platforms impose extensive personal data collection on customers supports the case for stronger privacy laws. It doesn’t mean there should be a data free-for-all. </p>
<h2>Our privacy regulator should take action</h2>
<p>It has been three years since the consumer watchdog recommended <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">major reforms</a> to our privacy laws to reduce the disadvantages consumers suffer from invasive data practices. These reforms are probably still years away, if they eventuate at all.</p>
<p>The direct collection rule is a very rare thing. It is an existing Australian privacy law that favours consumers. The privacy regulator should prioritise the enforcement of this law for the benefit of consumers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/amazon-just-took-over-a-primary-healthcare-company-for-a-lot-of-money-should-we-be-worried-187627">Amazon just took over a primary healthcare company for a lot of money. Should we be worried?</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/190758/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.</span></em></p>The terms of the Australian Privacy Principle 3.6 are quite clear. So why is there not a single published case of this law being enforced?Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1882792022-08-23T18:27:55Z2022-08-23T18:27:55ZA new US data privacy bill aims to give you more control over information collected about you – and make businesses change how they handle data<figure><img src="https://images.theconversation.com/files/480484/original/file-20220822-88277-9t0pw.jpg?ixlib=rb-1.1.0&rect=53%2C0%2C6000%2C3736&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The U.S. could soon catch up to the European Union in protecting people's data privacy.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/cybersecurity-data-security-and-data-access-must-be-royalty-free-image/1366362135">Teera Konakan/Moment via Getty Images</a></span></figcaption></figure><p>Data privacy in the U.S. is, in many ways, a legal void. While there are limited protections for health and financial data, the cradle of the world’s largest tech companies, like Apple, Amazon, Google, and Meta (Facebook), <a href="https://www.dli.tech.cornell.edu/post/us-data-privacy-law-federal-and-state-legislation-impact-and-risk-mitigation">lacks any comprehensive federal data privacy law</a>. This leaves U.S. 
citizens with minimal <a href="https://www.nytimes.com/2019/06/08/opinion/sunday/privacy-congress-facebook-google.html">data privacy</a> protections <a href="https://scholarship.law.edu/cgi/viewcontent.cgi?article=1061&context=jlt">compared with citizens of other nations</a>. But that may be about to change. </p>
<p>With rare <a href="https://www.jdsupra.com/legalnews/bipartisan-u-s-federal-privacy-bill-9169312/">bipartisan support</a>, the <a href="https://www.congress.gov/bill/117th-congress/house-bill/8152/actions">American Data and Privacy Protection Act</a> moved out of the U.S. House of Representatives Committee on Energy and Commerce <a href="https://www.natlawreview.com/article/house-committee-passes-comprehensive-federal-privacy-legislation">by a vote of 53-2</a> on July 20, 2022. The bill still needs to pass the full House and the Senate, and <a href="https://subscriber.politicopro.com/article/2022/06/lawmakers-reach-bipartisan-compromise-on-privacy-bill-with-preemption-right-to-sue-00036563">negotiations are ongoing</a>. Given the Biden administration’s <a href="https://www.csoonline.com/article/3664175/u-s-data-privacy-and-security-solutions-emerging-at-the-federal-level.html">responsible data practices strategy</a>, White House support is likely if a version of the bill passes.</p>
<p>As a legal scholar and attorney who <a href="https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=2643050">studies and practices technology and data privacy law</a>, I’ve been closely following the act, known as ADPPA. If passed, it will fundamentally alter U.S. data privacy law. </p>
<p>ADPPA fills the data privacy void, builds in federal preemption over some state data privacy laws, allows individuals to file suit over violations and substantially changes data privacy law enforcement. Like all big changes, ADPPA is getting mixed reviews from <a href="https://www.wired.com/story/american-data-privacy-protection-act-adppa/">media</a>, <a href="https://truthonthemarket.com/2022/06/22/adppa-mimics-gdprs-flaws-and-goes-further-still/">scholars</a> and <a href="https://www.cnbc.com/2022/06/09/bipartisan-privacy-proposal-is-unworkable-chamber-of-commerce-says.html">businesses</a>. But many see the bill as a triumph for U.S. data privacy that provides a needed national standard for data practices.</p>
<h2>Who and what will ADPPA regulate?</h2>
<p>ADPPA would apply to “covered” entities, meaning any entity collecting, processing or transferring covered data, including nonprofits and sole proprietors. It also regulates cellphone and internet providers and other <a href="https://www.law.cornell.edu/uscode/text/47/153">common carriers</a>, with <a href="https://iapp.org/news/a/advocates-concerned-with-telecom-data-oversight-in-proposed-adppa/">potentially concerning changes to federal communications regulation</a>. It does not apply to government entities.</p>
<p>ADPPA defines “covered” data as any information that identifies, or can reasonably be linked to, a person or a device. It also protects biometric data, genetic data and geolocation information.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a city street view with a young woman looking down at her phone in focus while passersby are out of focus" src="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Protected data includes your location.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/woman-with-smartphone-royalty-free-image/657929972">Christoph Hetzmannseder/Moment via Getty Images</a></span>
</figcaption>
</figure>
<p>The bill excludes three big data categories: deidentified data, employee data and publicly available information. That last category includes social media accounts with privacy settings open to public viewing. While <a href="https://georgetownlawtechreview.org/re-identification-of-anonymized-data/GLTR-04-2017/">research</a> has repeatedly shown <a href="https://www.theregister.com/2021/09/16/anonymising_data_feature/">deidentified data can be easily reidentified</a>, the ADPPA attempts to address that by requiring covered entities to take “reasonable technical, administrative, and physical measures to ensure that the information cannot, at any point, be used to re-identify any individual or device.”</p>
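The re-identification risk described above can be illustrated with a toy “linkage attack”, in which a deidentified dataset is joined to a public one on shared quasi-identifiers (ZIP code, birth date, sex). This is a minimal sketch for illustration only — the datasets, field names and records are invented, not drawn from the bill or the cited research:

```python
# Toy "linkage attack": re-attaching identities to "deidentified" records
# by joining on quasi-identifiers. All data here is fabricated.

deidentified_health = [
    {"zip": "02139", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "60614", "dob": "1980-01-02", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "J. Smith", "zip": "02139", "dob": "1945-07-31", "sex": "F"},
]

def reidentify(anon_rows, public_rows):
    """Match rows on (zip, dob, sex) to link names back to 'anonymous' records."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["zip"], a["dob"], a["sex"]) == (p["zip"], p["dob"], p["sex"]):
                matches.append({"name": p["name"], "diagnosis": a["diagnosis"]})
    return matches

print(reidentify(deidentified_health, public_voter_roll))
# The first "deidentified" record is linked to a named individual.
```

Three fields are often enough: this is why the ADPPA's requirement of "reasonable technical, administrative, and physical measures" against re-identification matters in practice.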
<h2>How ADPPA protects your data</h2>
<p>The act would require data collection to be as minimal as possible. The bill allows covered entities to collect, use or share an individual’s data only when reasonably necessary and proportionate to a product or service the person requests or to respond to a communication the person initiates. It allows collection for authentication, security incidents, prevention of illegal activities or serious harm to persons, and compliance with legal obligations.</p>
<p>People would gain rights to access and have some control over their data. ADPPA gives users the right to correct inaccuracies and potentially delete their data held by covered entities.</p>
<p>The bill permits data collection as part of research for public good. It allows data collection for peer-reviewed research or research done in the public interest – for example, testing whether a website is unlawfully discriminating. This is important for researchers who might otherwise run afoul of site terms or hacking laws.</p>
<p>The ADPPA also has a provision that <a href="https://www.wired.com/story/american-data-privacy-protection-act-adppa/">tackles the service-conditioned-on-consent problem</a> – those annoying “I Agree” boxes that force people to accept a jumble of legal terms. When you click one of those boxes, you contractually waive your privacy rights as a condition of simply using a service, visiting a website or buying a product. The bill would prevent covered entities from using contract law to get around its protections.</p>
<h2>Looking to federal electronic surveillance law for guidance</h2>
<p>The U.S.’s <a href="https://www.law.cornell.edu/uscode/text/18/part-I/chapter-119">Electronic Communications Privacy Act</a> can provide federal lawmakers guidance in finalizing ADPPA. Like the ADPPA, the 1986 ECPA legislation involved a massive overhaul of U.S. electronic privacy law to address threats to individual privacy and civil liberties posed by advancing surveillance and communication technologies. Once again, advances in surveillance and data technologies, such as artificial intelligence, are significantly affecting citizens’ rights.</p>
<p>ECPA, still in effect today, provides a baseline national standard for electronic surveillance protections. ECPA protects communications from interception unless one party to the communication consents. But ECPA does not preempt states from passing more protective laws, so states can choose to provide greater privacy rights. The end result: Roughly a quarter of U.S. states require consent of all parties to intercept a communication, thus providing their citizens increased privacy rights.</p>
<p>ECPA’s federal/state balance has worked for decades now, and ECPA has not overwhelmed the courts or destroyed commerce. </p>
<h2>National preemption</h2>
<p>As drafted, ADPPA preempts some state data privacy legislation. This affects <a href="https://oag.ca.gov/privacy/ccpa">California’s Consumer Privacy Act</a>, although it does not preempt the <a href="https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57">Illinois Biometric Information Privacy Act</a> or state laws specifically regulating facial recognition technology. The preemption provisions, however, are in flux as members of the House continue to negotiate the bill.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/S8D7I-FGKOM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The federal bill could end up preempting parts of California’s tougher state data privacy law.</span></figcaption>
</figure>
<p>ADPPA’s national standards provide uniform compliance requirements, serving economic efficiency; but its preemption of most state laws has <a href="https://teachprivacy.com/a-faustian-bargain-is-preemption-too-high-a-price-for-a-federal-privacy-law/">some scholars concerned</a>, and <a href="https://www.natlawreview.com/article/california-privacy-protection-agency-holds-public-meeting-to-formally-oppose-federal">California opposes its passage</a>. </p>
<p>If preemption stands, any final version of the ADPPA will be the law of the land, limiting states from more firmly protecting their citizens’ data privacy.</p>
<h2>Private right of action and enforcement</h2>
<p>ADPPA provides for a <a href="https://crsreports.congress.gov/product/pdf/LSB/LSB10776">private right of action</a>, allowing people to sue covered entities that violate their rights under ADPPA. That gives the bill’s enforcement mechanisms a big boost, although it has significant restrictions.</p>
<p>The <a href="https://www.cnbc.com/2022/06/09/bipartisan-privacy-proposal-is-unworkable-chamber-of-commerce-says.html">U.S. Chamber of Commerce</a> and the tech industry oppose a private right of action, preferring ADPPA enforcement be restricted to the Federal Trade Commission. But the FTC has far less staff and far fewer resources than U.S. trial attorneys do.</p>
<p>ECPA, for comparison, has a private right of action. It has not overwhelmed courts or businesses, and entities likely comply with ECPA to avoid civil litigation. Plus, courts have honed ECPA’s terms, providing clear precedent and understandable compliance guidelines. </p>
<h2>How big are the changes?</h2>
<p>The changes to U.S. data privacy law are big, but ADPPA affords much-needed security and data protections to U.S. citizens, and I believe that it is workable with tweaks. </p>
<p>Given how the internet works, data routinely flows across international borders, so many U.S. companies have already built compliance with other nations’ laws into their systems. This includes the <a href="https://gdpr-info.eu/">E.U.’s General Data Protection Regulation</a> – a law similar to the ADPPA. Facebook, for example, provides E.U. citizens with GDPR’s protections, but it does not give U.S. citizens those protections, because it is not required to do so.</p>
<p>Congress has done little with data privacy, but ADPPA is poised to change that.</p><img src="https://counter.theconversation.com/content/188279/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Anne Toomey McKenna is affiliated faculty with Penn State University's Institute for Computational and Data Sciences, a Visiting Law Professor at University of Richmond's Law School, and she co-chairs IEEE-USA's AI Policy Subcommittee on Privacy, Equity, and Justice in AI. The views expressed herein are the author's own.</span></em></p>Data collection is big business in the US, but a bipartisan data privacy bill rapidly moving through Congress promises to affect the information websites, social media platforms and all other businesses collect.Anne Toomey McKenna, Visiting Professor of Law, University of RichmondLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1855102022-06-27T02:32:16Z2022-06-27T02:32:16ZFacial recognition is on the rise – but the law is lagging a long way behind<figure><img src="https://images.theconversation.com/files/471008/original/file-20220627-14-q7vf1z.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C4481%2C3216&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/iot-machine-learning-human-object-recognition-794528230">Shutterstock</a></span></figcaption></figure><p>Private companies and public authorities are quietly using facial recognition systems around Australia. </p>
<p>Despite the growing use of this controversial technology, there is little in the way of specific regulations and guidelines to govern its use.</p>
<h2>Spying on shoppers</h2>
<p>We were reminded of this fact recently when consumer advocates at CHOICE <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store">revealed</a> that major retailers in Australia are using the technology to identify people claimed to be thieves and troublemakers. </p>
<p>There is no dispute about the goal of reducing harm and theft. But there is also little transparency about how this technology is being used. </p>
<p>CHOICE found that most people have no idea their faces are being scanned and matched to stored images in a database. Nor do they know how these databases are created, how accurate they are, and how secure the data they collect is. </p>
<p>As CHOICE discovered, the notification to customers is inadequate. It comes in the form of small, hard-to-notice signs in some cases. In others, the use of the technology is announced in online notices rarely read by customers. </p>
<p>The companies clearly don’t want to draw attention to their use of the technology or to account for how it is being deployed.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/bunnings-kmart-and-the-good-guys-say-they-use-facial-recognition-for-loss-prevention-an-expert-explains-what-it-might-mean-for-you-185126">Bunnings, Kmart and The Good Guys say they use facial recognition for 'loss prevention'. An expert explains what it might mean for you</a>
</strong>
</em>
</p>
<hr>
<h2>Police are eager</h2>
<p>Something similar is happening with the use of the technology by Australian police. Police in New South Wales, for example, have embarked on a “low-volume” <a href="https://www.theguardian.com/australia-news/2021/jul/01/calls-to-stop-nsw-police-trial-of-national-facial-recognition-system-over-lack-of-legal-safeguards">trial</a> of a nationwide face-recognition database. This trial took place despite the fact that the enabling legislation for the national database has not yet been passed.</p>
<p>In South Australia, controversy over Adelaide’s plans to upgrade its CCTV system with face-recognition capability led the city council to <a href="https://www.abc.net.au/news/2022-06-22/adelaide-city-council-votes-no-to-facial-recognition-in-cctv/101172924?utm_source=pocket_mylist">vote</a> not to purchase the necessary software. The council has also asked South Australia Police not to use face-recognition technology until legislation is in place to govern its use. </p>
<p>However, SA Police have <a href="https://www.abc.net.au/news/2022-06-22/adelaide-city-council-votes-no-to-facial-recognition-in-cctv/101172924?utm_source=pocket_mylist">indicated</a> an interest in using the technology. </p>
<p>In a public <a href="https://www.itnews.com.au/news/sa-police-ignore-adelaide-council-plea-for-facial-recognition-ban-on-cctv-581559">statement</a>, the police described the technology as a potentially useful tool for criminal investigations. The statement also noted: </p>
<blockquote>
<p>There is no legislative restriction on the use of facial recognition technology in South Australia for investigations. </p>
</blockquote>
<h2>A controversial tool</h2>
<p>Adelaide City Council’s call for regulation is a necessary response to the expanding use of automated facial recognition. </p>
<p>This is a powerful technology that promises to fundamentally change our experience of privacy and anonymity. There is already a large gap between the amount of personal information collected about us every day and our own knowledge of how this information is being used, and facial recognition will only make the gap bigger.</p>
<p>Recent events suggest a reluctance on the part of retail outlets and public authorities alike to publicise their use of the technology. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/large-scale-facial-recognition-is-incompatible-with-a-free-society-126282">Large-scale facial recognition is incompatible with a free society</a>
</strong>
</em>
</p>
<hr>
<p>Although it is seen as a potentially useful tool, it can be a controversial one. A world in which remote cameras can identify and track people as they move through public space seems alarmingly Orwellian. </p>
<p>The technology has also been criticised for being invasive and, in some cases, <a href="https://www.marketplace.org/shows/marketplace-tech/bias-in-facial-recognition-isnt-hard-to-discover-but-its-hard-to-get-rid-of/">biased</a> and inaccurate. In the US, for example, people have already been <a href="https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/">wrongly arrested</a> based on matches made by face-recognition systems.</p>
<h2>Public pushback</h2>
<p>There has also been widespread public opposition to the use of the technology in some cities and states in the US, which have gone so far as to impose <a href="https://www.wired.com/story/face-recognition-banned-but-everywhere/">bans</a> on its use.</p>
<p>Surveys show the Australian public have <a href="https://securitybrief.com.au/story/australians-uneasy-about-facial-recognition-tech-report">concerns</a> about the invasiveness of the technology, but that there is also support for its potential use to increase public safety and security.</p>
<p>Facial-recognition technology isn’t going away. It’s likely to become less expensive and more accurate and powerful in the near future. Instead of implementing it piecemeal, under the radar, we need to directly confront both the potential harms and benefits of the technology, and to provide clear rules for its use.</p>
<h2>What would regulations look like?</h2>
<p>Last year, then human rights commissioner Ed Santow called for <a href="https://www.itnews.com.au/news/human-rights-commission-calls-for-temporary-ban-on-high-risk-govt-facial-recognition-565173">a partial ban</a> on the use of facial-recognition technology. He is now developing model legislation for how it might be regulated in Australia. </p>
<p>Any regulation of the technology will need to consider both the potential benefits of its use and the risks to privacy rights and civic life. </p>
<p>It will also need to consider enforceable standards for its proper use. These could include the right to correct inaccurate information, the need to provide human confirmation for automated forms of identification, and the setting of minimum standards of accuracy. </p>
<p>They could also entail improving public consultation and consent around the use of the technology, and a requirement for the performance of systems to be accountable to an independent authority and to those researching the technology.</p>
<p>As the reach of facial recognition expands, we need more public and parliamentary debate to develop appropriate regulations for governing its use.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/darwins-smart-city-project-is-about-surveillance-and-control-127118">Darwin's 'smart city' project is about surveillance and control</a>
</strong>
</em>
</p>
<hr>
<p><em>If you’re in Adelaide, there will be a public forum on regulating facial recognition technology at the Town Hall <a href="https://www.eventbrite.com.au/e/regulating-facial-recognition-technology-in-adelaideand-beyond-tickets-360120358687">tonight</a> (Monday, June 27). Ed Santow and his colleague Lauren Perry will present their model legislation, and they will be joined in discussion by South Australian parliamentarian Tammy Franks and Law Society of South Australia president Justin Stewart-Rattray.</em></p><img src="https://counter.theconversation.com/content/185510/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Mark Andrejevic receives funding from the Australian Research Council.</span></em></p><p class="fine-print"><em><span>Gavin JD Smith receives funding from the Australian Research Council. </span></em></p>Private companies and public authorities are beginning to implement facial recognition technology, even without rules to govern what they can do.Mark Andrejevic, Professor, School of Media, Film, and Journalism, Monash UniversityGavin JD Smith, Associate Professor in Sociology, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1832042022-06-15T12:26:49Z2022-06-15T12:26:49ZPrivacy isn’t in the Constitution – but it’s everywhere in constitutional law<figure><img src="https://images.theconversation.com/files/468077/original/file-20220609-18254-mfvhp4.jpg?ixlib=rb-1.1.0&rect=28%2C7%2C4655%2C3707&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Who's allowed to watch what you do and say?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/two-women-looking-over-fence-royalty-free-image/200365509-005">Shannon Fagan/The Image Bank via Getty Images</a></span></figcaption></figure><p>Almost all American adults – including parents, medical patients and people who are sexually active – regularly exercise their right to privacy, even if they don’t know it.</p>
<p>Privacy is not specifically mentioned in the <a href="https://constitution.congress.gov/constitution/">U.S. Constitution</a>. But for half a century, the Supreme Court has recognized it as an outgrowth of protections for individual liberty. As I have studied in my <a href="https://www.cambridge.org/core/books/privacy-at-the-margins/821035ECA5D61516D87C454DD1FF8167">research</a> on <a href="https://scholar.google.com/citations?user=XT6-THQAAAAJ&hl=en&oi=ao">constitutional privacy rights</a>, this implied right to privacy is the source of many of the nation’s most cherished, contentious and commonly used rights – including, until the court’s June 24, 2022, ruling in <a href="https://www.oyez.org/cases/2021/19-1392">Dobbs v. Jackson</a>, the right to have an abortion.</p>
<h2>A key component of liberty</h2>
<p>The Supreme Court first formally identified what is called “<a href="https://supreme.justia.com/cases/federal/us/429/589/">decisional privacy</a>” – the right to independently control the most personal aspects of our lives and our bodies – in 1965, saying it was <a href="https://www.law.cornell.edu/wex/griswold_v_connecticut_%281965%29">implied from other explicit constitutional rights</a>.</p>
<p>For instance, the <a href="https://constitution.congress.gov/constitution/amendment-1/">First Amendment</a> rights of speech and assembly allow people to privately decide what they’ll say, and with whom they’ll associate. The <a href="https://constitution.congress.gov/constitution/amendment-4/">Fourth Amendment</a> limits government intrusion into people’s private property, documents and belongings.</p>
<p>Relying on these explicit provisions, the court concluded in <a href="https://www.law.cornell.edu/wex/griswold_v_connecticut_%281965%29">Griswold v. Connecticut</a> that people have privacy rights preventing the government from forbidding married couples from using contraception. </p>
<p>In short order, the court clarified its understanding of the constitutional origins of privacy. In the 1973 Roe v. Wade decision protecting the right to have an <a href="https://www.law.cornell.edu/supremecourt/text/410/113">abortion</a>, the court held that the right of decisional privacy is based in the Constitution’s assurance that people cannot be “deprived of life, liberty or property, without due process of law.” That phrase, called the due process clause, <a href="https://www.law.cornell.edu/wex/due_process">appears twice in the Constitution</a> – in the <a href="https://constitution.congress.gov/constitution/amendment-5/">Fifth</a> and <a href="https://constitution.congress.gov/constitution/amendment-14/">14th Amendments</a>. </p>
<p>Decisional privacy also provided the basis for other decisions protecting many crucial, and everyday, activities. </p>
<p>The right to privacy protects the ability to have consensual sex <a href="https://supreme.justia.com/cases/federal/us/539/558/#tab-opinion-1961305">without being sent to jail</a>. And privacy buttresses the <a href="https://www.law.cornell.edu/supct/pdf/14-556.pdf">ability to marry</a> regardless of race or gender.</p>
<p>The right to privacy is also key to a person’s ability to keep their family together without undue government interference. For example, in 1977, the court relied on the right to private family life to rule that a <a href="https://supreme.justia.com/cases/federal/us/431/494/#tab-opinion-1952239">grandmother could move her grandchildren into her home to raise them</a> even though it violated a local zoning ordinance. </p>
<p>Under a combination of privacy and liberty rights, the Supreme Court has also protected a person’s freedom in medical decision-making. For example, in 1990, the court concluded “that a competent person has a <a href="https://supreme.justia.com/cases/federal/us/497/261/">constitutionally protected liberty interest</a> in refusing unwanted medical treatment.” </p>
<h2>Limiting government disclosure</h2>
<p>The right to decisional privacy is not the only constitutionally protected form of privacy. As then-Supreme Court Justice William Rehnquist <a href="https://supreme.justia.com/cases/federal/us/433/425/">noted in 1977</a>, the “concept of ‘privacy’ can be a coat of many colors, and quite differing kinds of rights to ‘privacy’ have been recognized in the law.” </p>
<p>This includes what is called a right to “informational privacy” – letting a person limit government disclosure of information about them. </p>
<p>According to some authority, the right extends even to prominent public and political figures. In one key decision, in 1977, Chief Justice Warren Burger and Rehnquist – both conservative justices – <a href="https://supreme.justia.com/cases/federal/us/433/425/">suggested</a> in dissenting opinions that former President Richard Nixon had a privacy interest in documents made during his presidency that touched on his personal life. <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2691422">Lower courts</a> have relied on the right of informational privacy to limit the government’s ability to disclose someone’s sexual orientation or HIV status. </p>
<p>All told, though the word isn’t in the Constitution, privacy is the foundation of many constitutional protections for our most important, sensitive and intimate activities. If the right to privacy is eroded – such as in a future Supreme Court decision – many of the rights it’s connected with may also be in danger.</p>
<p><em>This story was updated on June 24, 2022, to reflect the Supreme Court’s decision in Dobbs v. Jackson Women’s Health.</em></p><img src="https://counter.theconversation.com/content/183204/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Scott Skinner-Thompson serves on the advisory board of the Electronic Privacy Information Center (EPIC). </span></em></p>The Supreme Court has found protections for people’s privacy in several constitutional amendments – and used it as a basis for some pretty fundamental protections.Scott Skinner-Thompson, Associate Professor of Law, University of Colorado BoulderLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1851262022-06-15T07:07:57Z2022-06-15T07:07:57ZBunnings, Kmart and The Good Guys say they use facial recognition for ‘loss prevention’. An expert explains what it might mean for you<figure><img src="https://images.theconversation.com/files/468915/original/file-20220615-14-ex57sp.jpg?ixlib=rb-1.1.0&rect=348%2C128%2C4215%2C2524&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>Once the purview of law enforcement and intelligence agencies, facial recognition is now being used to identify consumers in Australian stores. </p>
<p>If you’ve seen the movie Minority Report, you’ll remember how Tom Cruise’s character John Anderton is identified through iris recognition to perform his duties, and later tracked with it when he’s a wanted man. When he replaces his eyes to evade identification, Anderton is bombarded with advertisements targeting his new assumed identity.</p>
<p>This once-futuristic idea from a movie could soon be a reality in our lives. An investigative report published by consumer magazine <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store">Choice</a> reveals three major retailers (out of 25 queried), Kmart, Bunnings and The Good Guys, have admitted using facial recognition technology on customers for “loss prevention”. </p>
<p>The companies say they advise consumers of the use of the technology as a condition of entry. But do consumers really know what this entails, and how or where their images could be used or stored?</p>
<h2>What is facial recognition and why do we care?</h2>
<p>We’ve grown accustomed to our phones and cameras using facial detection software to put our faces into focus. But facial <em>recognition</em> technology takes this a step further by matching our unique identifying information to a stored digital image.</p>
<p>Facial recognition has come a long way. It was initially used in 2001 to identify relationships between gamblers and employees in Las Vegas casinos, where there was suspected collusion. </p>
<p>The United States government would eventually use <a href="https://www.infoworld.com/article/2628017/innovation-that-matters--jeff-jonas-connects-the-invisible-dots.html">the same</a> technology to <a href="https://www.nationalgeographic.com/science/article/140505-jeff-jonas-big-data-gambling-computers-technology-ibm">identify the 9/11 hijackers</a>. It’s now widely adopted by law enforcement and intelligence communities.</p>
<p>Currently, software such as Clearview AI and PimEyes are being used in highly sophisticated ways, including by Ukrainian and Russian forces to <a href="https://www.washingtonpost.com/technology/2022/04/15/ukraine-facial-recognition-warfare/">identify combatants in Ukraine</a>. </p>
<h2>But what is this technology doing in Bunnings?</h2>
<p>As with its early use in casinos, Kmart, Bunnings and The Good Guys told Choice their facial recognition software is used for “loss prevention”.</p>
<p>Images captured on store surveillance devices and body cameras could be used to identify in-store individuals engaged in theft, or other criminal activities. Real-time identification could allow law enforcement to quickly identify shoppers with unpaid tickets, outstanding warrants, or existing criminal complaints.</p>
<p>Bunnings chief operating officer Simon McDowell told SBS News the technology was used “solely to keep team and customers safe and prevent unlawful activity in our stores”. Both The Good Guys and Kmart told <a href="https://www.theguardian.com/technology/2022/jun/15/bunnings-kmart-and-the-good-guys-using-facial-recognition-technology-to-crack-down-on-theft-choice-says">news outlets</a> they were using it for the same reasons, in a select number of stores – and that customers were notified through signage. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=354&fit=crop&dpr=1 600w, https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=354&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=354&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=445&fit=crop&dpr=1 754w, https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=445&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/468933/original/file-20220615-25-71yxl3.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=445&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Choice supplied this photo of a sign, which it said was taken at a Kmart in Marrickville, NSW.</span>
<span class="attribution"><span class="source">CHOICE</span></span>
</figcaption>
</figure>
<p>Choice confirmed there were some signs disclosing use of the technology – but reported these signs were small and would be missed by most shoppers. </p>
<p>The news has stoked shoppers’ fears of how their image data may be used. As in <em>Minority Report</em>, images captured in a store could theoretically be used for targeted advertising and to “enhance” <a href="https://www.wired.com/2011/11/malls-track-phone-signals/">the shopping experience</a>.</p>
<p>It’s likely images and video collected through standard in-store surveillance are either matched immediately against a remote database using specialised facial recognition software, or analysed against a database of tagged and catalogued images later on. Ideally, the images would be encoded and stored in a file that’s readable only by the algorithm specific to the device or software processor.</p>
<h2>Potential for misuse</h2>
<p>We have already seen online retailers track and target consumers in this way through <a href="https://theconversation.com/googles-scrapping-third-party-cookies-but-invasive-targeted-advertising-will-live-on-156530">cookies</a> and by linking our purchase history across <a href="https://theconversation.com/smartphone-data-tracking-is-more-than-creepy-heres-why-you-should-be-worried-91110">electronic devices</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-your-phone-really-listening-to-your-conversations-well-turns-out-it-doesnt-have-to-162172">Is your phone really listening to your conversations? Well, turns out it doesn't have to</a>
</strong>
</em>
</p>
<hr>
<p>We have also seen companies correlate our social media profiles and our other online experiences across various websites. Australian stores employing facial recognition could use collected information internally to track:</p>
<ul>
<li>the number of visits by a person</li>
<li>the times of those visits</li>
<li>pattern or behavioural analysis (such as a consumer’s reaction to pricing or signage) and</li>
<li>associations with other shoppers (such as friends, family and anyone else with them). </li>
</ul>
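As a purely hypothetical illustration of how such records might accumulate, a per-shopper visit log keyed on a biometric template ID could be as simple as the sketch below (all identifiers are invented; no retailer's actual system is described here):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical in-store visit log, keyed on a face-template ID.
visit_log = defaultdict(list)

def record_visit(template_id, timestamp, companions=()):
    """Append one store visit: when it happened and which other templates were present."""
    visit_log[template_id].append({
        "time": timestamp,
        "companions": list(companions),
    })

def visit_count(template_id):
    """Number of recorded visits for one shopper."""
    return len(visit_log[template_id])

record_visit("face_0042", datetime(2022, 6, 1, 10, 30), companions=["face_0077"])
record_visit("face_0042", datetime(2022, 6, 8, 10, 35))
print(visit_count("face_0042"))  # 2
```

Even this toy structure already captures visit counts, visit times and co-shopper associations — three of the four categories listed above.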
<p>Retailers could also use this identity data to extract information from social media, where most people have images of themselves uploaded. They could then perform risk analysis based on credit and financial reports tied to that specific shopper. </p>
<p>Externally, the images and associated consumer information could be merged with financial, economic, social and political data already collected by commercial data aggregators – adding to the already massive data aggregation market.</p>
<p>Current Australian privacy laws require retailers to disclose what data are being collected, retained and protected, as well as how they might be used outside of a loss-prevention model.</p>
<p>A Bunnings spokesperson told The Guardian the technology was being used in line with the Australian Privacy Act. Choice has reached out to the Office of the Australian Information Commissioner to determine whether the use of the technology is indeed consistent with the Privacy Act.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/shadow-profiles-facebook-knows-about-you-even-if-youre-not-on-facebook-94804">Shadow profiles - Facebook knows about you, even if you're not on Facebook</a>
</strong>
</em>
</p>
<hr>
<h2>What to do?</h2>
<p>While the retailers highlighted in Choice’s investigation state that consumers must agree to the collection of their images as a condition of entry, the reality is that the collection, retention and use of those images are not usually disclosed in any explicit way. </p>
<p>As far as data collection in retail settings goes, all stores should be required to make sure consumers are aware of:</p>
<ul>
<li>the specific information that is collected while they are visiting</li>
<li>how it might be aggregated and combined with other relevant information from third parties</li>
<li>how long the images or data will be retained, retrieved, or accessed and by whom, and </li>
<li>what security precautions are being used to secure the data.</li>
</ul>
<p>Furthermore, as with their online shopping experience, consumers should be given the option to opt out of such data collection. </p>
<p>Until then, consumers may try to avoid collection by donning hats, sunglasses and face masks. But considering the rate at which facial recognition technology is advancing – and how large the personal data market has already grown – retail cameras may soon be able to see through these disguises, too.</p><img src="https://counter.theconversation.com/content/185126/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dennis B Desmond previously received funding from the United States Department of Defense.</span></em></p>Australia’s consumer advocacy group Choice identified three Australian retailers who use facial recognition to identify consumers. What are the privacy concerns?Dennis B. Desmond, Lecturer, Cyberintelligence and Cybercrime Investigations, University of the Sunshine CoastLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1788662022-06-03T12:20:19Z2022-06-03T12:20:19ZGenetic paparazzi are right around the corner, and courts aren’t ready to confront the legal quagmire of DNA theft<figure><img src="https://images.theconversation.com/files/466687/original/file-20220601-48041-5tdwjf.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2309%2C1299&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">DNA is a trove of personal information that can be hard to keep track of and protect. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/dna-royalty-free-image/1369527112">Boris Zhitkov/Moment via Getty Images</a></span></figcaption></figure><p>Every so often stories of <a href="https://ssrn.com/abstract=1684337">genetic theft</a>, or extreme precautions taken to avoid it, make headline news. So it was with a <a href="https://www.theweek.in/news/world/2022/02/12/explained-what-is-dna-theft-why-did-macron-refuse-russian-covid-test.html">picture</a> of French President Emmanuel Macron and Russian President Vladimir Putin sitting at <a href="https://www.reuters.com/world/europe/putin-kept-macron-distance-snubbing-covid-demands-sources-2022-02-10/">opposite ends of a very long table</a> after Macron declined to take a Russian PCR COVID-19 test in 2022. 
Many <a href="https://www.firstpost.com/world/dna-theft-fears-why-french-and-german-leaders-refused-to-take-russian-covid-test-10386501.html">speculated</a> that Macron refused due to security concerns that the Russians would take and use his DNA for nefarious purposes. German Chancellor Olaf Scholz <a href="https://www.theweek.co.uk/news/world-news/russia/955813/why-world-leaders-refuse-give-russia-dna">similarly refused</a> to take a Russian PCR COVID-19 test.</p>
<p>While these concerns may seem relatively new, pop star celebrity Madonna has been raising alarm bells about the potential for nonconsensual, surreptitious collection and testing of DNA for over a decade. She has <a href="https://geneticliteracyproject.org/2016/02/19/madonna-may-suffer-dna-paranoia/">hired cleaning crews</a> to sterilize her dressing rooms after concerts and requires her own new toilet seats at each stop of her tours. </p>
<p>At first, Madonna was ridiculed for having <a href="https://www.dailymail.co.uk/tvshowbiz/article-2163460/Paranoia-Madonna-orders-sterile-sweep-dressing-room-gig-prevent-fans-stealing-DNA.html">DNA paranoia</a>. But as more advanced, faster and cheaper genetic technologies have reached the consumer realm, these concerns seem not only reasonable, but justified.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Putin and Macron sitting at opposite ends of a long table" src="https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=380&fit=crop&dpr=1 600w, https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=380&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=380&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=478&fit=crop&dpr=1 754w, https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=478&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/466685/original/file-20220601-66680-bioj3t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=478&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">For some, keeping one’s distance might be a preferable alternative to getting one’s DNA stolen.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/RussiaUkraineTalkingtoPutin/0778415f155a4cff94894c58f9fb6bb8">AP Photo/Pool Sputnik Kremlin</a></span>
</figcaption>
</figure>
<p><a href="https://law.emory.edu/faculty/faculty-profiles/vertinsky-profile.html">We are</a> <a href="https://scholar.google.com/citations?user=OKxLE-QAAAAJ&hl=en">law professors</a> who study how emerging technologies like genetic sequencing are regulated. We <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3559405">believe that</a> growing public interest in genetics has increased the likelihood that <a href="https://www.cambridge.org/core/books/abs/consumer-genetic-technologies/genetic-paparazzi/7B0D35C61C3CBD9DA3FE0D457C22BB9B">genetic paparazzi</a> with DNA collection kits may soon become as ubiquitous as ones with cameras. </p>
<p>While courts have for the most part <a href="https://www.forbes.com/sites/michellefabio/2018/04/23/madonna-loses-fight-to-reclaim-tupacs-letter-other-highly-personal-items/">managed to evade</a> dealing with the complexities of surreptitious DNA collection and testing of public figures, they won’t be able to avoid dealing with it for much longer. And when they do, they are going to run squarely into the <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3559405">limitations of existing legal frameworks</a> when it comes to genetics.</p>
<h2>Genetic information troves</h2>
<p>You <a href="https://journalofethics.ama-assn.org/article/shedding-privacy-along-our-genetic-material-what-constitutes-adequate-legal-protection-against/2016-03">leave your DNA behind</a> everywhere you go. The strands of hair, fingernails, dead skin and saliva you shed as you move through your day are all collectible trails of DNA.</p>
<p>Genetic analysis can reveal not only personal information, such as existing health conditions or risk for developing certain diseases, but also core aspects of a person’s identity, such as their ancestry and the potential traits of their future children. In addition, as genetic technologies continue to evolve, fears about using surreptitiously collected genetic material for <a href="https://news.gsu.edu/2020/04/28/genetic-paparazzi-could-celebrity-dna-become-public-domain/">reproductive purposes</a> via <a href="https://doi.org/10.1093/jlb/lsv057">in vitro gametogenesis</a> become more than just paranoia.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Eb_o8hQNUFI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">In vitro gametogenesis (IVG), while still in development, could allow prospective parents to create egg or sperm from other parts of the body, like skin.</span></figcaption>
</figure>
<p>Ultimately, taking an individual’s genetic material and information without their consent is an intrusion into a legal domain that is still considered <a href="https://doi.org/10.1017/9781108874106.012">deeply personal</a>. Despite this, there are <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3559405">few laws</a> protecting the interests of individuals regarding their genetic material and information. </p>
<h2>Existing legal frameworks</h2>
<p>When disputes involving genetic theft from public figures inevitably reach the courtroom, judges will need to confront fundamental questions about how genetics relates to personhood and identity, property, health and disease, intellectual property and reproductive rights. Such questions have already been raised in cases involving the <a href="https://www.virginialawreview.org/articles/genetic-privacy-after-carpenter/">use of genetics in law enforcement</a>, the <a href="https://www.science.org/content/article/us-supreme-court-strikes-down-human-gene-patents">patentability of DNA</a> and ownership of <a href="https://scholarship.law.nd.edu/ndlr/vol93/iss3/5/">discarded genetic materials</a>. </p>
<p>In each of these cases, courts focused on <a href="https://columbialawreview.org/content/dna-by-the-entirety-2/">only one dimension</a> of genetics, such as privacy rights or the value of genetic information for biomedical research. But this limited approach disregards <a href="https://doi.org/10.1111/j.1748-720x.2007.00161.x">other aspects</a>, such as the privacy of family members with shared genetics, or property and identity interests someone may have in genetic material discarded as part of a medical procedure.</p>
<p>In the case of genetic paparazzi, courts will presumably try to fit complex questions about genetics into the legal framework of <a href="https://scholarship.law.upenn.edu/jcl/vol19/iss4/4/">privacy rights</a> because this is how they have approached other intrusions into the lives of public figures in the past. </p>
<p>Modern <a href="https://heinonline.org/HOL/LandingPage?handle=hein.journals/hclwpo11&div=16&id=&page=">U.S. privacy law</a> is a complex web of state and federal regulations governing how information can be acquired, accessed, stored and used. The right to privacy is limited by First Amendment protections on the freedom of speech and press, as well as Fourth Amendment prohibitions on unreasonable searches and seizure. <a href="https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1633&context=jcl">Public figures</a> face further restrictions on their privacy rights because they are objects of legitimate public interest. On the other hand, they also have publicity rights that control the commercial value of their unique personally identifying traits.</p>
<p>People whose genetic material has been taken without their consent may also raise a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3357566">claim of conversion</a> that their property has been interfered with and lost. Courts in Florida are currently considering a conversion claim in a <a href="https://gizmodo.com/how-a-legal-brawl-between-two-rich-guys-could-change-ho-1824191082">private dispute</a> where the former CEO of Marvel Entertainment and his wife accused a millionaire businessman of stealing their DNA to prove that they were slandering him through a hate-mail campaign. This approach replaces the narrow legal framework of privacy with an even narrower framework of property, reducing genetics to an object that someone possesses.</p>
<h2>What the future may hold</h2>
<p>Under existing laws and the current state of genetic technology, most people don’t need to worry about surreptitious collection and use of genetic material in the way that public figures might. But genetic paparazzi cases will likely play an important role in determining what rights everyone else will or will not have.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Open 23andMe genetic testing kit" src="https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/466689/original/file-20220601-48776-susuv1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">As DNA testing technology advances, questions about genetic privacy and ownership will only become more complex.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/this-illustration-picture-shows-a-saliva-collection-kit-for-news-photo/1074407824">Eric Baradat/AFP via Getty Images</a></span>
</figcaption>
</figure>
<p>The U.S. Supreme Court is very unlikely to recognize new rights, or even affirm previously recognized rights, that are <a href="https://www.washingtonpost.com/outlook/2022/03/25/ketanji-brown-jackson-roe/">not explicitly mentioned in the Constitution</a>. Therefore, at least at the federal level, individual protections for genetic material and information are not likely to adapt to changing times.</p>
<p>This means that cases involving genetics are likely to fall within the purview of state legislatures and courts. But none of the states have <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3559405">adequately grappled</a> with the complexities of genetic legal claims. Even in states with laws specifically designed to protect genetic privacy, regulations cover only a <a href="https://doi.org/10.1038/nrg3113">narrow range</a> of genetic interests. Some laws, for example, may prohibit disclosure of genetic information, but not collection.</p>
<p>For better or for worse, how the courts rule in genetic paparazzi cases will shape how society thinks about genetic privacy and about individual rights regarding genetics more broadly.</p><img src="https://counter.theconversation.com/content/178866/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Both Macron and Madonna have expressed concerns about genetic privacy. As DNA collection and sequencing becomes increasingly commonplace, what may seem paranoid may instead be prescient.Liza Vertinsky, Professor of Law, University of MarylandYaniv Heled, Professor of Law, Georgia State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1762032022-02-07T14:35:56Z2022-02-07T14:35:56ZCookies: I looked at 50 well-known websites and most are gathering our data illegally<figure><img src="https://images.theconversation.com/files/444547/original/file-20220204-25-3us4d4.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It takes the biscuit. </span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/cube-dice-block-cookie-icon-on-1954978213">stockwerk-fotodesign</a></span></figcaption></figure><p>The owners of Google and Facebook were <a href="https://www.cnil.fr/en/cookies-cnil-fines-google-total-150-million-euros-and-facebook-60-million-euros-non-compliance">both heavily fined</a> for using cookies illegally at the tail end of 2021 by the French data protection authority, <a href="https://www.cnil.fr/en/cookies-cnil-fines-google-total-150-million-euros-and-facebook-60-million-euros-non-compliance">Commission Nationale de l’Informatique et des Liberté</a> (CNIL). On the French versions of Google, its sister platform YouTube, and Facebook, users were being asked to consent to cookies in such a way that it was much easier for them to accept than reject the request. They could accept cookies with just one click but there was a more laborious process for refusing. </p>
<p>Google owner Alphabet was fined €150 million (£125 million) and Facebook owner Meta €60 million. Alphabet was fined more because its breaches affected more people and it had been in trouble for violations <a href="https://www.cnil.fr/en/cookies-council-state-confirms-sanction-imposed-cnil-2020-google">in the past</a>. Both companies were also given three months to change their systems to make it as easy for users to reject cookies as to accept them. </p>
<p>Meta and Alphabet have yet to comply, though they have until April to do so. The law in the UK and the rest of the EU is also the same as in France, so it is going to be interesting to see what they do in these jurisdictions too. </p>
<p>In the meantime, I looked at what other companies were doing and found that many are still collecting data using cookies in similar ways. So what’s going on? </p>
<h2>Cookie laws and workarounds</h2>
<p>Cookies are small text files stored by websites on our internet browsers, which allow the website to gather information about us. Some cookies <a href="https://www.cookiepro.com/knowledge/what-are-strictly-necessary-cookies/#:%7E:text=Examples%20of%20strictly%20necessary%20cookies,a%20website%20through%20logging%20in.">are necessary</a> for us to be able to browse the site in question – for example, to add items to a shopping cart. </p>
<p>More <a href="https://www.dataguard.co.uk/blog/data-protection-third-party-cookies-vs.-first-party-cookies">contentious cookies</a> track a user’s <a href="https://gdpr.eu/cookies/">browsing behaviour</a>. There are first-party cookies, where the site in question tracks users’ behaviour to offer them relevant products; and third-party cookies, where this is done by another company to allow others to advertise to the user instead – the classic example is Google Ads. </p>
<p>Cookies gather so much information that it is usually more than enough to identify the person behind the device. Besides visits to particular web pages, they <a href="https://privacy.net/stop-cookies-tracking/">can also record</a> a person’s search queries, goods or services purchased, IP address and exact location. </p>
<p>From this, it is possible to infer a person’s name, nationality, language, religion, sexual orientation and other intimate details – most of which are <a href="https://gdpr-info.eu/art-9-gdpr/">special categories</a> of personal data that cannot be processed without the explicit consent of the individual under EU <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32002L0058&from=EN">ePrivacy Directive</a> and the EU and UK’s General Data Protection Regulation (GDPR). </p>
<p>The GDPR requires <a href="https://gdpr-info.eu/recitals/no-32/">such consent</a> to be specific, informed, unambiguous and <a href="https://gdpr-info.eu/art-7-gdpr/">given freely</a> – requiring affirmative action by the user. Unfortunately, this is not giving us a great deal of protection.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Artist's representation of the GDPR attached to various padlock symbols" src="https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=300&fit=crop&dpr=1 600w, https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=300&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=300&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=377&fit=crop&dpr=1 754w, https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=377&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/444548/original/file-20220204-13-btb6yl.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=377&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Worth the paper it’s written on?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/3d-illustration-abstract-network-protected-against-1064933888">Olivier le Moal</a></span>
</figcaption>
</figure>
<p>Websites have used various methods to get around the requirements. Most cookie consent requests used to be presented with pre-selected tick boxes that, by default, made individuals accept cookies on their devices. In 2019 the <a href="https://curia.europa.eu/juris/document/document.jsf;jsessionid=85B99257798E49292E725F5425756780?text=&docid=218462&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=9152887">Court of Justice of the European Union (CJEU)</a> decided websites could no longer do this, since it avoided the GDPR’s affirmative action requirement. But such is the value of the data that can be gathered using cookies that websites merely switched to different workarounds instead. </p>
<p>The popular option is the one that saw Facebook and Google sanctioned by the CNIL in France. The CNIL essentially said that when it comes to refusing cookie consent, two clicks are too many: the design pressures people into consenting, contrary to the GDPR’s free consent requirement. This presumably explains why, in a <a href="https://arxiv.org/abs/2001.02479">2020 experimental study</a> of users who had lived in the EU, 93% accepted cookies even when a second window offered options for managing them. </p>
<h2>The wider issue</h2>
<p>The French interpretation of the GDPR is not binding on the British courts, the CJEU or other regulators in Europe. So, once the CNIL’s three-month deadline runs out, websites with similar imbalanced cookie consent in other GDPR countries might claim there is an ambiguity in the law around what counts as consent. But really the law is quite clear and the French interpretation should be a strong signal that other privacy authorities will reach a similar conclusion. </p>
<p>And yet, when I looked at 50 randomly chosen well-known websites, only 15 (30%) appear to comply with the EU/UK data privacy laws. Some of those sites which are compliant, such as <a href="https://www.ebay.co.uk/">ebay.co.uk</a>, provide “Accept” and “Decline” buttons in the same banner. Others such as <a href="https://www.bbc.co.uk/">bbc.co.uk</a> make it more difficult to reject cookies but allow users to browse without consenting to them. </p>
<p>As many as 32 (64%) of the sites did not appear to comply with EU and UK cookies laws. These include Google, Facebook and Twitter, as well as other major businesses such as <a href="https://www.ryanair.com/gb/en">Ryanair</a> and the website of <a href="https://www.mirror.co.uk/">the Daily Mirror</a>. </p>
<p>Twitter, for example, merely presumes consent, via a banner that states: “By using Twitter’s services, you agree to our cookies use”. Other companies, including Google and Facebook, hide the refuse/decline button in a second window. Still others, such as Ryanair, create a cookies wall where visitors may use the site only if they choose “Yes, I agree” or go to the “View cookies setting” to select their preferences. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Screenshot of Ryanair cookie request window" src="https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=284&fit=crop&dpr=1 600w, https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=284&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=284&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=357&fit=crop&dpr=1 754w, https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=357&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/444549/original/file-20220204-25-4v62gt.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=357&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Ryanair</span></span>
</figcaption>
</figure>
<p>There were a further three websites where it was either unclear or borderline as to whether they were within the rules. <a href="https://open.spotify.com/">Spotify</a>, like the BBC, has a typical cookies banner but lets users browse without accepting the cookies. But its cookies banner covers half of the device screen. This reduces the quality of the user’s browsing experience and could potentially be regarded as a coercive practice.</p>
<p>The fact that big tech companies are not complying with cookies laws suggests that millions of citizens are likely having their personal data gathered unlawfully. It is hard not to wonder if some companies are knowingly breaching the rules because they generate so much revenue from their cookies that it’s worth risking a sanction for a privacy breach. </p>
<p>They may also be betting that the relevant authorities are too underfunded or understaffed to enforce the rules. For example, a <a href="https://www.nationaleombudsman.nl/system/files/bijlage/Nationale%20ombudsman%20-%20Rapport%20Autoriteit%20Persoonsgegevens%20Voor%20een%20dichte%20deur_0.pdf">recent report</a> by the Dutch ombudsman highlighted that the relevant authority in that country had 9,800 unresolved privacy complaints at the end of 2020. And <a href="https://www.iccl.ie/wp-content/uploads/2021/09/Europes-enforcement-paralysis-2021-ICCL-report-on-GDPR-enforcement.pdf">according to</a> the Irish Council for Civil Liberties, “almost all (98%) major GDPR cases referred to Ireland remain unresolved” – in part due to lack of budget and sufficient specialist staff. The situation is unlikely to be radically different in other EU countries. </p>
<p>If the UK and EU are serious about protecting citizens’ privacy, they need to amend the rules to be more specific about what a consent window should look like, and run information campaigns to make it clear to citizens that withholding consent cannot in any way limit their browsing experience. They should also allocate the required resources to enforce the rules. Only then will the laws around these little-understood tools for harvesting our data be fit for purpose. </p>
<hr>
<p><em>We asked Meta, Alphabet, Ryanair, Twitter and Daily Mirror publisher Reach if they would like to comment. Reach declined and Alphabet, Twitter and Ryanair did not respond. Meta said:</em> </p>
<blockquote>
<p>We are reviewing the [CNIL’s] decision, and remain committed to working with relevant authorities. Our cookie consent controls provide people with greater control over their data, including a new settings menu on Facebook and Instagram where people can revisit and manage their decisions at any time, and we continue to develop and improve these controls.</p>
</blockquote><img src="https://counter.theconversation.com/content/176203/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Asress Adimi Gikay does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The laws about cookies are fairly clear in EU and UK, but many big companies are breaking them anyway.Asress Adimi Gikay, Lecturer in AI, Disruptive Innovation and Law| Brunel Law School| Centre for AI: Social and Digital Innovation, Brunel University LondonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1730952021-12-02T18:04:22Z2021-12-02T18:04:22ZMeghan Markle: Mail on Sunday loses appeal in privacy case – the judgment explained<p>Associated Newspapers Limited (ANL) the publisher of the Daily Mail, the Mail on Sunday and MailOnline, has lost an appeal in its three-year legal battle against Meghan Markle, the Duchess of Sussex, over the Mail on Sunday’s publication of extracts from a letter written by the duchess to her father, Thomas Markle, in August 2018.</p>
<p>The duchess sued for copyright infringement and breach of privacy, arguing that the letter to her father had been “private and personal”. High Court judge Mr Justice Warby issued a <a href="https://www.judiciary.uk/wp-content/uploads/2021/02/Duchess-of-Sussex-v-Associated-2021-EWCH-273-Ch.pdf">summary judgment</a> in favour of the duchess in February 2021 which upheld her claim of breach of privacy and copyright infringement. </p>
<p>A <a href="https://www.jmw.co.uk/services-for-you/media-law/blog/meghan-markle-awarded-summary-judgment-against-mail-what-does-it-mean">summary judgment means</a> the judge ruled there was no need to go to trial to reach a determination, because ANL’s defence had no real prospect of success.</p>
<p>ANL was given permission to appeal and brought fresh evidence to support its argument that the case should go to trial. But the Court of Appeal <a href="https://www.judiciary.uk/wp-content/uploads/2021/12/Sussex-v-Associated-News-judgment-021221.pdf">has stated</a> that, even in light of the new evidence, Warby had been correct in granting the summary judgment, and therefore that the Mail on Sunday was liable for copyright infringement and breach of privacy. </p>
<h2>Copyright infringement</h2>
<p>Copyright protects things such as writings – including, as in this case, a letter. The copyright in the contents of the letter belongs to the person who wrote the letter – not the recipient. To use someone’s copyright-protected letter without their permission is copyright infringement, unless an exception applies. </p>
<p>ANL argued that its use of the letter fell within a copyright exception, known as “<a href="https://www.copyrightuser.org/understand/exceptions/news-reporting/">fair dealing</a>” for the purposes of reporting current events. But for this exception to apply, certain criteria have to be met – including that the purpose of the use is for reporting current events and the amount taken was fair. </p>
<p>The summary judgment found that the Mail on Sunday’s printing of the letter had been for the purpose of reporting its contents – which was not a current event. Warby also ruled that the amount of material from the letter published by the newspaper had been too great to be fair and was irrelevant and disproportionate to any legitimate reporting purpose.</p>
<p>Appealing his decision, ANL argued that Meghan’s father, Thomas Markle, wanted to publish the letter because he felt an article published in the US by <a href="https://people.com/royals/meghan-markle-dad-thomas-markle-letter-after-wedding/">People Magazine</a>, featuring interviews with friends of the duchess portraying her as a “caring daughter” who had intended the letter as an “olive branch”, had been inaccurate. But the court ruled that the contents of the letter did not support this point, as it mostly reinforced the points made against him in the People article.</p>
<p>The court also disagreed with ANL’s submission that the use of material from the letter was in the public interest. The public interest defence can be used to stop copyright enforcement in the name of free speech – but it only applies in special circumstances. It is very rare for this defence to justify copyright infringement, particularly where a fair dealing defence also fails. So, it is unsurprising that the Mail on Sunday also failed on this defence. </p>
<p>The Court of Appeal therefore ruled that none of these defences applied and upheld Warby’s judgment that the Mail on Sunday had infringed Meghan Markle’s copyright in the contents of the letter when it published it. </p>
<h2>Privacy</h2>
<p>The duchess also sued ANL for misuse of private information. To make this claim, a claimant must demonstrate “<a href="https://www.supremecourt.uk/docs/speech_100825.pdf">a reasonable expectation of privacy</a>”. ANL argued that Meghan thought the letter might be leaked and therefore did not have a reasonable expectation of privacy in the contents of the letter. </p>
<p>To support this contention, it presented evidence from Jason Knauf, former communications secretary to the Sussexes, who claimed in a witness statement that the letter had been written with the expectation it might become public.</p>
<p>But Court of Appeal judges Sir Geoffrey Vos, Dame Victoria Sharp and Lord Justice Bean upheld Warby’s decision to grant summary judgment, and ruled that the duchess had a “reasonable expectation of privacy” in the contents of the letter. “Those contents were personal, private and not matters of legitimate public interest,” Vos, the Master of the Rolls, said in a statement read aloud in court. </p>
<p>ANL also argued in its appeal that the duchess had shared the letter with Omid Scobie and Carolyn Durand, authors of a book about the duke and duchess called Finding Freedom. To make this point, ANL also relied on Knauf’s evidence, which disclosed that he had provided some information to the authors of the book with Meghan’s knowledge. This, the publisher argued, destroyed her reasonable expectation of privacy by putting the letter into the public domain. </p>
<p>But the court found that even if Meghan had shared a quote from the letter with the authors of the book, she still had a reasonable expectation of privacy in the detailed contents of the letter, so the Mail on Sunday had breached Meghan’s privacy rights by publishing its contents. </p>
<p>Having lost the case on appeal, the Mail on Sunday will have to do four things:</p>
<ol>
<li><p>publish a correction and apology</p></li>
<li><p>pay damages (likely in the form of an account of profits)</p></li>
<li><p>destroy copies of the letter in their possession</p></li>
<li><p>be subject to an injunction that stops them from infringing Meghan’s copyright and privacy rights in the future.</p></li>
</ol>
<p>It is not yet known whether ANL will pursue a further appeal to the UK Supreme Court. In order to do this, they would need to seek permission from the Court of Appeal.</p><img src="https://counter.theconversation.com/content/173095/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Hayleigh Bosher does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>How the UK Court of Appeal reached its decision.Hayleigh Bosher, Senior Lecturer in Intellectual Property Law, Brunel University LondonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1707112021-10-27T04:22:07Z2021-10-27T04:22:07ZA new proposed privacy code promises tough rules and $10 million penalties for tech giants<figure><img src="https://images.theconversation.com/files/428675/original/file-20211027-21-chefvu.jpeg?ixlib=rb-1.1.0&rect=5%2C2%2C1991%2C1353&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>This week the federal government <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/">announced</a> proposed legislation to develop an online privacy code (or “OP Code”) setting tougher privacy standards for Facebook, Google, Amazon and many other online platforms. </p>
<p>These companies collect and use vast amounts of consumers’ personal data, much of it without their knowledge or real consent, and the code is intended to guard against privacy harms from these practices.</p>
<p>The higher standards would be backed by increased penalties for interference with privacy under the Privacy Act and greater enforcement powers for the federal privacy commissioner. Serious or repeated breaches of the code could carry penalties of up to A$10 million or 10% of turnover for companies.</p>
<p>However, relevant companies are likely to try to avoid obligations under the OP Code by drawing out the process for drafting and registering the code. They are also likely to try to exclude themselves from the code’s coverage, and argue about the definition of “personal information”.</p>
<p>The current definition of “personal information” under the Privacy Act does not clearly include technical data such as IP addresses and device identifiers. Updating this will be important to ensure the OP Code is effective.</p>
<h2>Which organisations would be covered and why?</h2>
<p>The code is intended to address some clear online privacy dangers, while we await broader changes from the <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/">current review of the Privacy Act</a> that would apply across all sectors.</p>
<p>The OP Code would target online platforms that “collect a high volume of personal information or trade in personal information”, including:</p>
<ul>
<li><p>social media networks such as Facebook; dating apps like Bumble; online blogging or forum sites like Reddit; gaming platforms; online messaging and videoconferencing services such as WhatsApp and Zoom</p></li>
<li><p><a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">data brokers</a> that trade in personal information, including Quantium, Acxiom, Experian and Nielsen Corporation</p></li>
<li><p>other large online platforms that collect personal information and have more than 2.5 million annual users in Australia, such as Amazon, Google and Apple.</p></li>
</ul>
<p>The OP Code would impose higher standards for these companies than otherwise apply under the Privacy Act.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<h2>Higher standards for consent – maybe</h2>
<p>The OP Code would set out details about how these organisations must meet obligations under the Privacy Act. This would include higher standards for what constitutes users’ “consent” for how their data are used.</p>
<p>The government’s <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-explanatory-paper.pdf">explanatory paper</a> says the OP Code would require consent to be “voluntary, informed, unambiguous, specific and current”. (Unfortunately, the draft legislation itself doesn’t actually say that, and will require some amendment to achieve this.)</p>
<p>This description draws on the definition of consent in the European Union’s <a href="https://gdpr.eu/what-is-gdpr/">General Data Protection Regulation</a>.</p>
<p>In the EU, for example, <a href="https://gdpr-info.eu/issues/consent/">“unambiguous” consent</a> means a person must take clear, affirmative action – for instance by ticking a box or clicking a button – to consent to a use of their information. </p>
<p>Consent must also be “specific”, so companies cannot, for example, require consumers to consent to unrelated uses (such as market research) when their data is only needed to process a specific purchase.</p>
<h2>Requests to stop using and disclosing personal information</h2>
<p>The Australian Competition and Consumer Commission (ACCC) recommended that consumers should have a right to erase their personal data as a means of reducing the power imbalance between consumers and large platforms. In the EU, the “right to be forgotten” by search engines and the like is part of this erasure right. The government has not adopted this recommendation.</p>
<p>However, the OP Code would include an obligation for organisations to comply with a consumer’s reasonable request to stop using and disclosing their personal data. Companies would be allowed to charge a “non-excessive” fee for fulfilling these requests. This is a very weak version of the EU right to be forgotten.</p>
<p>For example, Amazon currently states in its <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=GX7NJQ4ZB8MHFRNJ#GUID-C3396B35-7018-45C5-999A-5989043DA870__SECTION_C877F3A6113249BF905B04840EFB3496">privacy policy</a> that it uses customers’ personal data in its advertising business and discloses the data to its vast Amazon.com corporate group. The proposed OP Code would mean Amazon would have to stop this, at a customer’s request, unless it had reasonable grounds for refusing.</p>
<p>Ideally, the code should also allow consumers to ask a company to stop <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">collecting their personal information from third parties</a>, as companies currently do to build profiles on us.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-one-simple-rule-change-could-curb-online-retailers-snooping-on-you-166174">How one simple rule change could curb online retailers' snooping on you</a>
</strong>
</em>
</p>
<hr>
<h2>Increased protections for children and vulnerable groups</h2>
<p>The draft bill also includes a vague provision for the OP Code to add protections for kids and other vulnerable people who are not capable of making their own privacy decisions.</p>
<p>A more controversial proposal would require new consents and verification for kids using social media services such as Facebook and WhatsApp. These services would be required to:</p>
<ul>
<li><p>take reasonable steps to verify the age of social media users</p></li>
<li><p>obtain parental consent before collecting, using or disclosing personal information of a child under 16</p></li>
<li><p>ensure their data practices are “fair and reasonable in the circumstances”, with the best interests of the child as the primary consideration.</p></li>
</ul>
<h2>What is ‘personal information’?</h2>
<p>A key tactic companies will likely use to avoid the new rules is to claim that the information they use is not truly “personal”, since the OP Code and the Privacy Act only apply to “personal information”, as defined in the Act. </p>
<p>The companies may claim the data they collect is only connected to our individual device, or to an online identifier they’ve allocated to us, rather than our legal name. However, the effect is the same: the data is used to build a more detailed profile of an individual and to target that individual.</p>
<p>Australia needs to update the definition of “personal information” to clarify it includes data such as IP addresses, device identifiers, location data, and any other online identifiers that may be used to identify an individual or to interact with them on an individual basis. Data should only be de-identified if no individual is identifiable from that data. </p>
<h2>Increased penalties and upgraded enforcement</h2>
<p>The government has pledged to give tougher powers to the privacy commissioner, and to hit companies with tougher penalties for breaching their obligations once the code comes into effect.</p>
<p>The maximum civil penalty for a serious and/or repeated interference with privacy will be increased to match the equivalent penalties under the Australian Consumer Law. </p>
<p>For individuals, the maximum penalty will increase to more than A$500,000. For corporations, the maximum will be the greatest of A$10 million, three times the value of the benefit received from the breach, or (if this value cannot be determined) 10% of the company’s annual turnover.</p>
<p>The privacy commissioner could also issue infringement notices for failing to provide relevant information to an investigation. The maximum penalty will be A$2,644 for individuals or A$13,320 for companies.</p>
<p>Such civil penalty provisions will make it unnecessary for the commissioner to resort to prosecution of a criminal offence, or to civil litigation, in these cases. </p>
<h2>Don’t hold your breath</h2>
<p>Once legislation is passed, it will take around 12 months for the code to be developed and registered.</p>
<p>The tech giants will have plenty of opportunity to delay this process. Companies are likely to challenge the content of the code, and whether they should even be covered by it at all.</p><img src="https://counter.theconversation.com/content/170711/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p><p class="fine-print"><em><span>Graham Greenleaf is a board member of the NGO, the Australian Privacy Foundation.</span></em></p>A proposed online privacy code would give consumers more control over how tech companies collect and use their dataKatharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyGraham Greenleaf, Professor of Law and Information Systems, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1701802021-10-25T15:03:34Z2021-10-25T15:03:34ZSmart doorbells: how to use them without infringing a neighbour’s privacy<figure><img src="https://images.theconversation.com/files/428274/original/file-20211025-19-1cnnke9.jpg?ixlib=rb-1.1.0&rect=10%2C0%2C7170%2C4791&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/woman-rings-house-intercom-camera-installed-1747420274">RossHelen/Shutterstock</a></span></figcaption></figure><p>As any local solicitor can tell you, some of the most bitter legal disputes originate from <a href="http://www.problemneighbours.co.uk/neighbourissuescategory.html">disagreements between neighbours</a>. Whether it’s property boundaries, loud music or parking spaces, what might initially be minor irritations can gradually lead to a full-blown court battle.</p>
<p>A relatively recent development in neighbour conflicts are clashes centred on home surveillance products, such as <a href="https://www.pcmag.com/picks/the-best-smart-home-security-systems">CCTV cameras</a> and <a href="https://www.pcmag.com/picks/the-best-video-doorbells">smart doorbells</a>. These technologies, which may capture footage beyond the householder’s property, can pit householders against neighbours who feel their homes and private lives are being unfairly spied upon.</p>
<p>Indeed, a UK judge <a href="https://www.bbc.co.uk/news/technology-58911296">recently ruled</a> that a man’s home security system invaded his neighbour’s privacy, and he now faces having to pay potential damages of <a href="https://www.manchestereveningnews.co.uk/news/uk-news/brits-video-doorbells-installed-incorrectly-21861280">up to £100,000</a>. So what are the privacy implications of this technology, and what do people need to know if they have, or are considering installing, a smart doorbell?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/zaos-deepfake-face-swapping-app-shows-uploading-your-photos-is-riskier-than-ever-122334">Zao's deepfake face-swapping app shows uploading your photos is riskier than ever</a>
</strong>
</em>
</p>
<hr>
<p>The use of surveillance technologies is governed by a range of measures. Some provide advice and guidance, like the surveillance camera <a href="https://www.gov.uk/government/publications/surveillance-camera-code-of-practice">code of practice</a>, which sets out <a href="https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/368115/Leaflet_v6_WEB.pdf">principles for operators</a> to follow. Others are legal requirements, such as the rules for collection and processing of personal data under the <a href="https://ico.org.uk/for-organisations/guide-to-data-protection/introduction-to-data-protection/about-the-dpa-2018/">Data Protection Act 2018</a> and the <a href="https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/">UK General Data Protection Regulation</a> (GDPR).</p>
<p>These measures aim to ensure that any use of surveillance technologies is for legitimate purposes, proportionate, and compliant with relevant legal obligations. A key concern is that surveillance should, as far as possible, be with the informed consent of those surveilled.</p>
<h2>Where do smart doorbells fit in?</h2>
<p>Purely “domestic use” of personal data by a private individual <a href="https://gdpr-info.eu/recitals/no-18/">is exempted</a> from the data protection legislation – for example your list of addresses for sending Christmas cards. But it’s well established that home surveillance systems, including CCTV and smart doorbells, are subject to UK data protection legislation. </p>
<p>A <a href="https://curia.europa.eu/juris/liste.jsf?num=C-212/13">key case</a> in 2014 looked at the use of a home CCTV system by a Mr Ryneš in the Czech Republic. The Court of Justice of the European Union (CJEU) held that while Ryneš’ CCTV system was installed for a legitimate purpose – the protection of his property and personal security – the data collection went beyond that permitted solely for domestic use. This is because it collected personal data from a public space, including a footpath and the entrance to his neighbour’s house opposite. </p>
<p>With this ruling, the CJEU confirmed that domestic surveillance systems fall within the scope of the data protection legislation where they capture data beyond the boundaries of the homeowner’s property. This interpretation remains applicable under UK law for now, although the UK government could potentially alter the scope of the “domestic exemption” now that the UK has left the EU.</p>
<figure class="align-center ">
<img alt="A person using a smartphone." src="https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/428125/original/file-20211024-19-1u9kr72.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Smart doorbells notify the homeowner via an app when someone is outside their property.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/young-teenager-girl-hand-using-mobile-1935518875">siamionau pavel/Shutterstock</a></span>
</figcaption>
</figure>
<p>While this case <a href="https://www.scotcourts.gov.uk/search-judgments/judgment?id=ecb629a7-8980-69d2-b500-ff0000d74aa7">and others</a> that have followed since didn’t involve smart doorbells specifically, the principle is the same. The case of <a href="https://www.judiciary.uk/wp-content/uploads/2021/10/Fairhurst-v-Woodard-Judgment-1.pdf">Fairhurst vs Woodard</a> in the English County Court in October 2021 confirms that the courts are likely to take a dim view of those who fail to use home surveillance equipment in a way that respects the rights of other people, including their neighbours. </p>
<p>Woodard installed a range of surveillance technology, including CCTV cameras and a smart doorbell, for home security purposes. But these could record video and audio well beyond the boundaries of his property. He then actively misled his neighbour, Fairhurst, as to how and when the cameras operated. The court found Woodard to have breached his data protection obligation to process data in a lawful and transparent way, and to have collected personal data without a specified or lawful purpose, as required by the Data Protection Act 2018 and the GDPR.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-does-gdpr-mean-for-me-an-explainer-96630">What does GDPR mean for me? An explainer</a>
</strong>
</em>
</p>
<hr>
<p>The court did recognise that home security could be a legitimate purpose for collection of data that would otherwise breach a neighbour’s right to privacy, if the collection was reasonable and proportionate for that purpose. For example, in relation to Woodard’s Amazon Ring doorbell, the court held that capture of incidental personal data (such as video of Fairhurst walking past) was permissible. However, the capture of audio at a significant distance exceeded what was reasonable for the purposes, as did the fact the system’s viewing range recorded large areas of Fairhurst’s property, including her side gate, garden and parking space.</p>
<p>It’s worth noting that Woodard’s use of his home surveillance system, and his interaction with Fairhurst concerning that use, also led to a successful action for harassment against him.</p>
<h2>Some tips</h2>
<p>If you’re considering installing a home surveillance system, such as a smart doorbell, you should:</p>
<ul>
<li><p>identify a clear and justified purpose for your use of CCTV, such as home security;</p></li>
<li><p>when purchasing a system, consider the scope of data it can capture, whether this is reasonable for your intended purpose, and if the system can be tailored to protect other people’s privacy rights. For example, with some systems it’s possible to disable audio, and to set “privacy” zones which are not recorded;</p></li>
<li><p>ensure there is signage stating recording is taking place, and why;</p></li>
<li><p>keep all data collected secure and accessible only to those who need it, and delete it when no longer needed;</p></li>
<li><p>comply with requirements of the Data Protection Act 2018 and the GDPR, such as responding to requests from individuals about data you may hold on them, and deleting data if requested to do so.</p></li>
</ul>
<p>The <a href="https://ico.org.uk/your-data-matters/domestic-cctv-systems-guidance-for-people-using-cctv/">Information Commissioner’s Office</a> has also produced some helpful advice for people installing home CCTV systems. </p>
<p>Pleasingly, providers are becoming more aware of the risks and requirements of home surveillance technologies and are building in new features which may encourage lawful use. For example, Amazon has recently added <a href="https://www.theverge.com/2021/7/13/22574629/ring-end-to-end-encryption-video-streams-us-global">end-to-end encryption</a> to its smart doorbell technologies. This aims to keep personal data captured secure against misuse by third parties by restricting access to video and audio streams to specified devices and permitted users.</p>
<hr>
<p><em>Correction: this article originally said that a man faced a £100,000 fine from a UK judge. This should have said damages instead of a fine, and has now been changed.</em></p><img src="https://counter.theconversation.com/content/170180/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Andrew Charlesworth does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A UK court recently ruled that a man’s smart doorbell invaded his neighbour’s privacy, and he now faces being required to pay damages. But this kind of situation is avoidable.Andrew Charlesworth, Professor of Law, Innovation and Society, University of BristolLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1653172021-08-04T20:08:28Z2021-08-04T20:08:28ZHow far should compulsory proof of vaccination go — and what rights do New Zealanders have?<figure><img src="https://images.theconversation.com/files/414224/original/file-20210802-26-gf5bdu.jpg?ixlib=rb-1.1.0&rect=9%2C0%2C6281%2C3791&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>With greater numbers of people being vaccinated and countries looking to reopen borders safely, the introduction of some form of <a href="https://www.nytimes.com/2021/02/04/travel/coronavirus-vaccine-passports.html">vaccine passport</a> seems increasingly likely.</p>
<p>For New Zealand, where the elimination strategy has been largely successful but which remains vulnerable to border breaches, proof of vaccination may well be a condition of entry. </p>
<p>Health Minister Chris Hipkins has said this would be “<a href="https://www.tvnz.co.nz/one-news/new-zealand/vaccination-status-may-factor-travellers-borders-open">almost an inevitability</a>” within the next year. <a href="https://www.rnz.co.nz/news/national/436921/air-nz-to-trial-digital-health-passport-app">Air New Zealand</a> is one of a number of airlines already trialling the <a href="https://www.iata.org/en/programs/passenger/travel-pass/">IATA travel pass initiative</a>. </p>
<p>Some countries are also requiring “<a href="https://www.bbc.com/news/world-europe-56522408">health passes</a>”, mandatory proof of vaccination or a negative test, including for indoor events (such as sports games and concerts) and hospitality — triggering <a href="https://www.washingtonpost.com/world/2021/07/31/coronavirus-protests-france-vaccine/">anti-restriction protests</a> in the process.</p>
<p>In Britain, the <a href="https://royalsociety.org/-/media/policy/projects/set-c/set-c-vaccine-passports.pdf?la=en-GB&hash=A3319C914245F73795AB163AD15E9021">Royal Society</a> has warned of the potential of vaccine passports to restrict the freedoms of some individuals, or to create a distinction between individuals based on health status. </p>
<p>Furthermore, vaccine passports use sensitive personal information, and recent <a href="https://www.rnz.co.nz/news/national/445735/waikato-dhb-ransomware-attack-documents-released-online">cyber attacks</a> on health sectors in New Zealand and <a href="https://www.irishtimes.com/news/health/hse-may-be-impacted-for-six-months-by-cyberattack-says-reid-1.4594901">overseas</a> are a reminder that data security is not always guaranteed.</p>
<h2>Vaccine passports aren’t new</h2>
<p>We should remember, however, that freedom of movement across borders has been routinely regulated <a href="https://www.theguardian.com/travel/2006/nov/17/travelnews">throughout history</a>. Modern passports for international travel have been in use for over 100 years. </p>
<p>Proof of vaccination is nothing new, either. Some countries have required certificates for <a href="https://www.who.int/ith/ith_country_list.pdf">yellow fever vaccination</a> for a number of decades, and the World Health Organization’s “yellow card” vaccination document is familiar to many international travellers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/frances-covid-health-pass-raises-serious-ethical-questions-165116">France’s Covid health pass raises serious ethical questions</a>
</strong>
</em>
</p>
<hr>
<p>In New Zealand, <a href="https://www.health.govt.nz/our-work/preventative-health-wellness/immunisation/national-immunisation-register">immunisation registers</a> document vaccination records for public health purposes. And in Australia, a “<a href="https://www.ncirs.org.au/public/no-jab-no-play-no-jab-no-pay">No Jab No Play/No Jab No Pay</a>” policy governs eligibility for child welfare payments.</p>
<h2>Rights and freedoms in a public health emergency</h2>
<p>The right to freedom of movement is recognised in the <a href="https://www.un.org/en/about-us/universal-declaration-of-human-rights">Universal Declaration of Human Rights</a> and the <a href="https://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx">International Covenant on Civil and Political Rights</a>, as well as in other core <a href="https://www.mfat.govt.nz/en/peace-rights-and-security/human-rights/#bookmark1">UN human rights treaties</a> that New Zealand has accepted. </p>
<p>The <a href="https://www.legislation.govt.nz/act/public/1990/0109/latest/DLM225517.html">New Zealand Bill of Rights Act</a> also includes the right to freedom of movement.</p>
<p>But border closures and lockdowns clearly demonstrate that this right can be limited, although this “must be necessary and have a legitimate aim, be proportionate and be based in law”. </p>
<p>In reality, any requirement that citizens use vaccine passports or passes will involve balancing various rights.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/before-we-introduce-vaccine-passports-we-need-to-know-how-theyll-be-used-156197">Before we introduce vaccine passports we need to know how they'll be used</a>
</strong>
</em>
</p>
<hr>
<h2>Competing rights and duties</h2>
<p>Limiting the right to freedom of movement can be justified on the grounds of
<a href="https://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx">public health</a>. The <a href="https://www.ohchr.org/en/professionalinterest/pages/cescr.aspx">International Covenant on Economic, Social and Cultural Rights</a> actually requires states to prevent, treat and control epidemic diseases as one means of ensuring the right to the highest attainable standard of health.</p>
<p>Because the <a href="https://www.legislation.govt.nz/act/public/1990/0109/latest/DLM225501.html">New Zealand Bill of Rights Act</a> also permits demonstrably justifiable restrictions on the right to freedom of movement, the various COVID-19 measures adopted by the government under the Health Act had to meet that requirement.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-would-digital-covid-vaccine-passports-work-and-whats-stopping-people-from-faking-them-156032">How would digital COVID vaccine passports work? And what's stopping people from faking them?</a>
</strong>
</em>
</p>
<hr>
<p>And <a href="https://nzhistory.govt.nz/politics/treaty/the-treaty-in-brief">Te Tiriti o Waitangi</a> underpins the <a href="https://www.health.govt.nz/our-work/populations/maori-health/he-korowai-oranga/strengthening-he-korowai-oranga/treaty-waitangi-principles">principle</a> of active protection that “requires the Crown to act, to the fullest extent practicable, to achieve equitable health outcomes for Māori”.</p>
<p>While the New Zealand Bill of Rights Act does not contain a right to privacy, one of the aims of the <a href="https://www.legislation.govt.nz/act/public/2020/0031/latest/LMS23227.html">Privacy Act</a> is to give effect to international obligations and standards, including the International Covenant on Civil and Political Rights. </p>
<h2>No grounds for discrimination</h2>
<p>Any initiatives to introduce vaccine passports or health passes must be underpinned by the right to be free from discrimination, as provided for in international human rights law, as well as the New Zealand Bill of Rights Act. The Human Rights Act prohibits discrimination on the basis of <a href="https://www.legislation.govt.nz/act/public/1993/0082/latest/DLM304474.html">physical illness</a>. </p>
<p>These requirements extend to the private sector, with particular rules at play around the provision of <a href="https://www.legislation.govt.nz/act/public/1993/0082/latest/DLM304620.html">goods and services</a> and access by the public to <a href="https://www.legislation.govt.nz/act/public/1993/0082/latest/DLM304617.html">places, vehicles and facilities</a> — an exception to the latter being the risk of infecting others with an illness. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/are-covid-19-vaccine-passports-fair-163838">Are COVID-19 vaccine passports fair?</a>
</strong>
</em>
</p>
<hr>
<p>But it’s not only a question of avoiding discrimination. Just as there have been questions about <a href="https://www.nzma.org.nz/journal-articles/will-access-to-covid-19-vaccine-in-aotearoa-be-equitable-for-priority-populations-open-access">equitable access</a> to COVID-19 vaccines themselves, universal access to the digital technology underpinning passports or passes presents a challenge.</p>
<p>More generally, as a recent <a href="https://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=26101">UN report warned</a>, “big data and artificial intelligence are entrenching racial inequality, discrimination and intolerance”. </p>
<p>Ultimately, it will be a balancing act, not a case of absolutes. </p>
<p>Digital vaccination certificates will help in the effort to reopen borders and protect public health. But there are significant implications for our rights as individuals. Careful and transparent decisions will be crucial.</p><img src="https://counter.theconversation.com/content/165317/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Claire Breen does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>With proof of vaccination likely to become mandatory for travel – and possibly other activities – a careful balancing of individual and collective rights will be essential.Claire Breen, Professor of Law, University of WaikatoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1640902021-07-26T15:23:29Z2021-07-26T15:23:29ZThe ‘privacy by design’ approach for mobile apps: why it’s not enough<figure><img src="https://images.theconversation.com/files/412626/original/file-20210722-13-o8wjl7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Mobile apps on smartphones are threats to digital privacy </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/8th-grade-teacher-uses-a-smartphone-to-mark-a-swahili-news-photo/1211271275?adppopup=true">Yasuyoshi Chiba/AFP via Getty Images </a></span></figcaption></figure><p>The mobile apps installed on our smartphones are one of the biggest threats to our <a href="https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp202_en.pdf">digital privacy</a>. They are capable of collecting vast amounts of personal data, often highly sensitive. </p>
<p>The consent model on which privacy laws are based doesn’t work. App users remain concerned about privacy, as a recent <a href="https://www.yellowbrick.com/press-releases/yellowbrick-survey-pandemic-era-consumers-love-apps-but-have-security-concerns/">survey</a> shows, but they still aren’t very good at protecting it. They may lack the technical know-how or the time to review privacy terms, or they may lack the willpower to resist the lure of trending apps and personalised in-app offers.</p>
<p>As a result, privacy laws have become more detailed, imposing additional requirements about notice, data minimisation, and user rights. Penalties have become harsher. And the laws are often global in reach, such as the <a href="https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule">US Children’s Online Privacy Protection Rule</a> and the EU’s <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679">General Data Protection Regulation</a>. For instance, a South African developer of an app downloaded by children in the US and the EU must comply with both and with <a href="https://www.gov.za/documents/protection-personal-information-act#:%7E:text=The%20Protection%20of%20Personal%20Information,by%20public%20and%20private%20bodies%3B&text=to%20regulate%20the%20flow%20of,provide%20for%20matters%20connected%20therewith.">South Africa’s Protection of Personal Information Act</a>. This complexity can create a significant compliance burden. </p>
<p>But the real problem, according to a <a href="https://www.enisa.europa.eu/publications/privacy-and-data-protection-in-mobile-applications">report</a> by the EU Agency for Cybersecurity, is that lawyers and app developers don’t speak the same language. An app developer may have no idea how to translate abstract legal principles into concrete engineering steps.</p>
<p>As a result, regulators have looked to the concept of <a href="https://iapp.org/media/pdf/resource_center/pbd_implement_7found_principles.pdf">“privacy by design”</a> as a way to bridge this divide. The concept was coined in the late 1990s by Ann Cavoukian when she was the Information and Privacy Commissioner for Ontario, Canada. Privacy by design goes beyond privacy policies and in-app permission settings. It requires developers to think about privacy from the first moment of the design process. </p>
<p>Cavoukian set out seven foundational principles for a privacy by design approach. But it is the second principle, “privacy as a default setting”, that really sets the bar for a privacy-friendly app.</p>
<blockquote>
<p>Build in the maximum degree of privacy into the default settings for any system or business practice. Doing so will keep a user’s privacy intact, even if they choose to do nothing.</p>
</blockquote>
<p>This places the responsibility on the app developer to think about the user’s privacy upfront, and design the app in such a way that privacy is protected automatically, while still offering a fully functional app experience.</p>
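As an illustration only (not from the article, and with hypothetical setting names), the "privacy as a default setting" principle can be encoded directly in an app's configuration: every data-sharing option defaults to off, so a user who never opens the settings screen shares nothing.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacySettings:
    """Hypothetical app settings where every sharing option defaults to OFF.

    A user who never touches the settings screen therefore shares nothing --
    privacy stays intact 'even if they choose to do nothing'.
    """
    share_location: bool = False
    share_analytics: bool = False
    personalised_ads: bool = False
    crash_reports: bool = False

    def opted_in(self):
        """Names of the options the user has explicitly enabled."""
        return [name for name, value in vars(self).items() if value]


# Default construction: no data-sharing is enabled.
defaults = PrivacySettings()
assert defaults.opted_in() == []

# Sharing happens only after an explicit, per-option opt-in.
after_consent = PrivacySettings(share_analytics=True)
```

The point of the sketch is that the protective state requires no action from the user; only departures from it do.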
<p>But <a href="https://researchspace.ukzn.ac.za/xmlui/handle/10413/19431">my research</a> showed that design decisions made by app developers are constrained by existing technologies and platform rules designed by others. These include the device hardware and operating system, the software development kit, ad libraries and app store review policies.</p>
<p>The answer is <a href="https://iapp.org/resources/article/06-22-2012-privacy-by-redesign-a-practical-framework-for-implementation/">privacy by (re)design</a>, where all role players in the ecosystem take privacy seriously and redesign existing platforms and technologies. But enforcing that approach will require tighter legal regulation of third party data sharing.</p>
<h2>Change of mindset</h2>
<p>Applying a privacy by design approach requires a change of mindset by developers. They must be proactive, rather than responding after the fact to a data breach that could have been prevented. The days of collecting as much personal data as possible in the hope that it might prove valuable later are gone. Developers must align data collection to a specific purpose for which the data is needed and communicate that to app users. They should also anonymise or delete the data as soon as possible. </p>
<p>Privacy should become a key component of design methodology, selection of technical tools, and organisational value statements.</p>
<p>These are important changes, endorsed in guidelines for mobile app developers published by the <a href="https://iapp.org/media/pdf/resource_center/gsmaprivacydesignguidelinesformobileapplicationdevelopmentv1%20%281%29.pdf">Global System for Mobile Communications</a> and by regulators in the <a href="https://www.ftc.gov/sites/default/files/documents/public_statements/privacy-design-and-new-privacy-framework-u.s.federal-trade-commission/120613privacydesign.pdf">US</a>, the <a href="https://ico.org.uk/media/for-organisations/documents/1596/privacy-in-mobile-apps-dp-guidance.pdf">UK</a>, <a href="https://www.oaic.gov.au/privacy/guidance-and-advice/mobile-privacy-a-better-practice-guide-for-mobile-app-developers/">Australia</a> and <a href="https://www.ipc.on.ca/wp-content/uploads/Resources/pbd-asu-mobile.pdf">Canada</a>, among others. In the EU “data protection by design and by default” is now <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679">a legal obligation</a> of the General Data Protection Regulation.</p>
<p>But, as my research shows, this might not be enough without the redesign of the app ecosystem to address data sharing, a view supported by other research. According to <a href="https://dl.acm.org/doi/10.1145/3201064.3201089">one study</a> most apps transmit data directly to third parties, like Google, Facebook and ad exchanges, via trackers embedded in the app code. But I found that privacy laws do not comprehensively or consistently address this third party sharing. </p>
<p>The term “third party” is not defined in the Protection of Personal Information Act, but would include ad networks, content-sharing sites and social networking platforms. Third parties are thus distinguished from downstream processors who may perform specified data processing on your behalf under a contract. </p>
<p>It is difficult to enforce legal liability against these third parties, who are often outside the country where the app was developed. Their terms and conditions typically place full responsibility for privacy compliance by the app on the app developer. This may leave app users unprotected. But it could also expose the app developer to unforeseen legal liability. </p>
<p>Liability for the app developer arises because, under both the Protection of Personal Information Act and the General Data Protection Regulation, if you played a role in determining “the purpose or means” of data processing, you are a “joint” responsible party (data controller) for the data processed by the third party. </p>
<p>The European Court of Justice has twice held small businesses liable as “joint controllers” for Facebook’s collection of data, via a <a href="https://curia.europa.eu/juris/liste.jsf?num=C-210/16">fan page</a> and a <a href="https://curia.europa.eu/juris/liste.jsf?num=C-40/17">like</a> button. Although the judgments stress that joint control is not necessarily “equal liability”, this should still be a concern for app developers.</p>
<p>For example, app developers using the Facebook Software Development Kit are sharing personal data with Facebook. Event logs such as “app installed”, “SDK initialised” and “app deactivated” give detailed demographic and behavioural insights about an app user. In 2018 Privacy International <a href="https://privacyinternational.org/report/2647/how-apps-android-share-data-facebook-report">reported</a> that the setting to delay transmission of logged events until after the user has consented was only added by Facebook 35 days after the General Data Protection Regulation came into force, and then only if enabled by the developer for SDK version 4.34 or higher. This change appears to have followed repeated bug reports filed on Facebook’s developer platform. </p>
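The consent-delay behaviour described above can also be approximated in the app's own code. The sketch below uses hypothetical names (it is not the Facebook SDK's actual API): lifecycle events are buffered on the device and released to a third-party logger only once the user opts in, or discarded if the user opts out.

```python
class ConsentGatedLogger:
    """Hypothetical sketch: buffer analytics events until the user decides.

    Events recorded before the consent decision stay local; they are flushed
    downstream only on opt-in, and dropped entirely on opt-out.
    """

    def __init__(self, transmit):
        self.transmit = transmit   # callable that sends one event downstream
        self.consented = None      # None = undecided, True/False = decided
        self._buffer = []

    def log(self, event):
        if self.consented:
            self.transmit(event)
        elif self.consented is None:
            self._buffer.append(event)  # hold locally until the user decides
        # consented is False: drop the event without transmitting it

    def set_consent(self, granted):
        self.consented = granted
        if granted:
            for event in self._buffer:
                self.transmit(event)
        self._buffer.clear()


# Usage: events logged before consent never reach the third party.
sent = []
logger = ConsentGatedLogger(sent.append)
logger.log("app installed")
logger.log("sdk initialised")
assert sent == []  # nothing transmitted yet
logger.set_consent(True)
assert sent == ["app installed", "sdk initialised"]
```

Making the undecided state behave like "no" is the privacy-by-default choice; transmitting first and asking later is the pattern the report criticised.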
<h2>Takeaways</h2>
<p>The takeaway here for developers following a privacy by design approach is to “<a href="https://iapp.org/media/pdf/resource_center/pbd_implement_7found_principles.pdf">trust but verify</a>”:</p>
<ul>
<li><p>Check contract terms and third party code carefully;</p></li>
<li><p>Monitor developer platforms for security and privacy updates;</p></li>
<li><p>Only work with organisations that offer adequate privacy guarantees;</p></li>
<li><p>Notify your users about data transfers to third parties and provide easy-to-use privacy controls;</p></li>
<li><p>Keep logs so that you can respond promptly if an app user requests details of the personal data you hold and the recipients (or categories of recipients) of that data.</p></li>
</ul>
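The last item in the list above can be as simple as an append-only disclosure log kept per user, so that an access request ("who received my data, and what?") can be answered from a single lookup. A minimal sketch, with hypothetical structure and field names:

```python
import datetime
from collections import defaultdict


class DisclosureLog:
    """Hypothetical append-only record of which recipients got a user's data."""

    def __init__(self):
        self._log = defaultdict(list)

    def record(self, user_id, recipient, categories):
        """Note one disclosure: who received it, what categories, and when."""
        self._log[user_id].append({
            "recipient": recipient,
            "categories": list(categories),
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

    def access_request(self, user_id):
        """Return every recorded disclosure about this user."""
        return list(self._log[user_id])


log = DisclosureLog()
log.record("user-42", "ad-network.example", ["device id", "coarse location"])
assert [d["recipient"] for d in log.access_request("user-42")] == ["ad-network.example"]
assert log.access_request("unknown-user") == []
```

In practice such a log would live in durable storage, but even this shape shows why recording recipients at disclosure time is far cheaper than reconstructing them after a request arrives.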
<p>Prosecuting app developers who breach data laws is important but not enough. Ultimately the parties who design the technologies and platforms on which mobile apps are built and marketed must be brought within the legal accountability framework to close the privacy loop.</p><img src="https://counter.theconversation.com/content/164090/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The financial assistance of the National Research Foundation (NRF) and University Capacity Development Programme (UCDP) is hereby gratefully acknowledged. Opinions expressed and conclusions arrived at are those of the author and are not to be attributed to the NRF. </span></em></p>Parties who design the technologies and platforms on which mobile apps are built and marketed must be brought within the legal accountability framework to close the privacy loop.Dusty-Lee Donnelly, Lecturer in Law & Advocate, High Court of South Africa, University of KwaZulu-NatalLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1634442021-06-25T15:26:40Z2021-06-25T15:26:40ZWhy Matt Hancock’s private life is very much in the public interest<p>The Sun newspaper’s publication of images from Matt Hancock’s office inside the UK Department of Health, which appear to show the health secretary in a <a href="https://www.thesun.co.uk/news/15392668/matt-hancock-breaks-silence-sorry-affair/">“steamy clinch”</a> with a colleague, has led to his resignation. But it has also once again raised the issue of privacy from press intrusion.</p>
<p>Allegations regarding the health secretary’s relationship with a university friend and aide, Gina Coladangelo, were featured on the front page of the newspaper and went viral within minutes of publication online.</p>
<p>It is alleged the couple, who are both married and have children, kissed and embraced in Hancock’s office in May, breaking COVID-19 social distancing rules put in place by his own department.</p>
<p>Before his resignation, Hancock initially apologised in a statement, saying: “I accept that I breached the social distancing guidance in these circumstances. I have let people down and am very sorry. I remain focused on working to get the country out of this pandemic, and would be grateful for privacy for my family on this personal matter.”</p>
<p>The exposure of a government minister’s alleged infidelities is nothing new. In journalistic terms, the story is a great example of old fashioned, tabloid newspaper reporting. But front-page stories like this are now something of a rarity as newspapers compete to publish first online, rarely holding back true exclusives for print.</p>
<p>Another reason is the tightening of privacy laws.</p>
<h2>How we are protected</h2>
<p>In 2000, the <a href="https://www.legislation.gov.uk/ukpga/1998/42/contents">Human Rights Act</a> incorporated the European Convention on Human Rights into UK law. Article 8 covers the right to privacy, stating: “Everyone has the right to respect for his private and family life, his home and his correspondence.”</p>
<p>In effect, Article 8 protects everyone’s privacy from interference by “public authority”, which includes the media. This extends to extramarital relationships, and in recent years judges have become more likely to rule adulterous affairs are private matters. </p>
<p>The 2008 case of the late Max Mosley set an important precedent in this area of law. The former Formula 1 boss <a href="https://www.theguardian.com/uk/2008/jul/24/mosley.privacy">successfully sued</a> the News of the World for breach of privacy after its exposé about his participation in sadomasochistic activities with prostitutes, effectively ruling what he did in his private life was exactly that – private.</p>
<p>But there are exceptions. In 2010, a temporary injunction was imposed on the media, banning reporting of married Chelsea footballer John Terry’s affair with a teammate’s girlfriend. It was later <a href="http://news.bbc.co.uk/1/hi/uk/8488232.stm">lifted</a> after a high court judge said rumours about the affair were already in the public domain, and that Terry was more concerned about his sponsorship deals than anything else.</p>
<p>Had Hancock been a relatively unknown government worker who applied for an injunction against The Sun preventing it from publishing the images, he would very likely have won his case. He’d have had a “reasonable expectation of privacy”.</p>
<p>But he wasn’t. He was a high profile, frontbench MP, elected by members of the public and leading the country’s response to a global health crisis which has killed millions of people. Here, the media has a very useful weapon in its armoury if it can persuade the court there is a strong public interest in publishing the story. This is also backed up by the journalists’ code, which sets out stringent professional standards for ethical practices in reporting.</p>
<h2>Codes of conduct</h2>
<p>The <a href="https://www.ipso.co.uk/editors-code-of-practice/">Editors’ Code of Practice</a> is set out by the Independent Press Standards Organisation (IPSO), and newspapers and magazines which sign up to be regulated by IPSO (including The Sun) agree to abide by the guidelines.</p>
<p>The code’s privacy clause is similar to Article 8 in its wording, stating that everyone is “entitled to respect for their private and family life.”</p>
<p>Editors are expected to justify intrusions into any individual’s private life without consent, but if the material complained about is already in the public domain – and there are claims the Hancock affair was being discussed in DHSC WhatsApp groups – any complaint could fail.</p>
<p>The code goes on to say that it is unacceptable to photograph individuals, without their consent, in public or private places where there is a “reasonable expectation of privacy”.</p>
<p>However, if newspaper editors can demonstrate publication both “serves” and “is proportionate to” the public interest, they can argue that public interest exceptions apply. </p>
<p>Among others, these exemptions include exposing serious crime or serious impropriety, protecting public health or safety, disclosing a person or organisation’s failure to comply with any obligation to which they are subject, raising or contributing to a matter of public debate – including impropriety and unethical conduct or incompetence concerning the public.</p>
<p>The newspaper would no doubt argue this story is very much in the public interest for several reasons: </p>
<p>The alleged affair has taken place on government property during working hours, calling into question Hancock’s focus while responsible for leading the government’s response to a global health crisis.</p>
<p>It also draws attention to the spending of taxpayers’ money on the appointment of Coladangelo to the Department of Health’s oversight board. It is alleged she was appointed by Hancock as a paid, non-executive director last September.</p>
<p>And, unique to this situation, it raises the question of hypocrisy on Hancock’s part. The footage allegedly shows him breaking social distancing and mask-wearing rules with someone from another household before such lockdown rules were lifted. </p>
<p>Whatever happens next, one thing is for sure – the public most certainly had an interest.</p>
<p><em>This article has been updated to reflect Matt Hancock’s resignation.</em></p><img src="https://counter.theconversation.com/content/163444/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Polly Rippon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A media law expert explains why the Sun was right to report on Health Secretary Matt Hancock’s personal life.Polly Rippon, University Teacher in Journalism, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1608462021-05-25T12:11:34Z2021-05-25T12:11:34ZBody cameras help monitor police but can invade people’s privacy<figure><img src="https://images.theconversation.com/files/400380/original/file-20210512-24-vmvfml.jpg?ixlib=rb-1.1.0&rect=36%2C13%2C2975%2C2063&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Police see some difficult scenes; body cameras can record those and make them public.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/diversey/48968390892/">Tony Webster via Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>In the course of their work, police officers encounter people who are intoxicated, distressed, injured or abused. The officers routinely ask for key identifying information like addresses, dates of birth and driver’s license numbers, and they frequently enter people’s homes and other private spaces. </p>
<p>With the advent of police body cameras, this information is often captured in police video recordings – which some states’ open-records laws make available to the public. </p>
<p>Starting in the summer of 2014, as part of research on police adoption of body-worn cameras within two agencies in Washington state, <a href="https://scholar.google.com/citations?user=7kICf7kAAAAJ&hl=en&oi=ao">I</a> spent hours <a href="https://www.repository.law.indiana.edu/ilj/vol92/iss4/2">riding in patrol vehicles</a>, hanging out at police stations, <a href="https://scholarship.law.unc.edu/nclr/vol96/iss5/8">interviewing officers</a>, observing police officers while they worked and <a href="https://doi.org/10.1177/1461444818786477">administering surveys</a>.</p>
<p>One of the most striking findings of my study was about the unintended effects of these cameras and associated laws. Body-worn cameras and freedom of information laws do enable oversight and accountability of the police. But, as I outline in my new book, “<a href="https://www.ucpress.edu/book/9780520382909/police-visibility">Police Visibility: Privacy, Surveillance, and the False Promise of Body-Worn Cameras</a>,” they also hold the potential to force sensitive data and stressful episodes in private citizens’ lives into public view, easily accessible online.</p>
<h2>Accountability, with visibility</h2>
<p>Body-worn cameras have been issued to <a href="https://bja.ojp.gov/program/body-worn-cameras-bwcs/overview">police all over the United States</a>, with a patchwork of regulations and laws governing their operation and the video they record. The goal is often to make officers accountable for their actions, though <a href="https://www.bloomberg.com/opinion/articles/2020-07-29/police-body-cameras-why-don-t-they-improve-accountability">their effectiveness at doing so has been questioned</a>. </p>
<p>Opinions and laws also differ on <a href="https://theconversation.com/police-and-civilians-disagree-on-when-body-camera-footage-should-be-made-public-157111">when body camera footage should be made public</a>. And, even when it is, interpreting what the footage depicts <a href="https://theconversation.com/from-rodney-king-to-george-floyd-how-video-evidence-can-be-differently-interpreted-in-courts-159794">can be complicated</a>. Nevertheless, the cameras have the potential to make police work, <a href="https://www.nytimes.com/2021/04/12/us/brooklyn-center-police-shooting-minnesota.html">including misconduct and police violence</a>, more visible.</p>
<p>I found that within weeks of adopting body-worn cameras, the police agencies I studied began receiving requests under local and state public records laws, seeking all of the footage recorded. In response, the departments began to release the videos, under the provisions of <a href="https://www.ncsl.org/research/civil-and-criminal-justice/body-worn-cameras-interactive-graphic.aspx">state public records laws</a> with few – if any – redactions to protect citizens’ sensitive personal information. The primary instigator of these initial requests <a href="https://www.wired.com/2015/05/the-body-cam-hacker-who-schooled-the-police/">posted the disclosed video to a publicly accessible YouTube channel</a>.</p>
<p>One patrol officer told me, “I personally would never provide my personal information to an officer with a camera. It all ends up on the internet. That is wrong and unsafe.”</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman gestures in a bedroom" src="https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/400591/original/file-20210513-20-6vorg0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An image from body-worn camera footage recorded during a prostitution sting in Bellingham, Wash., which later appeared on YouTube.com. The young woman’s face is obscured in this image to help preserve her privacy.</span>
<span class="attribution"><span class="source">Bryce Newell</span></span>
</figcaption>
</figure>
<h2>‘Say hi to the camera, honey!’</h2>
<p>One winter afternoon in 2015, I accompanied a Spokane, Washington police officer on a domestic violence call. After parking by the curb, we walked up the driveway to where a man was standing. </p>
<p>The officer I was shadowing turned on his body camera and informed the man that he had activated his camera and would be recording their conversation. </p>
<p>The man we had approached yelled down the driveway to his wife, “Smile and say hi to the camera, honey!” </p>
<p>The woman had allegedly taken a metal baseball bat and smashed in the man’s face across his eye. He had blood leaking from his eye and eyebrow and rolling down his nose and cheek. His eyebrow looked caved in; the bone was obviously broken. After a few minutes of questioning, the medics arrived and quickly rushed him to the ambulance. </p>
<p>The officer and I followed them to the ambulance, where the officer continued to question the injured man, seeking to get a statement or confession out of him on camera. His body camera continued to record everything in front of the officer, including the man and the inside of the ambulance.</p>
<p>When the ambulance left, we entered the home, where the woman was being questioned. The officer continued to record in case the woman might offer her own statement or confession.</p>
<p>Although much of what was recorded on the officer’s camera in this case occurred outside, within view of neighbors and others present on the street, it still was a traumatic, personal and embarrassing moment in the lives of both victim and alleged offender. </p>
<p>But the fact that a camera recorded it made these events much more visible, to a wider audience, for a longer time. Officers sometimes showed each other videos at the end of their shifts while writing reports, often simply to decompress after a long shift or bond with their colleagues. In addition, the footage could potentially become public under the state open records laws in effect at the time it was recorded.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Three images, one with a man with his arms spread wide, then the man running away, then a police officer with a Taser pointed at the man" src="https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=146&fit=crop&dpr=1 600w, https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=146&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=146&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=184&fit=crop&dpr=1 754w, https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=184&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/400593/original/file-20210513-21-tagp7s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=184&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">These screen captures are from a body-worn camera video recorded during a police contact and foot chase in Bellingham, Wash. Faces have been obscured.</span>
<span class="attribution"><span class="source">Bryce Newell</span></span>
</figcaption>
</figure>
<h2>‘Maybe I should stop drinking’</h2>
<p>On another winter evening, I found myself standing inside another couple’s living room with two officers as the man and woman, separately, tried to explain why the wife had called 911 and accused the husband of threatening violence. </p>
<p>The husband was drunk – and drinking continuously while talking to the officer, who was wearing a camera on his chest. He told a rambling story about how much trouble his wife had caused him over the years, musing that perhaps he should leave her and move on, though perhaps he still loved her. On the other hand, he said, she had caused him nothing but grief and made his life miserable. Moments later, he continued, “Maybe what I really should do is stop drinking,” and he took another sip from his beer can.</p>
<p>Even if he had been sober, he probably would not have realized that this conversation might end up on YouTube with virtually unlimited visibility. If he had, would he or his wife have let the police into their house in the first place? Would the wife even have called to report her husband’s threats? </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A police officer gives a field sobriety test to a person" src="https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/400594/original/file-20210513-20-1ro1s3o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">This image is from body-worn camera footage of a field sobriety test in Bellingham, Wash., which later appeared on YouTube.com.</span>
<span class="attribution"><span class="source">Bryce Newell</span></span>
</figcaption>
</figure>
<p>There are potential social costs to deploying body-worn cameras, including possible invasions of privacy when sensitive moments are recorded or made public, and increasing police surveillance of communities already subjected to heightened police attention. When body cameras are introduced, careful attention to existing laws and policies, including public records laws, can help minimize harm to the public while increasing the transparency of police work. </p>
<p>As I discuss in <a href="https://www.ucpress.edu/book/9780520382909/police-visibility">my book</a>, one possible solution could be redacting personal information about victims, witnesses, bystanders and even suspects, as long as it is not related to law enforcement officer conduct. Other options include creating independent oversight groups to review footage before its release, giving victims and their families access to footage, and erring on the side of nondisclosure when body cameras record in private spaces or in particularly sensitive contexts. </p>
<p>I believe these are possible without limiting public access to procedural information about how officers conduct their activities, to enable oversight and accountability. </p>
<p>Just as <a href="https://theconversation.com/why-cellphone-videos-of-black-peoples-deaths-should-be-considered-sacred-like-lynching-photographs-139252">videos of Black people’s deaths at the hands of the police should be treated with more care</a>, the decision to make police video that captures sensitive and traumatic moments of people’s lives public should be a measured and considered one. In my view, there is little need to force civilians onto the public stage simply because they are contacted by a police officer.</p>
<p class="fine-print"><em><span>Bryce C. Newell received funding for some parts of this research from the University of Washington's Information School and the Dutch Research Council (NWO). </span></em></p>Police body cameras have the potential to make private details about people’s lives, including some of the most stressful experiences of their lives, public and easily accessible onlineBryce C. Newell, Assistant Professor of Media Law and Policy, University of OregonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1498042021-03-23T12:32:00Z2021-03-23T12:32:00ZPrivacy may be under threat, but its protection alone isn’t enough to preserve civil liberties<figure><img src="https://images.theconversation.com/files/390448/original/file-20210318-23-ybwzfq.jpg?ixlib=rb-1.1.0&rect=0%2C4%2C2995%2C1989&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Demonstrators shine their cellphones during a protest in St. Louis in 2020.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/protesters-light-up-their-cell-phones-during-a-protest-news-photo/1228695076?adppopup=true">Michael B. Thomas/Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>While the battle over privacy is everywhere in American life, it’s actually a relatively new concept that didn’t become grounded in law until over a century after the Declaration of Independence. </p>
<p>Privacy is supposedly a core American value, forged in the country’s founding. For example, <a href="https://www.google.com/books/edition/American_Privacy/b7CE5PqvVw8C?hl=en">historians claim</a> that privacy concerns drove the American Revolution. Colonists were reacting to British troops invading their warehouses and shops in search of taxable goods, and to British demands that the Colonists shelter soldiers in their homes. </p>
<p>And today, <a href="https://www.aclu.org/blog/national-security/qa-daniel-solove-how-bad-security-arguments-are-undermining-our-privacy">civil liberties advocates argue</a> that democracy requires privacy. They believe privacy is necessary to create independent-minded, free-thinking citizens who vote as they wish.</p>
<p>Yet the term “privacy” is not mentioned in the Constitution. A <a href="https://doi.org/10.2307/1321160">legal right to privacy</a> wasn’t articulated until 1890. And it came to be robustly <a href="https://supreme.justia.com/cases/federal/us/381/479/">defended by the Supreme Court</a> only in the 1960s. </p>
<p>These are among the many things I discovered while researching “<a href="https://www.cambridge.org/us/academic/subjects/law/e-commerce-law/life-after-privacy-reclaiming-democracy-surveillance-society?format=PB&isbn=9781108811910">Life after Privacy: Reclaiming Democracy in a Surveillance Society</a>,” which explores the nature of privacy, its history and its uncertain future. I also learned that privacy remains an ill-formed and embattled concept. </p>
<h2>Why it matters</h2>
<p><a href="https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/">Americans feel</a> their privacy is gravely endangered in the digital age. Corporations use increasingly sophisticated methods of data collection to <a href="https://www.forbes.com/sites/ianmorris/2016/12/31/facebook-knows-when-you-fall-in-love-and-thats-pretty-creepy/?sh=21021cf6f525">analyze and influence people’s behavior</a>.</p>
<p>This ability can be used both to bolster and hamper democracy. For example, Facebook used its deep knowledge of user data to <a href="https://www.nature.com/news/facebook-experiment-boosts-us-voter-turnout-1.11401">boost voter turnout</a> in 2010. Four years later, data firm Cambridge Analytica used the same technique to <a href="https://www.theguardian.com/uk-news/2018/mar/23/leaked-cambridge-analyticas-blueprint-for-trump-victory">target voters</a> with Donald Trump campaign ads.</p>
<p>In my research, I learned that political liberty relies much less on privacy than on people’s ability and willingness to demonstrate and deliberate in the public realm. By that I mean protecting privacy alone will not secure the freedom of consumers and citizens. I believe people need to use the power of public protest to gain and maintain their civil liberties.</p>
<p>The <a href="https://daily.jstor.org/the-stonewall-riots-didnt-start-the-gay-rights-movement/">gay rights movement</a> demonstrated this power in the past century. Throughout the 20th century in much of America, people were <a href="https://journalofethics.ama-assn.org/article/decriminalization-sodomy-united-states/2014-11">prosecuted for homosexual behavior</a> in their private lives. The aggressive work of <a href="https://www.nytimes.com/interactive/2020/04/13/t-magazine/act-up-aids.html">ACT UP</a> and other gay rights activist groups led to legal protections for people to live and love as they wished. And in 2003, the Supreme Court <a href="https://supreme.justia.com/cases/federal/us/539/558/">overruled all state laws</a> that had prohibited homosexuality. </p>
<p>Civil and <a href="https://www.history.com/topics/19th-century/labor">labor rights campaigns</a> in the 20th century had similar outcomes. Despite civil rights leaders’ being <a href="https://taylorbranch.com/king-era-trilogy/parting-the-waters/">spied on and hounded</a> from the start, they used their power of coordination and public organizing to overcome their lack of privacy. Their organizational roots, built over many decades, enabled them to withstand repeated assault and launch <a href="https://www.history.com/topics/black-history/the-greensboro-sit-in">disciplined</a>, <a href="https://www.biography.com/news/black-history-birmingham-childrens-crusade-1963">creative</a> protests. </p>
<p>In other words, privacy is not so much a prerequisite for democracy as it is a product of democratic action. </p>
<h2>What still isn’t known</h2>
<p>It is still unclear how digital technology has changed the nature of political protest, and whether it has made it more or less effective. </p>
<p>As scholar <a href="https://yalebooks.yale.edu/book/9780300259292/twitter-and-tear-gas">Zeynep Tufekci notes</a>, modern, internet-fueled “networked protests” like <a href="https://www.washingtonpost.com/national/on-leadership/what-is-occupy-wall-street-the-history-of-leaderless-movements/2011/10/10/gIQAwkFjaL_story.html">Occupy Wall Street</a> and the <a href="https://www.aljazeera.com/news/2020/12/17/what-is-the-arab-spring-and-how-did-it-start">Arab Spring</a> used social media to quickly organize massive protests, but with <a href="https://www.vice.com/en/article/nze9em/twitter-makes-it-easy-to-start-a-revolution-without-finishing-it">limited long-term success</a>. </p>
<h2>What’s next</h2>
<p>Digital technology has changed Americans’ behavior in surprising ways, including when it comes to privacy. People share intimate details about their lives on social media. Meanwhile, digital media <a href="https://www.pnas.org/content/118/9/e2023301118">has also given rise</a> to hardened partisanship and political radicalization.</p>
<p>I believe philosophers need to look ahead and consider what other new behaviors digital technology is inspiring. Perhaps consumers and citizens will become more predictable, as <a href="https://www.theguardian.com/books/2019/feb/02/age-of-surveillance-capitalism-shoshana-zuboff-review">data analysts believe</a>. Alternatively, people may rise up and rebel against constant surveillance and the efforts of spying governments and marketers to control them.</p>
<p class="fine-print"><em><span>Firmin DeBrabander does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A privacy expert says citizens will need to exercise their right to public protest if they want to preserve their privacy.Firmin DeBrabander, Professor of Philosophy, Maryland Institute College of ArtLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1495352020-11-11T02:53:50Z2020-11-11T02:53:50Z83% of Australians want tougher privacy laws. Now’s your chance to tell the government what you want<figure><img src="https://images.theconversation.com/files/368451/original/file-20201110-24-98dvaw.jpg?ixlib=rb-1.1.0&rect=17%2C0%2C5973%2C3988&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Federal Attorney-General Christian Porter has <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">called for submissions</a> to the long-awaited review of the federal Privacy Act 1988.</p>
<p>This is the first wide-ranging review of privacy laws since the Australian Law Reform Commission produced a <a href="https://www.alrc.gov.au/publication/for-your-information-australian-privacy-law-and-practice-alrc-report-108/">landmark report</a> in 2008.</p>
<p>Australia has in the past often hesitated to adopt a strong privacy framework. The new review, however, provides an opportunity to improve data protection rules to an internationally competitive standard. </p>
<p>Here are some of the ideas proposed — and what’s at stake if we get this wrong.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-privacy-invasion-to-be-a-legal-wrong-31288">It's time for privacy invasion to be a legal wrong</a>
</strong>
</em>
</p>
<hr>
<h2>Australians care deeply about data privacy</h2>
<p>Personal information has never had a more central role in our society and economy, and the government has a strong mandate to update Australia’s framework for the protection of personal information. </p>
<p>In the Australian Privacy Commissioner’s 2020 survey, <a href="https://www.oaic.gov.au/assets/engage-with-us/research/acaps-2020/Australian-Community-Attitudes-to-Privacy-Survey-2020.pdf">83% of Australians said they’d like the government to do more</a> to protect the privacy of their data. </p>
<p>The intense debate about the COVIDSafe app <a href="https://auspublaw.org/2020/05/covidsafe-and-identity-governance-beyond-privacy/">earlier this year</a> also shows Australians care deeply about their private information, even in a time of crisis. </p>
<p>Privacy laws and enforcement can hardly keep up with the ever-increasing digitalisation of our lives. Data-driven innovation provides valuable services that many of us use and enjoy. However, the government’s <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review--issues-paper-october-2020.pdf">issues paper</a> notes: </p>
<blockquote>
<p>As Australians spend more of their time online, and new technologies emerge, such as artificial intelligence, more personal information about individuals is being captured and processed, raising questions as to whether Australian privacy law is fit for purpose.</p>
</blockquote>
<p>The pandemic has accelerated the existing trend towards digitalisation and created a range of <a href="https://iapp.org/resources/article/white-paper-privacy-risks-to-individuals-in-the-wake-of-covid-19/">new privacy issues</a> including working or studying at home, and the use of personal data in contact tracing.</p>
<p>Australians are <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">rightly concerned</a> they are losing control over their personal data. </p>
<p>So there’s no question the government’s review is sorely needed. </p>
<h2>Issues of concern for the new privacy review</h2>
<p>The government’s review follows the Australian Competition and Consumer Commission’s <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">Digital Platforms Inquiry</a>, which found that some data practices of digital platforms are unfair and undermine consumer trust. We rely heavily on digital platforms such as Google and Facebook for information, entertainment and engagement with the world around us. </p>
<p>Our interactions with these platforms leave countless digital traces that allow us to be <a href="https://theconversation.com/heres-how-tech-giants-profit-from-invading-our-privacy-and-how-we-can-start-taking-it-back-120078">profiled and tracked</a> for profit. The Australian Competition and Consumer Commission (ACCC) <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">found</a> that the digital platforms make it hard for consumers to resist these practices and to make free and informed decisions regarding the collection, use and disclosure of their personal data. </p>
<p>The government has <a href="https://www.communications.gov.au/departmental-news/government-response-accc-digital-platforms-inquiry">committed</a> to implement most of the ACCC’s <a href="https://theconversation.com/the-federal-governments-response-to-the-acccs-digital-platforms-inquiry-is-a-let-down-128775">recommendations for stronger privacy laws</a> to give us greater consumer control.</p>
<p>However, the reforms must go further. The review also provides an opportunity to address some long-standing weaknesses of Australia’s privacy regime.</p>
<p>The government’s <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review--issues-paper-october-2020.pdf">issues paper</a>, released to inform the review, identified several areas of particular concern. These include:</p>
<ul>
<li><p>the scope of application of the Privacy Act, in particular the definition of “personal information” and current private sector exemptions</p></li>
<li><p>whether the Privacy Act provides an effective framework for promoting good privacy practices</p></li>
<li><p>whether individuals should have a direct right to sue for a breach of privacy obligations under the Privacy Act</p></li>
<li><p>whether a statutory tort for serious invasions of privacy should be introduced into Australian law, allowing Australians to go to court if their privacy is invaded</p></li>
<li><p>whether the enforcement powers of the Privacy Commissioner should be strengthened.</p></li>
</ul>
<p>While most of the recent attention has focused on improving consumers’ choice and control over their personal data, the review also brings back onto the agenda some never-implemented recommendations from the Australian Law Reform Commission’s 2008 review. </p>
<p>These include introducing a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3705881">statutory tort for serious invasions of privacy</a>, and extending the coverage of the Privacy Act.</p>
<h2>Exemptions for small business and political parties should be reviewed</h2>
<p>The Privacy Act currently contains several exemptions that limit its scope. The two most contentious exemptions have the effect that political parties and most business organisations need not comply with the general data protection standards under the Act.</p>
<p>The small business exemption is intended to reduce red tape for small operators. However, <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">largely unknown</a> to the Australian public, it means the vast majority of Australian businesses are not legally obliged to comply with standards for fair and safe handling of personal information.</p>
<p>Procedures for compulsory venue check-ins under COVID health regulations are just one recent illustration of why this is a problem. Some people have raised <a href="https://www.abc.net.au/news/2020-10-31/covid-19-check-in-data-using-qr-codes-raises-privacy-concerns/12823432">concerns</a> that customers’ contact-tracing data, in particular collected via QR codes, may be exploited by marketing companies for targeted advertising.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman uses a QR code at a restaurant" src="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=245&fit=crop&dpr=1 600w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=245&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=245&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=308&fit=crop&dpr=1 754w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=308&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/368460/original/file-20201110-20-93cuui.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=308&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Under current privacy laws, cafe and restaurant operators are exempt from complying with certain privacy obligations.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Under current privacy laws, cafe and restaurant operators are generally exempt from privacy obligations, such as undertaking due diligence checks on the third-party providers they use to collect customers’ data.</p>
<p>The political exemption is another area in need of reform. As the <a href="https://www.theguardian.com/technology/2020/sep/14/facebook-suffers-blow-in-australia-legal-fight-over-cambridge-analytica">Facebook/Cambridge Analytica scandal</a> showed, political campaigning is becoming <a href="https://tacticaltech.org/#/projects/data-politics">increasingly tech-driven</a>.</p>
<p>However, Australian political parties are exempt from complying with the Privacy Act and anti-spam legislation. This means voters cannot effectively protect themselves against data harvesting for political purposes and micro-targeting in election campaigns <a href="https://theconversation.com/how-political-parties-legally-harvest-your-data-and-use-it-to-bombard-you-with-election-spam-148803">through unsolicited text messages</a>. </p>
<p>There is a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717561">good case for arguing</a> political parties and candidates should be subject to the same rules as other organisations. It’s what most Australians would like and, in fact, <a href="https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page/2020-australian-community-attitudes-to-privacy-survey/">wrongly believe is already in place</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-political-parties-legally-harvest-your-data-and-use-it-to-bombard-you-with-election-spam-148803">How political parties legally harvest your data and use it to bombard you with election spam</a>
</strong>
</em>
</p>
<hr>
<h2>Trust drives innovation</h2>
<p>Trust in digital technologies is undermined when data practices come across as opaque, <a href="https://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1098&context=yjolt">creepy</a> or unsafe. </p>
<p>There is increasing recognition that data protection <a href="https://www.researchgate.net/publication/314636773_Privacy_and_Innovation_From_Disruption_to_Opportunities">drives innovation</a> and adoption of modern applications, rather than impedes it. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman looks at her phone in the twilight." src="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/368471/original/file-20201110-18-12myfs6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Trust in digital technologies is undermined when data practices come across as opaque, creepy, or unsafe.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The <a href="https://www.covidsafe.gov.au/">COVIDSafe app</a> is a good example.
When that app was debated, the government accepted that <a href="https://www.oaic.gov.au/privacy/guidance-and-advice/privacy-obligations-regarding-covidsafe-and-covid-app-data/">robust privacy protections</a> were necessary to achieve a strong uptake by the community. </p>
<p>We would all benefit if the government saw that this same principle applies to other areas of society where our precious data is collected.</p>
<hr>
<p><em>Information on how to make a submission to the federal government review of the Privacy Act 1988 can be found <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">here</a>.</em></p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/people-want-data-privacy-but-dont-always-know-what-theyre-getting-143782">People want data privacy but don't always know what they're getting</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Normann Witzleb consults to the Office of the Australian Information Commissioner. He has received research funding from the International Association of Privacy Professionals.</span></em></p>Australia has hesitated in the past to adopt a strong privacy framework. A new government review provides an opportunity to improve data protection rules to an internationally competitive standard.Normann Witzleb, Associate Professor in Law, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1478052020-10-15T12:38:23Z2020-10-15T12:38:23ZWhat is HIPAA? 5 questions answered about the medical privacy law that protects Trump’s test results and yours<figure><img src="https://images.theconversation.com/files/363533/original/file-20201014-21-q2f1h4.jpg?ixlib=rb-1.1.0&rect=473%2C0%2C6875%2C4912&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Doctors can share your medical information, with your permission.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/bad-news-royalty-free-image/486418295">sturti/E+ via Getty Images</a></span></figcaption></figure><p><em>When President Trump was hospitalized with COVID-19, his doctor pointed to “<a href="https://www.cnn.com/2020/10/07/politics/hipaa-trump-conley/index.html">HIPAA rules and regulations</a>” as the reason he couldn’t speak more freely about Trump’s condition. HIPAA is a medical privacy law, but people often misunderstand what it does and doesn’t do.</em></p>
<p><em>Margaret Riley is a <a href="https://www.law.virginia.edu/faculty/profile/mf9c/1202931">law professor at the University of Virginia</a> who specializes in health law. She spends a lot of time teaching future lawyers and medical professionals how medical privacy laws work. Here are the basics.</em></p>
<h2>1. What is HIPAA and why did Congress pass it?</h2>
<p>The <a href="https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html">Health Insurance Portability and Accountability Act’s</a> Privacy Rule is a federal law that <a href="https://www.hipaajournal.com/when-was-hipaa-enacted/">went into force in 2003</a>. The need for such a law had been underscored when tennis star <a href="https://www.nytimes.com/1992/04/09/sports/an-emotional-ashe-says-that-he-has-aids.html">Arthur Ashe’s HIV status was revealed publicly</a> and country music star <a href="http://www.cmt.com/news/1475729/medical-worker-sentenced-over-wynette-medical-records/">Tammy Wynette’s health records were sold</a> to tabloids for a few thousand dollars. People were also starting to worry about genetic privacy. And Congress recognized that the internet would make it easier for health care privacy breaches to occur.</p>
<p>The law prohibits health care providers and businesses and people working with them – including administrative staff, laboratories, pharmacies, health insurers and so on – from disclosing your health information without your permission. That includes information about your COVID-19 symptoms and test results – though there are some exceptions.</p>
<h2>2. Is all my medical info protected by HIPAA?</h2>
<p>No, HIPAA protects only health care information that is held by specific kinds of health care providers. For example, health care data that may be on your Apple Watch or Fitbit are usually not covered by HIPAA. Similarly, genetic data you enter on websites like Ancestry.com are not covered by HIPAA.</p>
<p>Even some apps that do things like help you maintain your blood sugar may not be covered by HIPAA if you aren’t using them at the direction of your health care provider. Other laws or agreements like the privacy disclosures required on many apps (although <a href="https://theconversation.com/nobody-reads-privacy-policies-heres-how-to-fix-that-81932">many people don’t read them</a>) may protect that information, but HIPAA does not.</p>
<p>Employers are generally not covered health providers, so HIPAA does not apply to them. If necessary to protect others, your workplace could disclose that you have an illness. That said, other laws like the Americans with Disabilities Act may prevent your employer from disclosing identifiable health information about you that you may have shared with them.</p>
<h2>3. Who can disclose what under HIPAA?</h2>
<p>HIPAA gives you the right to control your health information disclosures so you can tell your health care provider what to share. </p>
<p>For example, you may be willing to have your health care provider share some of your health information with family members, but you might not want to share all of it; you can tell your health care provider not to share any stigmatizing information or procedures that your family might not know about. You need to be very clear with your health care provider if you want to exclude some information. Some disclosures, such as sharing psychotherapy notes or giving your data to marketing companies, require written authorization. </p>
<p>Sometimes people try to use HIPAA as an excuse for actions that it doesn’t in fact cover. In 2020, for instance, some people confronted with rules about wearing masks in stores asserted that they didn’t need to wear one and <a href="https://www.usatoday.com/story/news/factcheck/2020/07/19/fact-check-asking-face-masks-wont-violate-hipaa-4th-amendment/5430339002/">didn’t need to explain why because of HIPAA</a>. That’s not actually how this privacy law works.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="exterior of a medical center with mask sign" src="https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/363537/original/file-20201014-21-1mxfqp8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Even during the pandemic, your personal medical information is largely protected.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/bad-news-royalty-free-image/486418295">Spencer Platt/Getty Images News via Getty Images</a></span>
</figcaption>
</figure>
<h2>4. Could my health care provider be required to disclose any of my info without my permission?</h2>
<p><a href="https://www.hhs.gov/hipaa/for-individuals/guidance-materials-for-consumers/index.html">There are exceptions</a> to HIPAA’s nondisclosure requirements. For example, HIPAA regulations allow covered health care providers to disclose patient information to help treat another person, to protect public health and for certain law enforcement purposes.</p>
<p>There are additional exceptions that apply during a pandemic. For instance, while health departments may have access to information about people in their district who’ve tested positive for COVID-19, HIPAA and other privacy laws require them not to release any more information than is needed to keep people safe. So, health departments will provide information about how many people have tested positive and how many people are hospitalized, but they won’t release any names to the general public. Health department contact tracers may reveal identities of individuals if it’s really necessary to alert specific people that they may have been exposed.</p>
<p>[<em>Deep knowledge, daily.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=deepknowledge">Sign up for The Conversation’s newsletter</a>.]</p>
<p>HIPAA covers President Trump just as it does you and me. There may be good reasons that people want to know more about the president’s health, but his health providers can provide the public only with information about his health that he has allowed them to share. They shouldn’t say anything that isn’t true, but they can certainly omit information.</p>
<h2>5. What if someone violates my rights under HIPAA?</h2>
<p>Only the government can bring a claim if an individual’s protected health information is breached. So to bring a federal claim, you would need to work with the Office of Civil Rights at the U.S. Department of Health and Human Services. You may be able to sue under state law and use the breach of your HIPAA rights as evidence.</p>
<p>Some people who are particularly worried about their privacy may ask health care providers to sign a nondisclosure agreement that gives them additional claims and the right to sue directly if there is a breach.</p>
<p class="fine-print"><em><span>Margaret Riley does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A health law expert explains what the regulation does and doesn’t protect.Margaret Riley, Professor of Law, Public Health Sciences, and Public Policy, University of VirginiaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1462922020-09-16T07:30:46Z2020-09-16T07:30:46ZTowards a post-privacy world: proposed bill would encourage agencies to widely share your data<p>The federal government has <a href="https://www.abc.net.au/news/2020-09-16/government-draft-law-share-personal-data-between-agencies/12666792">announced a plan</a> to increase the sharing of citizen data across the public sector. </p>
<p>This would include data sitting with agencies such as Centrelink, the Australian Tax Office, the Department of Home Affairs, the Bureau of Statistics and potentially other external “accredited” parties such as universities and businesses. </p>
<p>The draft <a href="https://www.datacommissioner.gov.au/data-sharing/legislation">Data Availability and Transparency Bill</a> released today will not fix ongoing problems in public administration. It won’t solve many problems in public health. It is a worrying shift to a post-privacy society. </p>
<p>The plan is more a matter of arrogance than of effectiveness, and it highlights deficiencies in Australian law that need fixing.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australians-accept-government-surveillance-for-now-110789">Australians accept government surveillance, for now</a>
</strong>
</em>
</p>
<hr>
<h2>Making sense of the plan</h2>
<p>Australian governments on all levels have built huge silos of information about us all. We supply the data for these silos each time we deal with government. </p>
<p>It’s difficult to exercise your rights and responsibilities without providing data. If you’re a voter, a director, a doctor, a gun owner, on welfare, pay tax, have a driver’s licence or Medicare card – our governments have data about you. </p>
<p>Much of this is supplied on a legally mandatory basis. It allows the federal, state, territory and local governments to provide pensions, elections, parks, courts and hospitals, and to collect rates, fees and taxes. </p>
<p>The proposed Data Availability and Transparency Bill will authorise large-scale sharing of data about citizens and non-citizens across the public sector, between both public and private bodies. Previously called the “<a href="https://www.datacommissioner.gov.au/sites/default/files/2019-09/Data%20Sharing%20and%20Release%20Legislative%20Reforms%20Discussion%20Paper%20-%20Accessibility.pdf">Data Sharing and Release</a>” legislation, the word “transparency” has now replaced “release” to allay public fears.</p>
<p>The legislation would allow sharing between Commonwealth government agencies that are currently constrained by a range of acts overseen (weakly) by the <a href="https://www.zdnet.com/article/senators-concerned-oaic-will-remain-under-resourced-despite-hiring-31-staff/">under-resourced</a> Australian Information Commissioner (OAIC).</p>
<p>The acts often only apply to specific agencies or data. Overall we have a threadbare patchwork of law that is supposed to respect our privacy but often isn’t effective. It hasn’t kept pace with law in <a href="https://theconversation.com/data-privacy-stricter-european-rules-will-have-repercussions-in-australia-as-global-divisions-grow-142980">Europe</a> and elsewhere in the world.</p>
<p>The plan also envisages sharing data with trusted third parties. They might be universities or other research institutions. In future, the sharing could extend to include state or territory agencies and the private sector, too. </p>
<p>Any public or private bodies that receive data can then share it forward. Irrespective of whether one has anything to hide, this plan is worrying.</p>
<h2>Why will there be sharing?</h2>
<p>Sharing isn’t necessarily a bad thing. But it should be done accountably and appropriately. </p>
<p>Consultations over the past two years have highlighted the value of inter-agency sharing for law enforcement and for research into health and welfare. Universities have identified a range of uses regarding urban planning, environment protection, crime, education, employment, investment, disease control and medical treatment.</p>
<p>Many researchers will be delighted by the prospect of accessing data more cheaply than doing onerous small-scale surveys. IT people have also been enthusiastic about money that could be made helping the databases of different agencies talk to each other. </p>
<p>However, the reality is more complicated, as researchers and <a href="https://www.datacommissioner.gov.au/sites/default/files/2019-11/79_0.pdf">civil society</a> advocates have pointed out. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Person hitting a 'share' button on a keyboard." src="https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/358305/original/file-20200916-24-o601t8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In a July speech to the Australian Society for Computers and Law, former High Court Justice Michael Kirby highlighted a growing need to fight for privacy, rather than let it slip away.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Why should you be worried?</h2>
<p>The plan for comprehensive data sharing is founded on the premise of accreditation of data recipients (entities deemed trustworthy) and oversight by the Office of the National Data Commissioner, under the proposed act. </p>
<p>The draft bill announced today is open for a short period of public comment before it goes to parliament. It features a <a href="https://www.datacommissioner.gov.au/exposure-draft/accreditation">consultation paper</a> alongside a disquieting consultants’ report about the bill. In this <a href="https://www.datacommissioner.gov.au/sites/default/files/2020-09/Privacy%20Impact%20Assessment_exposure%20draft%20Data%20Availability%20and%20Transparency%20Bill%202020.pdf">report</a>, the consultants refer to concerns and “high inherent risk”, but unsurprisingly appear to assume things will work out. </p>
<p>Federal Minister for Government Services Stuart Robert, who presided over the tragedy known as the <a href="https://www.sbs.com.au/news/nothing-to-apologise-for-minister-backs-stuart-robert-over-failed-robodebt-scheme">RoboDebt scheme</a>, is optimistic about the bill. He dismissed critics’ concerns by <a href="https://www.abc.net.au/news/2020-09-16/government-draft-law-share-personal-data-between-agencies/12666792">stating</a> consent is implied when someone uses a government service. This seems disingenuous, given people typically don’t have a choice. </p>
<p>However, the bill does exclude some data sharing. If you’re a criminologist researching law enforcement, for example, you won’t have an open sesame. Experience with the national Privacy Act and other Commonwealth and state legislation tells us such exclusions weaken over time.</p>
<p>Outside the narrow exclusions centred on law enforcement and national security, the bill’s default position is to share widely and often. That’s because the accreditation requirements for agencies aren’t onerous and the bases for sharing are very broad. </p>
<p>This proposal exacerbates ongoing questions about day-to-day privacy protection. Who’s responsible, with what framework and what resources? </p>
<p>Responsibility is crucial, as national and state agencies recurrently experience data breaches. Yet, as RoboDebt revealed, they often respond with denial. Universities are also often wide open to <a href="https://www.theguardian.com/australia-news/2019/jun/04/australian-national-university-hit-by-huge-data-breach">data breaches</a>.</p>
<p>Proponents of the plan argue privacy can be protected through robust de-identification, in other words removing the ability to identify specific individuals. However, <a href="https://pursuit.unimelb.edu.au/articles/the-simple-process-of-re-identifying-patients-in-public-health-records">research</a> has recurrently shown “de-identification” is no silver bullet. </p>
<p>Most bodies don’t recognise the scope for re-identifying “de-identified” personal information, and large-scale sharing will amplify the data matching that makes re-identification possible. </p>
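<p>To see why “de-identification” is no silver bullet, consider a hypothetical linkage attack. A dataset stripped of names can often be re-identified by matching the quasi-identifiers it retains (postcode, birth year, sex) against a public record that still carries names. The sketch below uses entirely invented records and field names; it is an illustration of the general technique, not of any real dataset.</p>

```python
# Hypothetical illustration of a linkage (re-identification) attack.
# All records and field names below are invented for illustration.

# "De-identified" health dataset: names removed, quasi-identifiers kept.
health_records = [
    {"postcode": "2600", "birth_year": 1975, "sex": "F", "diagnosis": "diabetes"},
    {"postcode": "2600", "birth_year": 1982, "sex": "M", "diagnosis": "asthma"},
    {"postcode": "2913", "birth_year": 1990, "sex": "F", "diagnosis": "hypertension"},
]

# Public dataset that still carries names (e.g. a leaked membership list).
public_records = [
    {"name": "Alex Citizen", "postcode": "2600", "birth_year": 1982, "sex": "M"},
    {"name": "Jo Resident", "postcode": "2913", "birth_year": 1990, "sex": "F"},
]

def link(health, public):
    """Join the two datasets on their shared quasi-identifiers."""
    keys = ("postcode", "birth_year", "sex")
    matches = []
    for h in health:
        for p in public:
            if all(h[k] == p[k] for k in keys):
                matches.append({"name": p["name"], "diagnosis": h["diagnosis"]})
    return matches

# Two of the three "anonymous" patients are re-identified by name.
print(link(health_records, public_records))
```

<p>The more datasets an attacker can match against, and the more quasi-identifiers each record retains, the fewer people share any given combination, which is why researchers have repeatedly re-identified individuals in supposedly de-identified public releases.</p>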
<h2>Be careful what you ask for</h2>
<p>Sharing <em>may</em> result in social goods such as better cities, smarter government and healthier people by providing access to data (rather than just money) for service providers and researchers. </p>
<p>That said, our history of aspirational statements about privacy protection without meaningful enforcement by watchdogs should provoke some hard questions. It wasn’t long ago the government <a href="https://www.theguardian.com/australia-news/2020/sep/10/service-nsw-hack-could-have-been-prevented-with-simple-security-measures">failed</a> to prevent hackers from accessing sensitive data on more than 200,000 Australians.</p>
<p>It’s true this bill would ostensibly provide transparency, but it won’t provide genuine accountability. It shouldn’t be taken at face value.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/seven-ways-the-government-can-make-australians-safer-without-compromising-online-privacy-111091">Seven ways the government can make Australians safer – without compromising online privacy</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Bruce Baer Arnold is affiliated with the Australian Privacy Foundation and is a member of OECD data protection working parties. </span></em></p>The new bill would open the gates for your data to freely exchange hands between any ‘accredited’ agency. The proposal is more arrogant than it is effective.Bruce Baer Arnold, Assistant Professor, School of Law, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1429802020-07-30T19:56:07Z2020-07-30T19:56:07ZData privacy: stricter European rules will have repercussions in Australia as global divisions grow<figure><img src="https://images.theconversation.com/files/350358/original/file-20200730-33-1ta31ui.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">shutterstock</span> </figcaption></figure><p>A big year for privacy just got bigger. On July 16, Europe’s top court <a href="https://curia.europa.eu/jcms/upload/docs/application/pdf/2020-07/cp200091en.pdf">ruled</a> on the legality of two mechanisms for cross-border transfers of personal data. </p>
<p>The Court of Justice of the European Union (CJEU) struck down the “EU-US Privacy Shield”, an intergovernmental agreement on which thousands of US companies based their data processing with EU trading partners and consumers. At the same time, the CJEU generally upheld so-called “standard contractual clauses” (SCC) for data exports but imposed new requirements on their use. </p>
<p>The decision has an immediate impact on data flows between the USA and the EU. But it will also create new challenges for Australian companies that engage with Europe. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/tough-new-eu-privacy-regulations-could-lead-to-better-protections-in-australia-96992">Tough new EU privacy regulations could lead to better protections in Australia</a>
</strong>
</em>
</p>
<hr>
<h2>The global reach of European privacy laws</h2>
<p>In 2018, the EU <a href="https://theconversation.com/tough-new-eu-privacy-regulations-could-lead-to-better-protections-in-australia-96992">brought into force</a> the General Data Protection Regulation (GDPR), one of the world’s strongest privacy protection frameworks. This latest decision provides further evidence that the GDPR has impact far beyond the EU. It allows data about European citizens to be exported outside the bloc only if an adequate level of data protection is guaranteed. </p>
<p>Adequacy can be demonstrated at country level, and some major trading partners of the EU (such as Japan, Canada and New Zealand) have been <a href="https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/adequacy-decisions_en">certified</a> by the EU as having a comparable level of privacy protection. Until a fortnight ago, US companies could likewise rely on an adequacy decision for the <a href="https://www.privacyshield.gov/welcome">EU-US Privacy Shield</a>. The Privacy Shield allowed companies to self-certify their data practices against a set of minimum criteria, backed by enhanced US regulatory oversight. The Court has now held that this is not enough.</p>
<h2>What does this mean for Australia?</h2>
<p>Australian companies and consumers need to be mindful of the new CJEU decision. Data exports are very common, particularly where companies operate multi-nationally, outsource some of their data processing or store data on overseas cloud servers. </p>
<p>Australia was not a party to the EU-US Privacy Shield. It also <a href="https://www.oaic.gov.au/privacy/guidance-and-advice/australian-entities-and-the-eu-general-data-protection-regulation/">does not have EU adequacy status</a>. This is because our Privacy Act does not apply to small businesses, employee data, and political parties, amongst others. An EU entity that seeks to export personal data to Australia therefore needs to use other safeguards to ensure that EU personal data remains protected. </p>
<p>This is commonly done in the form of standard contractual clauses, by which the sender and recipient of data agree that their data processing meets GDPR standards. The CJEU has now clarified that companies and regulators must verify in each case that the clauses stand up in light of the recipient country’s data laws. </p>
<p>Governmental surveillance programs and access to effective legal remedies are a particular concern. Privacy professionals around the world <a href="https://iapp.org/news/a/schrems-ustaran-react-to-cjeus-ruling-on-privacy-shield-sccs/">now have to work out</a> what this new requirement means.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/heres-what-a-privacy-policy-thats-easy-to-understand-could-look-like-97251">Here's what a privacy policy that's easy to understand could look like</a>
</strong>
</em>
</p>
<hr>
<h2>Deepening global divisions and the trend to data localisation</h2>
<p>To comply with the ruling, companies need to engage in a more detailed risk analysis than before. In some cases, data may no longer be transferred. This is likely to contribute to an international trend to house critical data locally. A recent example of this trend is the <a href="https://theconversation.com/the-covidsafe-bill-doesnt-go-far-enough-to-protect-our-privacy-heres-what-needs-to-change-137880">COVIDSafe app</a>: the data it collects must remain in Australia. </p>
<p>The CJEU decision comes at a time of intense public debate of privacy <a href="https://www.cmo.com.au/article/679987/explainer-what-next-privacy-laws/">in Australia</a> and many other countries. The COVID-19 pandemic has <a href="https://www.computerworld.com/article/3562701/australian-small-businesses-advance-their-digitalisation-thanks-to-covid-19.html">turbo-charged the digitalisation</a> of many aspects of daily life. Every digital transaction leaves traces in the form of personal information, which could be a target for data mining and surveillance by corporate and state actors. </p>
<p>It would be sensible to adopt internationally harmonised data protection standards to regulate global data streams. But the world appears currently headed in the opposite direction. </p>
<p>Despite both the EU and US sides emphasising the <a href="https://www.reuters.com/article/us-facebook-privacy-eu-reaction-factbox-idUSKCN24H1QV">need for cooperation</a> after the CJEU ruling, the major trading powers and blocs are increasingly pitted against each other. </p>
<p>Apart from the <a href="https://www.politico.eu/article/rejection-of-us-surveillance-tests-eu-mettle-on-privacy-shield/">long-standing EU-US division over privacy</a>, China, India and Russia have also begun to assert their own distinct data processing models. These powers generally give their citizens fewer privacy rights than the EU. They also make increasing use of data localisation requirements, which prohibit or impede data export, to enforce their own data protection protocols. The intensifying conflict between the US and China, most recently erupting over the new security laws for Hong Kong, also marks <a href="https://www.politico.com/news/2020/07/16/decaying-us-china-relationship-365431">data governance and cybersecurity</a> as significant battlegrounds.</p>
<h2>Australia’s new challenges in data protection</h2>
<p>Australia’s data regulation tends to be pragmatic and business-friendly. It steers a middle course between the conflicting privacy approaches of the US and the EU. However, in a world retreating from globalised regulation, it is becoming increasingly difficult not to take sides. </p>
<p>Privacy is looming larger than ever in public consciousness, and Australia’s <a href="https://treasury.gov.au/sites/default/files/2019-12/Government-Response-p2019-41708.pdf">Privacy Act is due for an overhaul</a>. More than ever, Australia needs to determine its own course in safeguarding personal information against potential overreach by corporations and governments.</p><img src="https://counter.theconversation.com/content/142980/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Normann Witzleb receives funding from the International Association of Privacy Professionals (IAPP).</span></em></p>An EU decision on international data movements shows Australia’s rules for safeguarding personal information may need a rethink.Normann Witzleb, Associate Professor in Law, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1407592020-06-24T16:49:40Z2020-06-24T16:49:40ZBalancing privacy with public health: how well is South Africa doing?<figure><img src="https://images.theconversation.com/files/343453/original/file-20200623-188921-kvilgt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The French mobile phone application StopCovid, developed to trace people who test positive with COVID-19. </span> <span class="attribution"><span class="source">Chesnot/Getty Images</span></span></figcaption></figure><p>COVID-19 spreads from person to person through droplet and contact transmission. That’s why contact tracing and quarantining have been included as one approach to control the spread of the virus. The aim is to keep the number of new cases generated by each confirmed case, the effective reproduction number, below one. </p>
<p>This process entails identification, assessment and quarantining of people who have been exposed to the virus. But COVID-19 can be transmitted before people are symptomatic. Therefore, in an effort to prevent further transmission, “one-step-ahead” tracing and preemptive quarantining are important measures in limiting the spread of the disease.</p>
<p>President Cyril Ramaphosa acted swiftly once the first cases were identified in South Africa by declaring a national state of disaster. Among the measures taken was the gazetting of amended regulations for <a href="https://www.gov.za/speeches/president-cyril-ramaphosa-south-africas-response-coronavirus-covid-19-pandemic-13-may-2020">contact tracing</a>. These allowed for the creation of an electronic contact tracing database in which the personal information of people infected with COVID-19 – or suspected to have come into contact with infected persons – could be aggregated. Personal information was to be collected from a variety of sources. This included mass testing as well as contact tracing using digital surveillance technologies.</p>
<p>But contact tracing poses a range of challenges – from technological through to the protection of personal privacy. South Africa needs to be cognisant of both if it’s going to apply this correctly.</p>
<h2>Technology issues</h2>
<p>China and Singapore reported success in the use of cellular phones in the fight against COVID-19. Their success has been dependent on smartphone applications that collect GPS location data and Bluetooth proximity data. Smartphone applications are also being used in a number of developed countries as <a href="https://www.nature.com/articles/d41586-020-01514-2">virtual health passports</a>.</p>
<p>This isn’t possible in the South African context as only 51% of people <a href="https://www.pewresearch.org/global/wp-content/uploads/sites/2/2018/10/Pew-Research-Center_Technology-use-in-Sub-Saharan-Africa_2018-10-09.pdf#page=5">surveyed</a> were reported to own smartphones. </p>
<p>South Africa relies on the triangulation of cell tower metadata supplied by electronic communication service providers. This is also problematic. In rural areas with few towers, triangulation is not possible. In urban areas, buildings scatter signals. Even under ideal conditions and with a high density of cell towers, triangulation can locate a phone only to within approximately 100 metres. This technology does not allow for identification of close contacts or retrospective traces.</p>
<p>Given these limitations, it is highly unlikely that <a href="http://www.sajbl.org.za/index.php/sajbl/article/view/626">cellular telephone tracing</a> using cell tower metadata will contribute to identifying or locating COVID-19 cases or contacts in the country. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/mobile-phone-data-is-useful-in-coronavirus-battle-but-are-people-protected-enough-136404">Mobile phone data is useful in coronavirus battle. But are people protected enough?</a>
</strong>
</em>
</p>
<hr>
<h2>Privacy issues</h2>
<p>In South Africa, marred by historical systematic discrimination, recent abuses of power and continued social marginalisation, particular consideration needs to be given to the measures being used to contain the spread of COVID-19.</p>
<p>South African law makes provision for <a href="https://www.justice.gov.za/legislation/acts/2002-070.pdf">cellular phone data to be accessed</a>. These data are primarily used in anti-crime activities, and access requires a court order from a judge. Allegations that the information has been used in covert and unauthorised ways have raised suspicion. The use of some information has been challenged in the courts, and in late 2019 the Gauteng High Court <a href="http://www.saflii.org.za/za/cases/ZAGPPHC/2019/384.html">struck down</a> key parts of the Regulation of Interception of Communications and Provision of Communication Related Information Act as an affront to the constitutional right to privacy. </p>
<p>What is to stop the state from using the information gathered for contact tracing as a security measure – or for other purposes that fall outside the realm of public health?</p>
<p>In the case of COVID-19, the regulations authorise the Director-General of Health to issue tracking orders. The regulations also instruct the Minister of Justice and Correctional Services to appoint a retired High Court judge as the COVID-19 Designated Judge. Justice Kate O’Regan, a retired Constitutional Court judge, <a href="https://www.gov.za/speeches/minister-ronald-lamola-appoints-justice-kate-o%E2%80%99regan-coronavirus-covid-19-designate-judge-3">was appointed</a> to oversee the contact tracing database.</p>
<p>To protect public health via contact tracing, balancing privacy rights with other constitutional rights is essential. This is not an easy task. The <a href="http://www.samj.org.za/index.php/samj/article/view/12911">rights of people in the midst of an epidemic</a> must be considered in both the textual setting of the <a href="https://www.justice.gov.za/legislation/constitution/SAConstitution-web-eng.pdf">South African Constitution</a> and their socioeconomic setting. </p>
<p>Health data qualify as special personal information in terms of the <a href="https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013protectionofpersonalinforcorrect.pdf">Protection of Personal Information Act 4 of 2013</a>. Additional safeguards are required when health data are collected, processed and stored. The Act was due to come into effect on 1 April 2020, but this has been postponed due to the pandemic. </p>
<p>This has left South Africa with its constitutional and common law protection of privacy. </p>
<h2>Balancing act</h2>
<p>A person’s right to access healthcare is determined by an intricately linked bundle of human rights. This includes the right to dignity, bodily and psychological integrity and privacy. </p>
<p>The government’s power to limit any rights during a pandemic by collecting personal information for purposes of contact tracing must be considered against its constitutional obligations. This includes taking reasonable measures to achieve the realisation of these rights within available resources. </p>
<p>Balancing these rights is nuanced by South Africa’s socioeconomic context, which influences how the rights may be exercised. For example, <a href="http://www.statssa.gov.za/?p=6429">13% of the population</a> live in informal settlements, making it difficult to implement evidence-based preventative measures such as social distancing and shelter-in-place directives. </p>
<p>In addition, South Africa relies on <a href="https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_20200420_contact_tracing_covid_with_annex_en.pdf">guidelines</a> for ethical data management issued by <a href="https://www.cfjustice.org/civil-societys-call-to-states-we-are-in-this-together-dont-violate-human-rights-while-responding-to-covid-19/">international bodies</a> for protection of people’s privacy during the COVID-19 pandemic. Its electronic contact tracing database is aligned with the <a href="https://apps.who.int/iris/handle/10665/332049">interim guidance of the World Health Organisation</a> on contact tracing during COVID-19. </p>
<p>The World Health Organisation has also provided training material and a <a href="https://www.who.int/godata/about">link</a> to software developed to enable countries to properly manage case-contact relationships and follow-up contacts.</p>
<p>The most essential data privacy principles include transparency, accountability, information quality, security and data subject participation. Data processing, consisting of collection, storage and use, must be lawful and for a clearly defined purpose. This purpose will determine the limits of use.</p>
<p>Protection of privacy goes much deeper than merely protection of personal information. The protection of personal information is fundamental to non-discrimination, human dignity and the freedoms of speech, association, movement and trade. These rights are central to any open and democratic society. The wellbeing and safety of a society as a whole during pandemics rely heavily on the codependent relationships between society, its individuals and their government.</p>
<p>The COVID-19 pandemic has highlighted weak points in the preparedness of countries to deal with large-scale health care disasters. It has also pointed out constitutional weaknesses and shortcomings. Capitalising on the fear of another war, and realising his political ambition to establish the United Nations after WWII, Winston Churchill said: “Never let a good crisis go to waste”. </p>
<p>Similarly, many modern-day politicians are exploiting the current crisis to strengthen their positions of power. The challenge for society is to seek to improve protection of rights and freedoms during the crisis, rather than to acquiesce in their abrogation.</p><img src="https://counter.theconversation.com/content/140759/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Michael Sean Pepper receives funding from the South African Medical Research Council and the University of Pretoria</span></em></p><p class="fine-print"><em><span>Anne Pope, Camille Castelyn, Ignatius Michael Viljoen, and Marietjie Botes do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>In a country marred by systematic discrimination and continued social marginalisation, particular consideration needs to be given to the measures being used to contain the spread of COVID-19.Michael Sean Pepper, Director, Institute for Cellular and Molecular Medicine & SAMRC Extramural Unit for Stem Cell Research & Therapy, University of PretoriaMarietjie Botes, Post Doctoral Researcher, University of PretoriaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1317522020-03-13T13:58:54Z2020-03-13T13:58:54ZWith coronavirus containment efforts, what are the privacy rights of patients?<figure><img src="https://images.theconversation.com/files/318181/original/file-20200302-18279-tkjjev.jpg?ixlib=rb-1.1.0&rect=44%2C0%2C5000%2C3323&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">People are reflected on a volunteer's sunglasses outside a neighborhood alley in Beijing that is closed due to the COVID-19 outbreak on March 1, 2020. </span> <span class="attribution"><span class="source">AP Photo/Andy Wong</span></span></figcaption></figure><p>As an epicenter of the COVID-19 outbreak, China has launched an unprecedented effort to control the disease, locking down Wuhan in the province of Hubei — <a href="https://www.businessinsider.com/wuhan-coronavirus-what-life-like-inside-quarantined-city-china-2020-2">a megacity of 11 million people</a>. </p>
<p>These measures may have helped protect against transmission, but whether patients’ information has been properly protected remains in question. Personal information such as national ID numbers, residential addresses and occupations has been <a href="https://www.wsj.com/articles/china-marshals-the-power-of-its-surveillance-state-in-fight-against-coronavirus-11580831633">leaked online</a> for people who travelled <a href="https://apnews.com/7f7336d2ed099936bd59bf8cb7f43756">from Wuhan to Shanghai or Inner Mongolia</a>. A citizen discovered his name in online lists <a href="https://www.nytimes.com/2020/02/03/business/china-coronavirus-wuhan-surveillance.html">after he reported to authorities that he had returned from Wuhan to his hometown of Linhai</a>. </p>
<p>During this unusual time, concerns about information safety are based on the real possibility of prejudice and harassment. Information breaches can also lead to identity theft.</p>
<p>How did China respond to this public health emergency during the lunar new year celebration, a period of <a href="https://www.youtube.com/watch?v=eFakFlddIg0">the largest annual human migration in the world</a>, and how did the response influence the privacy of ordinary people?</p>
<h2>Tracking patients</h2>
<p>To understand the context of the illegal information disclosure, we must look at the patient tracking processes used in China and other regions, and the important role of surveillance technologies.</p>
<p>Online maps are good resources for concerned citizens to check whether there is an imminent disease threat in their area. <a href="https://gisanddata.maps.arcgis.com/apps/opsdashboard/index.html#/bda7594740fd40299423467b48e9ecf6">A dashboard developed by Johns Hopkins University</a> presents the coronavirus outbreak using data from the Centers for Disease Control and the World Health Organization. A <a href="https://www.healthmap.org/wuhan/index.php">health map</a> by <a href="http://www.childrenshospital.org/">Boston Children’s Hospital</a>, on the other hand, summarizes epidemic alerts through news reports and social media posts.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/318184/original/file-20200303-18295-fp3u3w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A HealthMap worker at Boston Children’s Hospital monitors disease outbreaks on Feb. 13, 2020. Behind him is a world map with coloured dots marking cases of COVID-19.</span>
<span class="attribution"><span class="source">(AP Photo/Steven Senne)</span></span>
</figcaption>
</figure>
<p>Artificial intelligence is also tracking the spatial patterns of the epidemic. <a href="https://www.wired.com/story/ai-epidemiologist-wuhan-public-health-warnings/">A Canadian company called BlueDot</a> collects multilingual news reports and data from official public health databases to predict the potential of future outbreaks. <a href="https://www.wired.com/story/how-ai-tracking-coronavirus-outbreak/">Researchers from Harvard Medical School</a> gather authoritative information plus social media data to explore the geographical trends of the disease.</p>
<p>However, Chinese officials need more accurate locations of potential virus spreaders.</p>
<p>Telecommunication companies in China <a href="https://www.nytimes.com/2020/02/13/world/asia/china-coronavirus.html">announced a feature</a> that generates a list of recently visited provinces when subscribers text a hotline. Rail stations such as the one in Yiwu required passengers to show their travel history from the hotline before boarding a train, screening out people who had been to Hubei province. </p>
<p>Before this feature rolled out, China had been using <a href="https://www.reuters.com/article/us-china-health-surveillance/coronavirus-brings-chinas-surveillance-state-out-of-the-shadows-idUSKBN2011HO">licence plates</a> and <a href="https://www.reuters.com/article/us-china-health-surveillance/coronavirus-brings-chinas-surveillance-state-out-of-the-shadows-idUSKBN2011HO">facial recognition technology</a> to track people who left Wuhan before the city’s lockdown. <a href="https://globalnews.ca/news/6535353/china-coronavirus-drones-quarantine/">Drones</a> were also used to remind people to wear masks in public areas.</p>
<p>Some municipal governments have “innovative” methods other than technological approaches. Starting Feb. 7, at least three Chinese cities including Hangzhou, Ningbo and Sanya <a href="https://qz.com/1799725/chinese-cities-try-to-flush-out-coronavirus-patients-by-stopping-medicine-sales/">pulled fever and cough drugs off pharmacy shelves</a> so patients with such symptoms would have to visit doctors at hospitals. Some officials from Hebei province in northern China turned “neighbour against neighbour” by <a href="https://www.nytimes.com/2020/02/03/business/china-coronavirus-wuhan-surveillance.html">offering 1,000 RMB</a> (about $190) to residents for each person from Wuhan they reported.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/318186/original/file-20200303-18275-x07180.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A worker wearing a face mask directs traffic at a highway toll gate in Wuhan, China on Jan. 21, 2020. Licence plates were used to track people who left Wuhan before the city was locked down to try to contain the COVID-19 outbreak.</span>
<span class="attribution"><span class="source">(Chinatopix via AP)</span></span>
</figcaption>
</figure>
<p>Outside of mainland China, <a href="https://fortune.com/2020/02/05/coronavirus-patient-tracking-government-surveillance/">the Hong Kong government</a> has issued tracking wristbands to families returning from Hubei province. These ensure the Department of Health is notified if the wearer leaves their home during a 14-day quarantine period. </p>
<p>Similar strategies have been used <a href="https://www.telegraph.co.uk/news/2020/02/03/taiwan-uses-smartphones-monitor-patients-quarantined-virus-scare/">in Taiwan, where smartphones have been assigned</a> to notify police if patients are not quarantined at home. Whether the monitoring technologies have turned home quarantine into “house arrest” remains debatable.</p>
<h2>Information disclosure</h2>
<p>Tracking down patients is troublesome enough, but whether their sensitive information is released depends on local administrations.</p>
<p>A study found that <a href="https://doi.org/10.3390/ijerph17010305">after a list of hospitals with MERS-CoV patients was disclosed to the public in South Korea in 2015, the number of laboratory-confirmed MERS-CoV patients decreased significantly</a>. The study published in January indicates that responsible information disclosure helps control infectious respiratory diseases.</p>
<p>Possibly inspired by previous experience, the South Korean government is extremely open about communicating patient activities to the public. Officials <a href="https://www.dailymail.co.uk/news/article-8011197/South-Korea-tracks-coronavirus-patients-locations-using-phone-data-publishes-online.html">published travel data on the 29 confirmed patients on the Ministry of Health and Welfare website</a> compiled by aggregating data from cell phone, credit card and transit card records, as well as CCTV footage. The Hong Kong government had a similar measure, publishing <a href="https://www.chp.gov.hk/files/pdf/list_of_buildings_en.pdf">a list of apartments</a> with quarantined residents on the Department of Health’s website.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/318191/original/file-20200303-18299-1c147ip.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A volunteer wearing a protective face mask emerges from the barricaded entrance to a community in Wuhan, China on Feb. 22, 2020.</span>
<span class="attribution"><span class="source">Chinatopix via AP</span></span>
</figcaption>
</figure>
<p>Japanese authorities, however, are <a href="https://www.japantimes.co.jp/news/2020/02/10/national/japan-cities-prefectures-travel-details-coronavirus-patients/#.Xks1djJKiUk">in disagreement</a> on disclosing travel data on coronavirus patients. The Japanese Health, Labour and Welfare Ministry provided no details. However, after the driver of a bus that transported tourists from Wuhan became infected with COVID-19, the Nara Prefectural Government shared the locations the bus had visited.</p>
<p>Some U.S. states are in line with the Japanese Health Ministry and make patient privacy a priority. On Feb. 5, Ohio passed <a href="https://wtov9.com/news/local/new-privacy-protocol-in-place-as-coronavirus-concerns-heighten">a new protocol</a> protecting patients’ places of origin while their cases are under investigation. Florida has <a href="https://www.sun-sentinel.com/news/fl-ne-coronavirus-privacy-20200207-zcyt7b7y7vg6lcs4js5ezjha3a-story.html">a similar state law</a> that prevents public access to information about suspected patients.</p>
<h2>Legal requirements</h2>
<p>While patient information disclosure practices vary from one administration to another, laws are in place to guard against improper management of sensitive information.</p>
<p>In China, the personal information protection regulation is called <a href="https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-cybersecurity-law-peoples-republic-china/">Cybersecurity Law of the People’s Republic of China</a>, and best practices of personal information handling are described in <a href="https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-chinas-personal-information-security-specification/">Personal Information Security Specification</a>.</p>
<p>A <a href="https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-chinese-authorities-emphasize-data-privacy-and-big-data-analysis-coronavirus-response/">notice</a> from the Cyberspace Administration of China (CAC) on Feb. 9 clearly states that personal information such as “names, ages, identity card numbers, telephone numbers, household addresses and other such information” may not be used for purposes other than “epidemic control and disease prevention.” </p>
<p>In the United States, the <a href="https://www.hhs.gov/hipaa/for-professionals/security/laws-regulations/index.html">Health Insurance Portability and Accountability Act</a> (HIPAA) explains under what circumstances health-care professionals may disclose patient information to families, public health agencies and the media. The <a href="https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html">Family Educational Rights and Privacy Act</a> (FERPA) provides additional guidance for health practitioners in different settings such as university hospitals.</p>
<p>The U.S. Department of Health and Human Services also posted a <a href="https://www.hhs.gov/sites/default/files/february-2020-hipaa-and-novel-coronavirus.pdf">bulletin</a> to remind health-care workers that information about “an identifiable patient” may not be disclosed to the media or the public without “the patient’s written authorization” except in special circumstances.</p>
<p>In Canada, <a href="https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/pipeda_brief/">the Personal Information Protection and Electronic Documents Act (PIPEDA)</a> is the national privacy law that regulates personal information disclosure, including health-related practices. Under PIPEDA, and under provincial patient privacy legislation where it exists, consent is required to collect, use or disclose an individual’s personal health information. To date, only <a href="https://globalnews.ca/news/6617581/coronavirus-cases-canada-timeline/">unidentifiable patient information has been released by health officials</a> to create a timeline of cases in the country. </p>
<p>With strict enforcement of privacy law, wide disease prevention education and proper location information disclosure, it is hoped that patient information leaks will diminish in China, and patient privacy will be better protected during an epidemic outbreak.</p><img src="https://counter.theconversation.com/content/131752/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Hongyu Zhang does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Some measures taken in China to contain the COVID-19 outbreak have raised concerns about patient privacy. As other countries bring in containment measures, will patient privacy be compromised?Hongyu Zhang, PhD Student in Geography, McGill UniversityLicensed as Creative Commons – attribution, no derivatives.