<p>Privacy Act – The Conversation (2024-02-13)</p>
<h1>What is doxing, and how can you protect yourself?</h1>
<p>The Australian government has brought forward <a href="https://www.theguardian.com/australia-news/2024/feb/12/albanese-government-to-propose-legislation-to-crack-down-on-doxing">plans to criminalise doxing</a>, bringing nationwide attention to the harms of releasing people’s private information to the wider public.</p>
<p>The government response comes after the <a href="https://www.smh.com.au/national/hundreds-of-jewish-creatives-have-names-details-taken-in-leak-published-online-20240208-p5f3if.html">public release of almost 600 names</a> and private chat logs of a WhatsApp group of Australian Jewish creative artists discussing the Israel-Hamas war.</p>
<p>As a result, some of the people whose details were leaked claim they were harassed, <a href="https://www.theguardian.com/australia-news/2024/feb/09/josh-burns-jewish-whatsapp-group-channel-publication-israel-palestine-clementine-ford">received death threats</a> and even had to go into hiding. </p>
<p>While we wait for <a href="https://www.smh.com.au/national/australia-news-live-federal-laws-on-doxxing-to-be-brought-forward-anniversary-of-stolen-generations-apology-20240213-p5f4eh.html?post=p55nen#p55nen">new penalties</a> for doxers under the federal Privacy Act review, understanding doxing and its harms can help. And there are also steps we can all take to minimise the risk. </p>
<h2>What is doxing?</h2>
<p><a href="https://www.kaspersky.com/resource-center/definitions/what-is-doxing">Doxing</a> (or doxxing) is releasing private information — or “docs”, short for documents — online to the wider public without the user’s consent. This includes information that may put users at risk of harm, especially names, addresses, employment details, medical or financial records, and names of family members.</p>
<p>The Australian government <a href="https://ministers.ag.gov.au/media-centre/transcripts/media-conference-parliament-house-13-02-2024">currently defines doxing</a> as the “malicious release” of people’s private information without their consent.</p>
<p>Doxing began as a form of unmasking anonymous users, trolls and those using hate speech while <a href="https://www.theatlantic.com/technology/archive/2022/04/doxxing-meaning-libs-of-tiktok/629643/">hiding behind a pseudonym</a>. Recently, it has become a weapon for online abuse, harassment, hate speech and adversarial politics. It is often the outcome of online arguments or polarised public views. </p>
<p>It is also becoming more common. Although there is no data for Australia yet, according to media company <a href="https://www.safehome.org/family-safety/doxxing-online-harassment-research/">SafeHome.org</a>, about 4% of Americans report having been doxed, with about half saying their private emails or home addresses have been made public. </p>
<p>Doxing is a crime in some countries such as the Netherlands and South Korea. In other places, including Australia, privacy laws haven’t yet caught up.</p>
<h2>Why is doxing harmful?</h2>
<p>In the context of the <a href="https://theconversation.com/au/topics/israel-hamas-war-146714">Israel-Hamas war</a>, doxing has affected <a href="https://www.haaretz.com/world-news/asia-and-australia/2024-02-06/ty-article/death-threats-boycotts-target-jewish-creatives-in-australia/0000018d-7e43-d636-adef-7eefae580000">both Jewish</a> and <a href="https://edition.cnn.com/2023/10/15/business/palestinian-americans-activists-doxxing/index.html">pro-Palestinian communities and activists</a> in Australia and abroad.</p>
<p>Doxing is harmful because it treats a user as an object and takes away their agency to decide what, and how much, personal information they want shared with the wider public. </p>
<p>This puts people at very real risk of physical threats and violence, particularly when public disagreement becomes heated. From a broader perspective, doxing also damages the digital ecology, reducing people’s ability to freely participate in public or even private debate through social media.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/online-safety-what-young-people-really-think-about-social-media-big-tech-regulation-and-adults-overreacting-196003">Online safety: what young people really think about social media, big tech regulation and adults 'overreacting'</a>
</strong>
</em>
</p>
<hr>
<p>Although doxing is sometimes just inconvenient, it is often used to publicly shame or humiliate someone for their private views. This can take a toll on a person’s mental health and wellbeing. </p>
<p>It can also affect a person’s employment, especially for people whose employers require them to keep their attitudes, politics, affiliations and views to themselves. </p>
<p>Studies have shown doxing particularly impacts <a href="https://journals.sagepub.com/doi/full/10.1177/0306422015605714">women</a>, including those using dating apps or experiencing family violence. In some cases, children and family members have been threatened because a high-profile relative has been doxed. </p>
<p>Doxing is also harmful because it oversimplifies a person’s affiliations or attitudes. For example, releasing the names of people who have joined a private online community to navigate complex views can represent them as only like-minded stereotypes or as participants in a group conspiracy. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A person using a laptop and smartphone simultaneously" src="https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/575225/original/file-20240213-24-b68guc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">There are steps you can take online to protect yourself from doxing without having to completely withdraw.</span>
<span class="attribution"><a class="source" href="https://www.pexels.com/photo/person-holding-smartphone-3248292/">Engin Akyurt/Pexels</a></span>
</figcaption>
</figure>
<h2>What can you do to protect yourself from doxing?</h2>
<p>Stronger laws and better platform intervention are necessary to reduce doxing. Some experts believe that the fear of <a href="https://dl.acm.org/doi/abs/10.1145/3476075">punishment</a> can help shape better online behaviours.</p>
<p>These punishments may include criminal <a href="https://www.esafety.gov.au/report/what-you-can-report-to-esafety">penalties</a> for perpetrators and <a href="https://www.theaustralian.com.au/breaking-news/doxxing-attack-on-jewish-australians-prompts-call-for-legislative-change/news-story/9a2f3615dbf5594fb521a8959739e1f8#:%7E:text=Alongside%20legislative%20reform%2C%20the%20ECAJ,information%2C%E2%80%9D%20Mr%20Aghion%20said.">deactivating social media accounts</a> for repeat offenders. But better education about the risks and harms is often the best treatment.</p>
<p>And you can also protect yourself without needing to entirely withdraw from social media:</p>
<ol>
<li><p>never share a home or workplace address, phone number or location, including among a private online group or forum with trusted people</p></li>
<li><p>restrict your geo-location settings</p></li>
<li><p>avoid giving details of workplaces, roles or employment on public sites not related to your work </p></li>
<li><p>avoid adding people you do not know as friends or connections on social media</p></li>
<li><p>if you suspect you risk being doxed due to a heated online argument, temporarily shut down or lock any public profiles</p></li>
<li><p>avoid becoming a target: stop pursuing haters once an argument escalates. Professional and courteous engagement can help defuse the anger of those who disagree with you and might try to harm you.</p></li>
</ol>
<p>Additionally, hosts of private online groups must be very vigilant about who joins a group. They should avoid the trap of accepting members just to increase the group’s size, and appropriately check new members (for example, with a short survey or key questions that keep out people who may be there to gather information for malicious purposes).</p>
<p>Employers who require their staff to have online profiles or engage with the public should provide information and strategies for doing so safely. They should also provide immediate support for staff who have been doxed.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/as-use-of-digital-platforms-surges-well-need-stronger-global-efforts-to-protect-human-rights-online-135678">As use of digital platforms surges, we'll need stronger global efforts to protect human rights online</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Rob Cover receives funding from the Australian Research Council.</span></em></p>
<p>With doxing suddenly on the national agenda, here’s what you need to know.</p>
<p class="fine-print"><em>Rob Cover, Professor of Digital Communication and Co-Director of the RMIT Digital Ethnography Research Centre, RMIT University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Cars are a ‘privacy nightmare on wheels’. Here’s how they get away with collecting and sharing your data</h1>
<p class="fine-print">2023-10-13</p>
<figure><img src="https://images.theconversation.com/files/553601/original/file-20231013-21-l5gaka.jpg?ixlib=rb-1.1.0&rect=89%2C71%2C5901%2C3916&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Cars with internet-connected features are fast becoming all-seeing data-harvesting machines – a so-called “privacy nightmare on wheels”, <a href="https://foundation.mozilla.org/en/blog/privacy-nightmare-on-wheels-every-car-brand-reviewed-by-mozilla-including-ford-volkswagen-and-toyota-flunks-privacy-test/">according to</a> US-based research conducted by the <a href="https://foundation.mozilla.org/en/insights/open-research/">Mozilla Foundation</a>.</p>
<p>The researchers looked at the privacy terms of 25 car brands, which were found to collect a range of customer data, from facial expressions, to sexual activity, to when, where and how people drive. </p>
<p>They also found terms that allowed this information to be passed on to third parties. Cars were “the official worst category of products for privacy” they had ever reviewed, <a href="https://foundation.mozilla.org/nl/privacynotincluded/articles/its-official-cars-are-the-worst-product-category-we-have-ever-reviewed-for-privacy/">they concluded</a>.</p>
<p>Australia’s privacy laws aren’t up to the task of protecting the vast amount of personal information collected and shared by car companies. And since our privacy laws don’t demand the specific disclosures required by some US states, we have much less information about what car companies are doing with our data.</p>
<p>Australia’s privacy laws need urgent reform. We also need international cooperation on enforcing privacy regulation for car manufacturers.</p>
<h2>How do cars collect sensitive data?</h2>
<p>Apart from data entered directly into a car’s “infotainment” system, many cars can collect data in the background via cameras, microphones, sensors and connected phones and apps. </p>
<p>These data include:</p>
<ul>
<li>speed</li>
<li>steering, brake and accelerator pedal use</li>
<li>seat belt use</li>
<li>infotainment settings</li>
<li>phone contacts</li>
<li>navigation destinations</li>
<li>voice data</li>
<li>your location and surroundings</li>
<li>and even footage of you and your family outside your car. (Between 2019 and 2022, Tesla employees internally circulated <a href="https://www.abc.net.au/news/2023-04-08/tesla-workers-shared-sensitive-images-recorded-by-customer-cars/102202382">intimate footage</a> collected from people’s private cars for their own amusement, according to reports.)</li>
</ul>
<p>A lot of these data are used, at least in part, for legitimate purposes such as making driving more enjoyable and safer for the driver, passengers and pedestrians.</p>
<p>But they can also be supplemented with data collected from other sources and used for other purposes. For instance, data may be collected from your website visit, your test drive at a dealership, or from third parties including “<a href="https://www.toyota.com.au/privacy-policy">marketing agencies</a>” and “providers of data-collecting devices, products or systems that you use”.</p>
<p>The latter is very broad since our TVs, fridges and even our baby monitors can collect data about us.</p>
<p>Mozilla points out these combined data can be used “to develop inferences about a driver’s intelligence, abilities, characteristics, preferences and more”.</p>
<h2>Connected cars transmit data in real time</h2>
<p>While cars have been collecting large amounts of information since they became “<a href="https://www.toyota.com.au/privacy-policy">computers on wheels</a>”, this information has generally been stored in modules in the vehicle and accessed only when the car is physically connected to diagnostic equipment. </p>
<p>Now, however, vehicles are being sold with <a href="https://www.ag.gov.au/sites/default/files/2021-01/federal-chamber-of-automotive-industries.PDF">connected features</a> “in the sense that they can exchange information wirelessly with the vehicle manufacturer, third party service providers, users, infrastructure operators and other vehicles”. </p>
<p>This means your connected car can transmit data about you and your activities, generally via the internet, to various other companies as you go about your life. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/553621/original/file-20231013-23-olsqna.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Your internet-connected car can collect a range of data about you.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Where do the data go?</h2>
<p>In Australia, we have little information about how our information can be used and by whom.</p>
<p>In its US-based study, Mozilla found data from consumers’ cars was being disclosed to other companies for marketing and targeted advertising purposes. It was also sold to data brokers. </p>
<p>Mozilla was able to uncover highly detailed information, largely because the laws of <a href="https://www.oag.ca.gov/privacy/ccpa#sectionc">California</a> and <a href="https://pro.bloomberglaw.com/brief/virginia-consumer-data-protection-act-vcdpa/#:%7E:text=The%E2%80%AFVCDPA%20gives%20consumers%20the%20right%20to%20access%20their,personal%20data%20for%20targeted%20advertising%20and%20sales%20purposes.">Virginia</a> require specific disclosures about who personal data is disclosed to and for what purposes (among other higher privacy standards). </p>
<p>Australian privacy law doesn’t require such specific disclosures. This is one reason car brands often have separate privacy policies for Australia. </p>
<p>A look at the privacy policies of various companies supplying connected cars in Australia reveals several vague, broad statements. Aside from using your data to provide you with connected services, these companies will:</p>
<ul>
<li>disclose it to others for “<a href="https://www.audi.com.au/au/web/en/audi-connect-plus.html#layer=/au/web/en/privacy-policy.html">customer research</a>”</li>
<li>use it to “<a href="https://www.kia.com/au/util/privacy.html">profile</a>” the type of person interested in their products </li>
<li>use it, along with “related companies” around the world, for vague “<a href="https://www.toyota.com.au/privacy-policy">data analysis</a>” and “<a href="https://www.toyota.com.au/privacy-connected">research and development purposes</a>” or </li>
<li>provide the data to unspecified “<a href="https://www.hyundai.com/au/en/privacy/bluelink-privacy-collection-notice">third parties</a> in connection with” developing new “marketing strategies”.</li>
</ul>
<p>Some may disclose your information to law enforcement or the government even when not required by law, such as when they believe “the use or disclosure is <a href="https://www.kia.com/au/util/privacy.html">reasonably necessary to assist</a> a law enforcement agency”.</p>
<h2>Trust us – we invented a ‘voluntary code’</h2>
<p>It’s safe to say car manufacturers generally don’t want privacy laws tightened. The <a href="https://www.fcai.com.au/about">Federal Chamber of Automotive Industries</a> (FCAI) represents companies distributing 68 brands of various types of vehicles in Australia.</p>
<p>During the recent review of our privacy legislation, the FCAI made a submission to the Attorney General’s department arguing against many of the privacy <a href="https://www.ag.gov.au/sites/default/files/2021-01/federal-chamber-of-automotive-industries.PDF">law reforms under consideration</a>. </p>
<p>Instead, it promoted its own <a href="https://www.fcai.com.au/news/codes-of-practice/view/publication/172#:%7E:text=The%20FCAI%20members%20have%20voluntarily%20agreed%20that%20the,and%20use%20of%20vehicle%20data%20and%20personal%20information.">Voluntary Code of Conduct for Automotive Data and Privacy Protection</a>. This weak document seems designed to comfort consumers without adding any privacy protections beyond existing legal obligations. </p>
<p>For example, signatories don’t say they’re bound by the code. Nor do they promise to follow its terms. They only say its principles will “drive their approach to treatment of vehicle-generated data and associated personal information”. There are no penalties for ignoring the code. </p>
<p>It even states signatories will “voluntarily notify” consumers of certain matters when the Privacy Act already requires this as a matter of law.</p>
<p>The code also notes third parties are increasingly interested in accessing and using consumers’ data to provide services, including insurance companies, parking garage operators, entertainment providers, social networks and search engine operators. </p>
<p>It says companies making data available to such third parties “will strive to inform you” about this.</p>
<h2>We need privacy law reform</h2>
<p>The government recently proposed important and <a href="https://www.ag.gov.au/rights-and-protections/publications/government-response-privacy-act-review-report">wide-ranging privacy law reforms</a>, following the Privacy Act Review which began in 2020. These changes are long overdue. </p>
<p>Proposals such as an updated definition of “personal information” and higher standards for “consent” could help protect consumers from intrusive and manipulative data practices.</p>
<p>The proposed “fair and reasonable test” would also assess whether a practice is substantively fair. This would help avoid claims data practices are lawful just because consumers had to provide consent.</p>
<p>The FCAI points out many cars aren’t specifically designed for Australia’s relatively small market, so increased privacy standards might result in some vehicles not being released here. But this isn’t a reason to carve out vehicles from privacy law reform.</p>
<p>Privacy laws are also being upgraded in numerous jurisdictions overseas. Australia’s government agencies should coordinate with their international counterparts to protect drivers’ privacy. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/to-steal-todays-computerized-cars-thieves-go-high-tech-210358">To steal today's computerized cars, thieves go high-tech</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Katharine Kemp receives funding from the UNSW Allens Hub for Technology, Law and Innovation. She is a Member of the Expert Panel of the Consumer Policy Research Centre, and the Australian Privacy Foundation.</span></em></p>
<p>Cars can collect data via cameras, microphones, sensors, and connected phones and apps. Our privacy laws need urgent reform if these data are to be kept safe.</p>
<p class="fine-print"><em>Katharine Kemp, Associate Professor, Faculty of Law & Justice, and Deputy Director, Allens Hub for Technology, Law & Innovation, UNSW Sydney. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>NZ police are using AI to catch criminals – but the law urgently needs to catch up too</h1>
<p class="fine-print">2023-10-12</p>
<p>The use of artificial intelligence (AI) by New Zealand police is putting the spotlight on policing tactics in the 21st century.</p>
<p>A recent Official Information Act request <a href="https://www.rnz.co.nz/news/national/499181/new-police-intelligence-tool-speedily-sending-information-on-risk-to-frontline-officers">by Radio New Zealand</a> revealed the use of SearchX, an AI tool that can draw connections between suspects and their wider networks.</p>
<p>SearchX works by instantly finding connections between people, locations, criminal charges and other factors likely to increase the risk of harm to officers.</p>
<p>Police say SearchX is at the heart of a NZ$200 million front-line safety programme, primarily developed after the death of police constable Matthew Hunt in West Auckland in 2020, as well as other recent gun violence.</p>
<p>But the use of SearchX and other AI programmes raises questions about the invasive nature of the technology, inherent biases and whether New Zealand’s current legal framework will be enough to protect the rights of everyone.</p>
<h2>Controversial technologies</h2>
<p>At this stage, New Zealanders only have a limited view of the AI programmes being used by the police. While some of the programmes are <a href="https://www.police.govt.nz/sites/default/files/publications/technology-capabilities-list.pdf">public</a>, others are being <a href="https://thebit.nz/deep-dive/the-high-tech-stocktake-new-zealand-police-didnt-want-you-to-see/">kept under wraps</a>.</p>
<p>Police have acknowledged using <a href="https://www.stuff.co.nz/national/crime/300176755/police-using-technology-riddled-with-controversy-overseas">Cellebrite</a>, a controversial <a href="https://www.stuff.co.nz/national/crime/300176755/police-using-technology-riddled-with-controversy-overseas">phone hacker</a> technology. This programme extracts personal data from iPhones and Android mobiles and can access more than 50 social media platforms, including Instagram and Facebook.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-profiling-the-social-and-moral-hazards-of-predictive-policing-92960">AI profiling: the social and moral hazards of 'predictive' policing</a>
</strong>
</em>
</p>
<hr>
<p>The police have also acknowledged using <a href="https://www.police.govt.nz/sites/default/files/publications/technology-capabilities-list.pdf">BriefCam</a>, which aggregates video footage, including facial recognition and vehicle licence plates. </p>
<p>BriefCam allows police to focus on and track a person or vehicle of interest. Police <a href="https://www.rnz.co.nz/news/national/490353/ai-used-across-multiple-departments-in-camera-surveillance">claim</a> BriefCam can reduce the time spent analysing CCTV footage from three months to two hours.</p>
<p>Other AI tools such as <a href="https://sci-hub.se/10.1177/2032284420948161">Clearview AI</a> – which takes photographs from publicly accessible social media sites to identify a person – were tested by police before being abandoned. </p>
<p>The use of Clearview was <a href="https://www.rnz.co.nz/news/national/416483/police-trialled-facial-recognition-tech-without-clearance">particularly controversial</a> as it was trialled without the clearance of the police leadership team or the Privacy Commissioner. </p>
<h2>Eroding privacy?</h2>
<p>The promise of AI is that it can <a href="https://daily.jstor.org/what-happens-when-police-use-ai-to-predict-and-prevent-crime/">predict and prevent crime</a>. But there are also concerns over the use of these tools by police.</p>
<p>Cellebrite and Briefcam are highly intrusive programmes. They enable law enforcement to access and analyse personal data without people realising, much less providing consent. </p>
<p>But under current legislation, the use of both programmes by police is legal.</p>
<p>The Privacy Act 2020 allows government agencies – including police – to collect, withhold, use or disclose personal information in a way that would otherwise breach the act, where necessary for the “<a href="https://privacy.org.nz/tools/knowledge-base/view/252">maintenance of the law</a>”.</p>
<h2>AI’s biased decisions</h2>
<p>Privacy is not the only issue raised by the use of these programmes. There is a tendency to assume decisions made by AI are more accurate than those made by humans – particularly as <a href="https://www.nature.com/articles/s41598-021-87480-9">tasks become more difficult</a>. </p>
<p>This bias in favour of AI decisions means investigations may <a href="https://innocenceproject.org/news/when-artificial-intelligence-gets-it-wrong/">harden towards the AI-identified perpetrator</a> rather than other suspects. </p>
<p>Some of the mistakes can be tied to <a href="https://bhm.scholasticahq.com/article/38021-misguided-artificial-intelligence-how-racial-bias-is-built-into-clinical-models">biases in the algorithms</a>. In the past decade, <a href="https://virginia-eubanks.com/automating-inequality/">scholars have begun to document</a> the negative impacts of AI on people with low incomes and the working class, particularly in the justice system. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability-132667">Australian police are using the Clearview AI facial recognition system with no accountability</a>
</strong>
</em>
</p>
<hr>
<p>Research has shown ethnic minorities are more likely to be <a href="https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/">misidentified by facial recognition software</a>. </p>
<p>AI’s use in predictive policing is also an issue as AI can be fed data from <a href="https://link.springer.com/article/10.1007/s12027-020-00602-0">over-policed neighbourhoods</a>, which fails to record crime occurring in other neighbourhoods. </p>
<p>The <a href="https://daily.jstor.org/what-happens-when-police-use-ai-to-predict-and-prevent-crime/">bias is compounded</a> further as AI increasingly directs police patrols and other surveillance onto these already over-policed neighbourhoods. </p>
<p>This is not just a problem overseas. Analyses of the <a href="https://www.otago.ac.nz/__data/assets/pdf_file/0027/312588/https-wwwotagoacnz-caipp-otago711816pdf-711816.pdf">New Zealand government’s use of AI</a> have raised a number of concerns, such as the issue of transparency and privacy, as well as how to manage “dirty data” – data with human biases already baked in before it is entered into AI programmes. </p>
<h2>We need updated laws</h2>
<p>There is no legal framework for the use of AI in New Zealand, much less for its use by police. This lack of regulation is not unique, though. Europe’s long-awaited AI law <a href="https://www.europarl.europa.eu/news/en/headlines/society/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence">still hasn’t been implemented</a>.</p>
<p>That said, New Zealand Police is a signatory to the <a href="https://www.anzpaa.org.au/homepage-announcements/australia-new-zealand-police-artificial-intelligence-principles">Australia New Zealand Police Artificial Intelligence Principles</a>. These establish guidelines around transparency, proportionality and justifiability, human oversight, explainability, fairness, reliability, accountability, privacy and security. </p>
<p>The <a href="https://www.police.govt.nz/sites/default/files/publications/algorithm-charter-english.pdf">Algorithm Charter for Aotearoa New Zealand</a> covers the ethical and responsible use of AI by government agencies.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-could-be-a-force-for-good-but-were-currently-heading-for-a-darker-future-124941">AI could be a force for good – but we're currently heading for a darker future</a>
</strong>
</em>
</p>
<hr>
<p>Under the principles, police are meant to continuously monitor, test and develop AI systems and ensure data are relevant and contemporary. Under the charter, police must have a point of contact for public inquiries and a channel for challenging or appealing decisions made by AI. </p>
<p>But these are both voluntary codes, leaving significant gaps in legal accountability and room for police resistance. </p>
<p>And it’s not looking good so far. Police have failed to implement one of the first – and most basic – steps of the charter: to establish a point of inquiry for people who are concerned by the use of AI. </p>
<p>There is no special page on the police website dealing with the use of AI, nor is there anything on the main feedback page specifically mentioning the topic. </p>
<p>In the absence of a clear legal framework, with an independent body monitoring the police’s actions and enforcing the law, New Zealanders are left relying on police to monitor themselves. </p>
<p>AI is <a href="https://theconversation.com/nzs-political-leaders-are-ignoring-the-mounting-threats-from-ai-and-thats-putting-everyone-at-risk-214714">barely on the radar</a> ahead of the 2023 election. But as it becomes more pervasive across government agencies, New Zealand must follow Europe’s lead and enact AI regulation to ensure police use of AI doesn’t cause more problems than it solves.</p><img src="https://counter.theconversation.com/content/214833/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alexandra Sims does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Current laws governing policing don’t take into account the capacity of AI to process massive amounts of information quickly – leaving New Zealanders vulnerable to police overreach.Alexandra Sims, Associate Professor in Commercial Law, University of Auckland, Waipapa Taumata RauLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2021272023-03-22T00:06:33Z2023-03-22T00:06:33ZPopular fertility apps are engaging in widespread misuse of data, including on sex, periods and pregnancy<figure><img src="https://images.theconversation.com/files/516601/original/file-20230321-690-se9b8m.jpeg?ixlib=rb-1.1.0&rect=24%2C58%2C3210%2C2095&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> </figcaption></figure><p>New research reveals serious privacy flaws in fertility apps used by Australian consumers – emphasising the need for urgent reform of the Privacy Act.</p>
<p>Fertility apps provide a number of features. For instance, they may help users track their periods, identify a “fertile window” if they’re trying to conceive, track different stages and symptoms of pregnancy, and prepare for parenthood up until the baby’s birth. </p>
<p>These apps collect deeply sensitive data about consumers’ sex lives, health, emotional states and menstrual cycles. And many of them are intended for use by children as young as 13. </p>
<p>My report <a href="https://allenshub.unsw.edu.au/sites/default/files/2023-03/KKemp%20Your%20Body%20Our%20Data%2022.03.23.pdf">published today</a> analysed the privacy policies, messages and settings of 12 of the most popular fertility apps used by Australian consumers (excluding apps that require a connection with a wearable device). </p>
<p>This analysis uncovered a number of concerning practices by these apps including:</p>
<ul>
<li>confusing and misleading privacy messages</li>
<li>a lack of choice in how data are used</li>
<li>inadequate de-identification measures when data are shared with other organisations</li>
<li>retention of data for years even after a consumer stops using the app, exposing them to unnecessary risk from potential data breaches.</li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/proposed-privacy-reforms-could-help-australia-play-catch-up-with-other-nations-but-they-fail-to-tackle-targeted-ads-200166">Proposed privacy reforms could help Australia play catch-up with other nations. But they fail to tackle targeted ads</a>
</strong>
</em>
</p>
<hr>
<h2>The data collected</h2>
<p>The apps in this study collect intimate data from consumers, such as:</p>
<ul>
<li>their pregnancy test results</li>
<li>when they have sex and whether they had an orgasm</li>
<li>whether they used a condom or “withdrawal” method</li>
<li>when they have their period</li>
<li>how their moods change (including anxiety, panic and depression)</li>
<li>and if they have health conditions such as polycystic ovary syndrome, endometriosis or uterine fibroids. </li>
</ul>
<p>Some ask for unnecessary details, such as when a user smokes and drinks alcohol, their education level, whether they struggle to pay their bills, if they feel safe at home, and whether they have stable housing.</p>
<p>They also track which support groups you join, what you add to your “to-do list” or “questions for doctor”, and which articles you read. All of this creates a more detailed picture of your health, family situation and intentions.</p>
<h2>Confusing or misleading privacy messages</h2>
<p>Consumers should expect the clearest information about how such data are collected, used and disclosed. Yet we found some of the messaging is highly confusing or misleading.</p>
<p>Some apps say “we will never sell your data”. But the fine print of the privacy policy contains a term that allows them to sell all your data as part of the sale of the app or database to another company. </p>
<p>This possibility is not just theoretical. Of the 12 apps included in the study, one was previously taken over by a drug development company, and another two by a digital media company.</p>
<p>Other apps explain privacy settings using language that makes it almost impossible for a consumer to understand what they are choosing, or obscure the privacy settings by placing them numerous clicks and scrolls away from the home screen. </p>
<h2>Keeping sensitive data for too long</h2>
<p>The <a href="https://www.abc.net.au/news/2022-10-21/medibank-optus-data-hack/101558932">major data breaches</a> of the past six months highlight the risks of companies holding onto personal data longer than necessary. </p>
<p>Breaches of highly sensitive information about health and sexual activities could lead to <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4387341">discrimination, exploitation, humiliation or blackmail</a>. </p>
<p>Most of the apps we analysed keep user data for at least three years after the user quits the app – or seven years in the case of one brand. Some apps give no indication of when user data will be deleted. </p>
<h2>Can’t count on ‘de-identification’</h2>
<p>Some apps also give consumers no choice regarding whether their “de-identified” health data will be sold or transferred to other companies for research or business. Or, they opt consumers in to these extra uses by default, putting the onus on users to opt out.</p>
<p>Moreover, some of these data are not truly de-identified. For example, removing your name and email address and replacing it with a unique number is not de-identification for legal purposes. Someone would only need to work out the link between your name and that number in order to link your whole record with you.</p>
<p>When supposedly de-identified Medicare records were published in 2016, <a href="https://www.unimelb.edu.au/newsroom/news/2017/december/research-reveals-de-identified-patient-data-can-be-re-identified">University of Melbourne researchers</a> showed how just a few data points can connect a de-identified record to a unique individual.</p>
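The Melbourne researchers’ point can be illustrated with a toy example. Everything below — the records, attribute names and values — is hypothetical; the sketch only shows why a handful of remaining quasi-identifiers can single out one “de-identified” row.

```python
# Illustrative sketch (hypothetical data): names are removed and replaced
# with an opaque ID, but quasi-identifiers such as birth year, postcode
# and a procedure date remain in each record.
records = [
    {"id": "a91", "birth_year": 1978, "postcode": "3052", "procedure_date": "2014-03-02"},
    {"id": "b27", "birth_year": 1978, "postcode": "3052", "procedure_date": "2016-07-19"},
    {"id": "c55", "birth_year": 1991, "postcode": "2000", "procedure_date": "2014-03-02"},
]

def reidentify(records, **known):
    """Return every record consistent with the attacker's background knowledge."""
    return [r for r in records
            if all(r.get(k) == v for k, v in known.items())]

# An attacker who knows a person's birth year, postcode and the date of
# one procedure (say, mentioned on social media) narrows the dataset to
# a single row, linking the whole record back to that person.
matches = reidentify(records, birth_year=1978, postcode="3052",
                     procedure_date="2016-07-19")
print(matches)  # exactly one record remains
```

The opaque ID does nothing to prevent this: re-identification works on the attributes left in the data, not on the label attached to them.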
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/post-roe-women-in-america-are-right-to-be-concerned-about-digital-surveillance-and-its-not-just-period-tracking-apps-185865">Post Roe, women in America are right to be concerned about digital surveillance – and it’s not just period-tracking apps</a>
</strong>
</em>
</p>
<hr>
<h2>Need for reform</h2>
<p>This research highlights the unfair and unsafe data practices consumers are subjected to when they use fertility apps. And these findings reinforce the need for Australia’s privacy laws to be updated. </p>
<p>We need improvements in what data are covered by the Privacy Act, what choices consumers can make about their data, what data uses are prohibited, and what security systems companies must have in place.</p>
<p>The government is seeking <a href="https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report">submissions</a> on potential privacy law reforms until March 31. </p>
<p>In the meantime, if you’re using a fertility app, there are some steps you can take to help reduce some of the privacy risks: </p>
<ol>
<li>when launching the app for the first time, decline tracking of your data; you can also limit ad tracking via your iPhone’s device settings</li>
<li>don’t log in via a social media account</li>
<li>don’t answer questions or add data you don’t need to for your own purposes</li>
<li>don’t share your Apple Health or FitBit data</li>
<li>if the app provides privacy choices, opt out of tracking and having your data sold or used for research, and delete your data when you stop using the app</li>
<li>bear in mind that every article you read (and how long you spend on it), every group you join, and every comment you make there may be added to a profile about you. </li>
</ol>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/after-roe-v-wade-heres-how-women-could-adopt-spycraft-to-avoid-tracking-and-prosecution-186046">After Roe v Wade, here's how women could adopt 'spycraft' to avoid tracking and prosecution</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/202127/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.</span></em></p>An analysis of 12 popular apps’ privacy policies reveals a number of concerns, including confusing privacy messages and unnecessarily long data retention windows.Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2001662023-02-20T05:50:02Z2023-02-20T05:50:02ZProposed privacy reforms could help Australia play catch-up with other nations. But they fail to tackle targeted ads<figure><img src="https://images.theconversation.com/files/511077/original/file-20230220-19-p8vr96.jpeg?ixlib=rb-1.1.0&rect=123%2C6%2C4461%2C3052&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>In the recently released <a href="https://www.ag.gov.au/sites/default/files/2023-02/privacy-act-review-report.pdf">Privacy Act Review Report</a>, the Attorney-General’s Department makes numerous important proposals that could see the legislation, enacted in 1988, begin to catch up to leading privacy laws globally.</p>
<p>Among the positive proposed changes are: more realistic definitions of personal information and consent, tighter limits on data retention, a right to erasure, and a requirement for data practices to be fair and reasonable. </p>
<p>However, the report’s proposals on targeted advertising don’t properly address the power imbalance between companies and consumers. Instead, they largely accept a status quo that sacrifices consumer privacy to the demands of online targeted ad businesses.</p>
<h2>Capturing personal information used to track and profile</h2>
<p>Obligations under the existing Privacy Act only apply to “personal information”, but there has been legal uncertainty about what exactly constitutes “personal information”. </p>
<p>Currently, companies can track an individual’s online behaviour across different websites and connect it with their offline movements by matching their data with data collected from third parties, such as retailers or <a href="https://www.oracle.com/au/cx/advertising/data-enrichment-measurement/#data-enrichment">data brokers</a>. </p>
<p>Some of these companies claim they’re not dealing in “personal information” since they don’t use the individual’s name or email address. Instead, the matching is done based on a unique identifier allocated to that person – such as a <a href="https://help.abc.net.au/hc/en-us/articles/4402890310671">hashed email</a>, for example.</p>
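A hashed email works as a matching key because the same input always produces the same output. The sketch below is an assumption-laden illustration, not any particular company’s pipeline: the normalisation rules and choice of SHA-256 are stand-ins.

```python
import hashlib

def hashed_email(email: str) -> str:
    """Derive a stable pseudonymous identifier from an email address.

    Illustrative only: real ad-tech pipelines differ in how they
    normalise addresses and which hash function they use.
    """
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Two unrelated companies hash the same customer's email independently...
retailer_id = hashed_email("Jane.Doe@example.com")
platform_id = hashed_email("  jane.doe@example.com ")

# ...and arrive at the same identifier, so their datasets can be joined
# on it without either side ever exchanging the name or raw address.
assert retailer_id == platform_id
```

This is why a hashed email is still effectively “personal information”: it identifies one person consistently across datasets, even though it doesn’t look like a name.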
<p>The report proposes an expanded definition of “personal information” that clearly includes the various technical and online identifiers being used to track and profile consumers. Under this definition, companies could no longer claim such data collection and sharing are outside the scope of the Privacy Act. </p>
<h2>Improved consent (when required)</h2>
<p>The report also proposes higher standards for how consent is sought, in cases where the act requires it. This would require voluntary, informed, current, specific and unambiguous consent.</p>
<p>This would work against organisations claiming consumers have consented to unexpected data uses just because they used a website or an app with a link to a broadly worded privacy policy with take-it-or-leave-it terms. </p>
<p>For example, companies would need to demonstrate the higher standard of consent to collect sensitive information about someone’s mental health or sexual orientation. The report also proposes that some further data practices, such as precise geolocation tracking, should require consent.</p>
<p>However, it specifically states consent should not be required for some targeted ad practices. Yet <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">surveys</a> show most consumers regard these as misuses of their personal information.</p>
<h2>‘Fair and reasonable’ data practices</h2>
<p>The report proposes a “fair and reasonable” test for dealings with personal information in general.</p>
<p>This recognises that consumers are saddled with too much of the responsibility for managing how their personal information is collected and used, while they lack the information, resources, expertise and control to do this effectively.</p>
<p>Instead, organisations covered by the Privacy Act should ensure their data handling practices are “fair and reasonable”, regardless of whether they have consumer consent. This would include considering whether a reasonable person would expect the data to be collected, used or disclosed in that way, and whether any dealing with children’s information is in the best interests of the child.</p>
<h2>Prohibiting targeted ads based on sensitive information</h2>
<p>The report proposes the prohibition of targeting based on sensitive information and traits. However, it’s not always easy to draw the line between “sensitive” information or traits, and other personal information. </p>
<p>For instance, is having an interest in “cosmetic procedures” or “rapid weight loss” a sensitive trait, or a general reading interest? Companies may exploit such grey areas. So while prohibiting targeting based on sensitive information is appropriate, it’s not enough in itself.</p>
<p>Another loophole arises in the report’s proposal that consumer consent should be necessary before an organisation trades in their personal information. The report leaves open an exception to this consent requirement where the “trading” is reasonably necessary for an organisation’s functions or activities.</p>
<p>This may be a substantial exception: data brokers, for example, might argue their trade in personal information (without consumers’ knowledge or consent) is necessary.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-your-phone-really-listening-to-your-conversations-well-turns-out-it-doesnt-have-to-162172">Is your phone really listening to your conversations? Well, turns out it doesn't have to</a>
</strong>
</em>
</p>
<hr>
<h2>Opt out only, not opt in</h2>
<p>Both the <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">ACCC</a> and the <a href="https://assets.publishing.service.gov.uk/media/5fa557668fa8f5788db46efc/Final_report_Digital_ALT_TEXT.pdf">UK Competition & Markets Authority</a> have recommended consumers should opt <em>in</em> to the use of their personal information for targeted advertising if they wish to see this content.</p>
<p>But the report proposes individuals should only be allowed to opt <em>out</em> of “seeing” targeted ads. This still wouldn’t stop companies from collecting, using and disclosing a user’s personal information for broader targeting purposes.</p>
<p>Even if a consumer opts out of seeing targeted ads, a business may continue to collect their personal information to create “lookalike audiences” and target other people with similar attributes. </p>
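A “lookalike audience” can be sketched as a simple similarity search over user attributes. The vectors and threshold below are invented for illustration; production systems are far more elaborate, but the principle is the same.

```python
import math

def cosine(a, b):
    """Cosine similarity between two attribute vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical interest-score vectors. The "seed" user opted out of
# seeing targeted ads, but their data can still power targeting.
seed = [0.9, 0.1, 0.8]
others = {
    "user_a": [0.85, 0.2, 0.75],  # similar interests to the seed
    "user_b": [0.05, 0.9, 0.1],   # very different interests
}

# A lookalike audience is simply the set of users most similar to the seed.
lookalikes = [u for u, v in others.items() if cosine(seed, v) > 0.9]
print(lookalikes)
```

The opted-out user sees nothing, yet their attribute profile determines who else gets targeted.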
<p>Although having the option to opt out of seeing targeted ads gives consumers some limited control, companies still control the “<a href="https://www.accc.gov.au/system/files/DPB%20-%20DPSI%20-%20September%202021%20-%20Full%20Report%20-%2030%20September%202021%20%283%29_1.pdf">choice architecture</a>” of such settings. They can use their control to make opting out <a href="https://cprc.org.au/dupedbydesign/">confusing and difficult</a> for users, by forcing them to navigate through multiple pages or websites with obscurely labelled settings. </p>
<h2>Are targeted ads necessary to support online services?</h2>
<p>This limitation of consumers’ choices was partly explained by the view of the Attorney-General’s Department that targeted ads are necessary to fund “free” services. This refers to services where consumers “pay” with their attention and data (which companies use to make revenue from targeted advertising).</p>
<p>However, many companies using customers’ personal information for targeted ad businesses aren’t providing free services. Consider online marketplaces such as Amazon or eBay, or subscription-based products of media companies such as NewsCorp and Nine.</p>
<p>Meta (Facebook) and the Interactive Advertising Bureau Australia argued that if consumers opt out of targeted ads, a company should be able to stop offering them the service in question. This proposal was rejected on the basis that a platform can still show non-targeted ads to such consumers.</p>
<p>Inconsistently, the report failed to question broader claims that targeted advertising – as opposed to less intrusive forms of advertising – must be protected for online services to be viable. </p>
<h2>Real change is needed</h2>
<p>The reform of our privacy laws is long overdue. The government should avoid watering down potential improvements by attempting to preserve the status quo dictated by large businesses. </p>
<p>The government is seeking <a href="https://ministers.ag.gov.au/media-centre/landmark-privacy-act-review-report-released-16-02-2023">feedback on the report</a> until March 31. It will then decide on the final form of the reforms it proposes, before these are debated in Parliament. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/this-law-makes-it-illegal-for-companies-to-collect-third-party-data-to-profile-you-but-they-do-anyway-190758">This law makes it illegal for companies to collect third-party data to profile you. But they do anyway</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/200166/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.</span></em></p>The proposals from the Attorney-General’s Department could help bolster Australia’s privacy laws — but there are some deficiencies.Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2000792023-02-17T04:51:52Z2023-02-17T04:51:52ZGovernment’s privacy review has some strong recommendations – now we really need action<p>Attorney-General Mark Dreyfus yesterday <a href="https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report">released a report</a> with 30 proposals for updating Australia’s privacy regime. The proposals are practical, necessary and overdue. However, they are just proposals, which have been made several times in the past before disappearing into the “too hard basket” of the Australian, state and territory governments. </p>
<p>We can expect to see lots of noise about specific proposals and hope the Albanese government (copied by state/territory counterparts) gives us the legislation we need.</p>
<h2>Making sense of the report</h2>
<p>At a superficial level, the report gives effect to an election commitment – a promise to do something about federal privacy law, which is centred on public/private data collection and use (often online), rather than <a href="https://www.oaic.gov.au/privacy/privacy-in-your-state">state/territory</a> law dealing with activity such as strip searches, public hospital records, hidden cameras in toilets or senior figures distributing nude <a href="https://www.theguardian.com/australia-news/2023/feb/15/nsw-premier-stands-by-mp-peter-poulos-who-leaked-explicit-photos-of-female-rival">photos</a> of rivals. </p>
<p>More deeply, it is a recognition that, as part of the global economy where data and investment flow across borders, Australia continues to limp behind law and administration where protecting privacy is concerned. Updating the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">Privacy Act</a> also reflects recognition of challenges facing business and government in the world of ransomware, big data and artificial intelligence. </p>
<p>Unhappiness with the “she’ll be right, mate” approach of some large organisations and the failure of the key national privacy regulator (under-resourced, under-skilled and slow to act) was evident in the recent Optus and Medibank data breaches.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ive-given-out-my-medicare-number-how-worried-should-i-be-about-the-latest-optus-data-breach-191575">I've given out my Medicare number. How worried should I be about the latest Optus data breach?</a>
</strong>
</em>
</p>
<hr>
<p>The proposals are not new. They have been voiced in detailed law reform commission reports, national and state parliamentary committee reports, statements by independent bodies such as the Law Council and academics over the past 20 years. The lack of action to date means Australians might be sceptical about what will happen once the government is lobbied by those whose interests are served by keeping things as they are, and it is again tempted to kick the can down the road.</p>
<h2>What do the proposals cover?</h2>
<p>It is important to remember that states and territories have significant responsibilities regarding privacy. The proposal to set up a working party involving those governments provokes thought about why that hasn’t been done already.</p>
<p>The initial proposal calls for changing the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">1988 Privacy Act</a> to explicitly recognise that privacy is in the public interest, something that shouldn’t be controversial and offsets the absence of a human rights framework in the national constitution. After that, we are into some positive steps forward. However, these are tempered by a lot of “let’s wait and see the administration” before starting to celebrate.</p>
<p>The report retains the overall structure of the 1988 Act but, crucially, extends its coverage, in particular on what is “personal information”. It calls for consultation about criminal penalties and for prohibiting some of the ways organisations have got around restrictions. </p>
<p>It proposes consultation about removing the exemption for small businesses (those with annual turnover under A$3 million) and about the handling of employee records. The major <a href="https://www.alrc.gov.au/publication/for-your-information-australian-privacy-law-and-practice-alrc-report-108/41-political-exemption/exemption-for-registered-political-parties-political-acts-and-practices/">exclusion</a> of political parties – a common source of unhappiness – would be modified. Journalists would be expected to behave better.</p>
<p>The report emphasises meaningful consent. In the collection of personal information, consent must be </p>
<blockquote>
<p>voluntary, informed, current, specific and unambiguous.</p>
</blockquote>
<p>This would bring Australia into line with Europe and indeed with much of our existing law, such as that administered by the Australian Competition and Consumer Commission.</p>
<p>We can expect controversy about a proposed right of “erasure” and about “de-indexing”. This is referred to as the “right to obscurity” in Europe, and means some personal information stays online but is not highlighted in search engine results. Individuals would need to ask for that obscurity, and it would not be granted for serious criminal offences.</p>
<p>There have been recurrent proposals for a “privacy tort”: this means people whose privacy has been seriously invaded could take action in a court to stop the invasion and/or gain compensation. </p>
<p>The report endorses <a href="https://www.alrc.gov.au/publication/serious-invasions-of-privacy-in-the-digital-era-alrc-report-123/4-a-new-tort-in-a-new-commonwealth-act-2/">this</a> recommendation by the Australian Law Reform Commission. It also proposes a “direct right of action” under the current act. This implicitly offsets the weakness of the Office of the Australian Information Commissioner (OAIC), one of the two national information privacy watchdogs.</p>
<p>The report grapples with data breaches such as the recent Optus and Medibank incidents. Proposals regarding mandatory reporting of such breaches tweak the current regime. </p>
<p>There is likely to be more push-back from business and public sector organisations regarding a proposed requirement for those bodies to “identify, mitigate and redress actual and reasonably foreseeable loss”. This is a first step towards persuading organisations to meaningfully lift their game and compensate for harms.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/heres-how-tech-giants-profit-from-invading-our-privacy-and-how-we-can-start-taking-it-back-120078">Here's how tech giants profit from invading our privacy, and how we can start taking it back</a>
</strong>
</em>
</p>
<hr>
<h2>It’s too soon to cheer</h2>
<p>On the surface, the report is a major step forward, something that business and the community should strongly endorse. In practice, we need to look beyond the headlines and see the details of how the proposals would be written into law, and whether the attorney-general can harness support in the face of the usual strong lobbying. </p>
<p>Proposals that there will be discussion, yet again, don’t provide much comfort. More worryingly, the proposals centre on the development and implementation of guidelines and standards by the OAIC. </p>
<p>In practice, the report proposes to perpetuate existing problems involving a regulator with a <a href="https://www.sciencedirect.com/science/article/abs/pii/S0167739X20329940">timid</a> corporate culture and a commitment to interpreting the legislation through the eyes of the bodies it is meant to <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4083468">regulate</a>. Change is better than good intentions.</p><img src="https://counter.theconversation.com/content/200079/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dr Arnold is a former board member of the Australian Privacy Foundation and a former member of OECD data protection working parties.</span></em></p>There are many good proposals in Dreyfus’s reform paper. But they risk being lost once again among the voices of those whose interests are served by maintaining the status quo.Bruce Baer Arnold, Associate Professor, School of Law, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1935332022-11-08T05:06:20Z2022-11-08T05:06:20ZJust 25% of businesses are insured against cyber attacks. Here’s why<figure><img src="https://images.theconversation.com/files/493730/original/file-20221107-19-ppqbvf.jpg?ixlib=rb-1.1.0&rect=0%2C243%2C6490%2C3250&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>In the past financial year, the Australian Cyber Security Centre received <a href="https://www.cyber.gov.au/acsc/view-all-content/reports-and-statistics/acsc-annual-cyber-threat-report-july-2021-june-2022">76,000 cyber-crime reports</a> – on average, one every seven minutes. The year before, it was a report every eight minutes. The year before that, every ten minutes.</p>
<p>The growth of cyber crime means it is now arguably the <a href="https://www.aon.com/2021-global-risk-management-survey/index.html">top risk facing any business</a> with an online presence. One successful cyber attack is all it takes to ruin an organisation’s reputation and bottom line. The estimated cost to the Australian economy in <a href="https://www.unsw.adfa.edu.au/newsroom/news/cybercrime-estimated-42-billion-cost-australian-economy">2021 was $42 billion</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-are-there-so-many-data-breaches-a-growing-industry-of-criminals-is-brokering-in-stolen-data-193015">Why are there so many data breaches? A growing industry of criminals is brokering in stolen data</a>
</strong>
</em>
</p>
<hr>
<p>To protect itself (and its customers), a business has three main options. It can limit the amount of sensitive data it stores. It can take greater care to protect the data it does store. And it can insure itself against the consequences of a cyber attack.</p>
<p>Cyber-insurance is a broad term for insurance policies that address losses as a result of a computer-based attack or malfunction of a firm’s information technology systems. This can include costs associated with business interruptions, responding to the incident and paying relevant fines and penalties.</p>
<p>The global cyber-insurance market is now worth an estimated US$9 billion (A$13.9 billion). It is tipped to grow to <a href="https://www.munichre.com/content/dam/munichre/contentlounge/website-pieces/documents/MunichRe-Topics-Cyber-Whitepaper-2022.pdf/_jcr_content/renditions/original./MunichRe-Topics-Cyber-Whitepaper-2022.pdf">US$22 billion by 2025</a>. </p>
<p>But a big part of this growth reflects escalating premium costs – in Australia they increased more <a href="https://www.insurancebusinessmag.com/au/news/cyber/whats-driving-up-cyber-insurance-premiums-in-australia-417542.aspx">than 80% in 2021</a> – rather than more businesses taking up insurance. </p>
<p>So coverage rates are growing slowly, with about 75% of all businesses in Australia having no cyber-insurance, according to 2021 figures from the <a href="https://insurancecouncil.com.au/wp-content/uploads/2022/03/Cyber-Insurance_March2022-final.pdf">Insurance Council of Australia</a>.</p>
<h2>Challenges in pricing cyber-insurance</h2>
<p>With cyber-insurance still in its infancy, insurers face significant complexities in quantifying cyber risk and pricing premiums accordingly – high enough for the insurers not to lose money, but as competitive as possible to encourage greater uptake. </p>
<p>A 2018 assessment of the cyber-insurance market by the <a href="https://www.cisa.gov/sites/default/files/publications/20_0210_cisa_oce_cyber_insurance_market_assessment.pdf">US Cybersecurity and Infrastructure Security Agency</a> identified three major challenges: lack of data, methodological limitations, and lack of information sharing. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-cybercriminals-turn-paper-checks-stolen-from-mailboxes-into-bitcoin-173796">How cybercriminals turn paper checks stolen from mailboxes into bitcoin</a>
</strong>
</em>
</p>
<hr>
<p>Lack of historical loss data means insurers are hampered in accurately predicting risks and costs.</p>
<p>Because of the relative newness of cyber crime, many insurers use risk-assessment methodologies derived from more established insurance markets <a href="https://www.rand.org/pubs/external_publications/EP67850.html">such as for car, house and contents</a>. These markets, however, are not analogous to cyber crime. </p>
<p>Companies may be hesitant to disclose information about cyber incidents, unless required to do so. Insurance carriers are reluctant to share data pertaining to damage and claims. </p>
<p>This makes it hard to create effective risk models that can calculate and predict the likelihood and cost of future incidents. </p>
<h2>So what needs to be done?</h2>
<p>Deakin University’s <a href="https://cybercentre.org.au/">Centre for Cyber Security Research and Innovation</a> has been working with insurance companies to understand what must be done to improve premium and risk models pertaining to cyber insurance. </p>
<p>Here is what we have found so far.</p>
<p>First, greater transparency is needed around cyber-related incidents and insurance to help remedy the lack of data and information sharing. </p>
<p>The federal government has taken two steps in the right direction on this. </p>
<p>One is the <a href="https://www.accc.gov.au/focus-areas/consumer-data-right-cdr-0">Consumer Data Right</a>, which provides guidelines on how service providers must share data about customers. This came into effect in mid-2021. </p>
<p>The other is the government’s proposal to amend <a href="https://www.aph.gov.au/Parliamentary_Business/Bills_Legislation/Bills_Search_Results/Result?bId=r6940">privacy legislation</a> to increase penalties for breaches and give the Privacy Commissioner new powers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/after-the-optus-data-breach-australia-needs-mandatory-disclosure-laws-192612">After the Optus data breach, Australia needs mandatory disclosure laws</a>
</strong>
</em>
</p>
<hr>
<p>Second, insurers must find better ways to measure the financial value and worth of the data that organisations hold. </p>
<p>The primary asset covered by cyber insurance is the data itself. But there is no concrete measure of how much that data is worth. </p>
<p>The recent Optus and Medibank Private data breaches provide clear examples. The Optus event affected millions more people than the Medibank Private hack, but the Medibank Private breach involved <a href="https://www.afr.com/technology/privacy-fallout-from-medibank-hack-will-be-widespread-20221023-p5bs75">sensitive medical data</a> that, in principle, is worth far more than identity details alone.</p>
<p>Without an accurate way to measure the financial value of data, it is difficult to determine the appropriate premium costs and coverage.</p>
<p>Cyber insurance is a new, specialised market with significant uncertainty.
Given the ever-increasing risks to individuals, organisations and society, it is imperative that insurers develop robust and reliable risk-based models as soon as possible. </p>
<p>This will require a consolidated effort between cyber-security experts, accountants and actuaries, insurance professionals and policymakers.</p><img src="https://counter.theconversation.com/content/193533/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The work has been supported by the Cyber Security Cooperative Research Centre Limited whose activities are partially funded by the Australian Government’s Cooperative Research Centres Programme.</span></em></p><p class="fine-print"><em><span>Robin Doss does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Cyber crime is arguably the top risk now facing any business. But things need to change if cyber-insurance is to be viable for most.Jongkil Jay Jeong, CyberCRC Senior Research Fellow, Centre for Cyber Security Research and Innovation (CSRI), Deakin UniversityRobin Doss, Director, Centre for Cyber Security Research and Innovation (CSRI), Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1926122022-10-19T03:31:36Z2022-10-19T03:31:36ZAfter the Optus data breach, Australia needs mandatory disclosure laws<figure><img src="https://images.theconversation.com/files/490238/original/file-20221017-17274-9l47al.jpg?ixlib=rb-1.1.0&rect=0%2C895%2C6709%2C3571&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The Optus data breach, which has affected close to 10 million Australians, has sparked calls for changes to Australia’s privacy laws that would limit what personal data organisations can hold, and for how long. </p>
<p>Equally important is to strengthen obligations for organisations to publicly disclose data breaches. Optus made a public announcement about its breach, but was not legally required to do so. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-the-class-action-against-optus-could-be-australias-biggest-191515">Why the class action against Optus could be Australia's biggest</a>
</strong>
</em>
</p>
<hr>
<p>In fact, beyond the aggregated data produced by the Office of the Australian Information Commissioner, the public is not made aware of the vast majority of data breaches that occur in Australia every year. </p>
<p>Australia has had a “<a href="https://www.oaic.gov.au/privacy/notifiable-data-breaches">Notifiable Data Breaches</a>” scheme since February 2018 that requires all organisations to notify affected individuals, as well as the Office of the Australian Information Commissioner, in the case of a breach of personal information likely to result in serious harm. </p>
<p>However, no notification is required if the organisation takes remedial action to prevent harm. Most importantly, public disclosure is never required.</p>
<p>This gives a lot of discretion to organisations. They can make their own assessment about the risks and decide not to disclose a breach at all.</p>
<p>Companies listed on the Australian Securities Exchange (ASX) are also obliged to disclose any data breach expected to have a “material economic impact” on a company’s share price. But it is notoriously difficult to measure material economic impact. So these announcements are not a reliable source of information for the public.</p>
<h2>Notified data breaches</h2>
<p>While the <a href="https://www.oaic.gov.au/privacy/notifiable-data-breaches">Notifiable Data Breaches</a> scheme is a step in the right direction, it’s impossible to know if the disclosures made reflect the scale and scope of data breaches.</p>
<p>The most recent <a href="https://www.oaic.gov.au/__data/assets/pdf_file/0010/12205/Final-Notifiable-Data-Breaches-Report-Jul-Dec-2021.pdf">Notifiable Data Breaches Report</a>, covering the six months from July to December 2021, lists 464 notifications (up 6% from the previous period).</p>
<p>Of these, 256 (55%) were attributed to malicious or criminal attacks, and 190 (41%) to human error, such as emailing personal information to the wrong recipient, publishing information by accident, or losing data storage devices <a href="https://www.oaic.gov.au/__data/assets/pdf_file/0010/12205/Final-Notifiable-Data-Breaches-Report-Jul-Dec-2021.pdf">or paperwork</a>. Another 18 (4%) were attributed to system errors. </p>
<p>The sectors that reported the most breaches were the health care service (83 notifications); finance (56); and legal, accounting and management services (51). </p>
<p>About 70% of all incidents reportedly affected fewer than 100 people. But one event affected at least a million people. Despite the scale, the public has not been provided details of these events, or the identities of the organisations responsible. </p>
<hr>
<p><iframe id="Ccd5f" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/Ccd5f/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<p>Regardless of the scale or reason, all data breaches have an impact on people and organisations. Despite this, we rarely learn about anything other than the most spectacular and most criminal of these events.</p>
<p>Without mandatory disclosure, there is insufficient public accountability. </p>
<h2>How should minimum disclosure work?</h2>
<p>A minimum disclosure framework <a href="https://www.sciencedirect.com/science/article/pii/S1045235421001155">should include</a> information about the type of data breached, the sensitivity of the data, the cause and size of the breach, and the risk-mitigation strategies the organisation has adopted.</p>
<p>The framework should require both a standardised public announcement when any significant data breach occurs, as well as a mandatory annual public report of data breaches. Reports and announcements should be published on the company’s website (just like an annual report) and filed with the Office of the Australian Information Commissioner.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/optus-says-it-needed-to-keep-identity-data-for-six-years-but-did-it-really-191498">Optus says it needed to keep identity data for six years. But did it really?</a>
</strong>
</em>
</p>
<hr>
<p>This would ensure public access to a coherent historical record of breach-related events and organisational responses. The disclosures would allow community groups, regulators and interested parties to analyse breaches of our data and act accordingly.</p>
<p>At its simplest, a mandatory disclosure framework encourages annual disclosures that are comparable and publicly available. At the very least it creates opportunities for scrutiny and discussion.</p><img src="https://counter.theconversation.com/content/192612/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jane Andrew received funding from the Australian Research Council to study organisational data breach disclosure practices.</span></em></p><p class="fine-print"><em><span>Max Baker received funding from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Monique Sheehan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Optus made a public announcement about its breach but was not legally required to do so. This needs to change.Jane Andrew, Professor, University of Sydney Business School, University of SydneyMax Baker, Senior lecturer, University of SydneyMonique Sheehan, Research officer, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1915152022-10-04T19:04:42Z2022-10-04T19:04:42ZWhy the class action against Optus could be Australia’s biggest<figure><img src="https://images.theconversation.com/files/487470/original/file-20220930-23-gwzr37.jpg?ixlib=rb-1.1.0&rect=9%2C1070%2C6253%2C3239&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>With the Optus data breach exposing almost 10 million current and former customers to identity theft, law firms are circling for what could end up being the biggest – and most valuable – class action case in Australian legal history. </p>
<p>A settlement could well be worth billions, eclipsing the current record of <a href="https://www.abc.net.au/news/2014-07-15/black-saturday-bushfire-survivors-secure-record-payout/5597062">$494 million</a> paid to 10,000 victims of Victoria’s 2009 Black Saturday bushfires.</p>
<p>Two class-action specialists, <a href="https://www.lawyersweekly.com.au/biglaw/35625-maurice-blackburn-investigates-action-against-optus">Maurice Blackburn</a> and <a href="https://www.slatergordon.com.au/class-actions/current-class-actions/optus-data-breach">Slater & Gordon</a>, are considering suing, and it’s possible others will follow. (Maurice Blackburn also has another case against Optus on its books over a 2019 data breach involving 50,000 customers.)</p>
<p>To proceed they’ll need to sign up at least seven people – one of whom acts as the “representative” or lead plaintiff. This shouldn’t be hard. They’ll then need to file a statement of claim for financial, economic or other loss. </p>
<p>Multiple class actions are possible if those claims pursue different issues. Or the firms could work together, as they have in the past.</p>
<h2>Things to know about class actions</h2>
<p>There have been about 700 class actions in Australia in the past 30 years. Class actions can be pursued through state or federal courts. Most go to the Federal Court, which has been empowered to hear class actions since 1992. </p>
<p>Less <a href="https://www.alrc.gov.au/wp-content/uploads/2019/08/alrc_report_134_webaccess_2.pdf">than 5%</a> of Federal Court actions have progressed to a judgement. About 60% have ended in a court-approved settlement, with the balance dismissed or discontinued.</p>
<hr>
<p><iframe id="GDSCN" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/GDSCN/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<p>The most common type of class action is by shareholders for loss of earnings. These account for about a third of Federal Court class actions. </p>
<p>The biggest shareholder settlement so far is $200 million, paid by Centro Property Group to almost 6,000 shareholders in 2012 over misleading and deceptive conduct by Centro’s board. This followed the Australian Securities and Investments Commission <a href="https://www.smh.com.au/business/asic-wins-case-against-centro-directors-20110627-1gmk5.html">successfully prosecuting</a> Centro (also in the Federal Court). </p>
<p>Class actions account for less than 1% of claims lodged with the Federal Court, but their scale and complexity means they take a disproportionate amount of court time, as well as media attention. </p>
<p>Because of their cost, many class actions are funded by third parties as a type of business venture. This enables the law firms running the action to sign up plaintiffs on a “no win, no fee” basis. The litigation funder then takes a share of the settlement (as does the law firm for its legal fees). </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/regulations-needed-for-litigation-funders-who-cant-pay-out-when-cases-fail-72502">Regulations needed for litigation funders who can't pay out when cases fail</a>
</strong>
</em>
</p>
<hr>
<p>According to <a href="https://www.alrc.gov.au/wp-content/uploads/2019/08/alrc_report_134_webaccess_2.pdf">Australian Law Reform Commission</a> data for settled cases, the median percentage of any settlement going to plaintiffs is 57%, with law firms taking 17% and funders taking 22%. </p>
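<p>A minimal sketch of how those median figures play out: the $500 million settlement amount below is purely an invented assumption, and because the three percentages are medians of separate distributions, they need not sum to exactly 100%.</p>

```python
# Divide a hypothetical settlement using the ALRC's median figures:
# 57% to plaintiffs, 17% to the law firm, 22% to the litigation funder.
# (Medians of separate distributions, so the shares sum to 96%, not 100%.)
ALRC_MEDIAN_SHARES = {"plaintiffs": 0.57, "law_firm": 0.17, "funder": 0.22}

def split_settlement(total: float) -> dict:
    """Return each party's share of a settlement, rounded to the cent."""
    return {party: round(total * share, 2)
            for party, share in ALRC_MEDIAN_SHARES.items()}

# Purely illustrative $500 million settlement
print(split_settlement(500_000_000))
```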
<h2>What would a class action against Optus involve?</h2>
<p>Based on what is currently known, there are two main ways a class action (or class actions) could proceed against Optus. </p>
<p>First, it could argue negligence, with the scope of liability outlined in state or territory legislation. Second, it could argue breach of privacy, in contravention of the federal <a href="https://www.legislation.gov.au/Details/C2014C00076">Privacy Act</a>, in the Federal Court.</p>
<p>To succeed in negligence, a court would have to find Optus had a duty of care to its customers to protect their personal information, that it breached its duty, and that customers suffered damage or loss.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-not-to-tell-customers-their-data-is-at-risk-the-optus-approach-191258">How not to tell customers their data is at risk: the Optus approach</a>
</strong>
</em>
</p>
<hr>
<p>To succeed on a breach of privacy, the Federal Court would have to find that personal information held by Optus was subject to unauthorised access or disclosure, or lost, and that the company failed to comply with the “privacy principles” enshrined in the Privacy Act.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/optus-says-it-needed-to-keep-identity-data-for-six-years-but-did-it-really-191498">Optus says it needed to keep identity data for six years. But did it really?</a>
</strong>
</em>
</p>
<hr>
<p>A second basis for a class action in the Federal Court could be to argue a breach of the <a href="https://www.legislation.gov.au/Details/C2018C00385">Telecommunications Act</a>. This legislation says carriers and carriage service providers “must do their best” to protect telecommunications networks and facilities from unauthorised interference or unauthorised access. </p>
<h2>What are the precedents?</h2>
<p>The closest precedent in Australia to a successful class action for a mass breach of privacy is a 2019 case in the NSW Supreme Court. This involved a claim by 108 NSW ambulance service employees against the NSW Health Department.</p>
<p>The employees, represented by the firm <a href="https://www.centenniallawyers.com.au/nsw-ambulance-class-action/">Centennial Lawyers</a>, had their personnel files sold to a personal injury law firm by a contractor (who was convicted of unlawfully disclosing information and carried out community service for the crime).</p>
<p>The court ordered NSW Health to pay the sum of <a href="http://www8.austlii.edu.au.ezproxy.newcastle.edu.au/cgi-bin/viewdoc/au/cases/nsw/NSWSC/2019/1781.html">$275,000 in compensation</a> – $10,000 for the lead plaintiff and about $2,400 for the others. </p>
<h2>How much could the Optus case be worth?</h2>
<p>Given the Optus data leak is established, there’s a strong basis to believe a class action would be successful.</p>
<p>If so, a court could award compensatory damages for the time and cost of replacing identification documents, as well as exemplary (or punitive) damages, to send a message to corporations handling citizens’ private information. </p>
<p>In determining damages, a court will take into account what efforts Optus has made to remedy the leak, mitigate the potential impact on those affected and pay for the costs of replacing drivers’ licences, Medicare cards or passports. </p>
<p>Though the economic loss per customer may be relatively small, when multiplied across the potential class-action pool – up to 10 million plaintiffs – compensatory damages could easily reach billions of dollars, even without exemplary damages.</p>
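<p>The scale effect is simple arithmetic. In the back-of-envelope sketch below, the $250 per-customer figure is an invented assumption for illustration only; the 10 million pool size comes from the article.</p>

```python
# Back-of-envelope: even a modest per-customer loss, multiplied across a
# class-action pool of up to 10 million plaintiffs, reaches billions.
POOL_SIZE = 10_000_000       # potential Optus class size (from the article)
PER_CUSTOMER_COST = 250      # hypothetical: replacing IDs, time spent, etc.

total_compensatory = POOL_SIZE * PER_CUSTOMER_COST
print(f"${total_compensatory:,}")  # $2,500,000,000
```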
<p>That makes this a hugely attractive prospect for a law firm or class-action funder.</p><img src="https://counter.theconversation.com/content/191515/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>With up to 10 million plaintiffs, a successful class action against Optus over its identify data breach could easily be worth billions of dollars.Mirella Atherton, Lecturer in Law, University of NewcastleEliezer Sanchez-Lasaballett, Lecturer, University of NewcastleLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1907582022-09-20T20:19:45Z2022-09-20T20:19:45ZThis law makes it illegal for companies to collect third-party data to profile you. But they do anyway<figure><img src="https://images.theconversation.com/files/485463/original/file-20220920-875-n1syu1.jpeg?ixlib=rb-1.1.0&rect=57%2C24%2C5406%2C3612&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Unsplash</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>A little-known provision of the Privacy Act makes it illegal for many companies in Australia to buy or exchange consumers’ personal data for profiling or targeting purposes. It’s almost never enforced. In a <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4224653">research paper</a> published today, I argue that needs to change. </p>
<p>“Data enrichment” is the intrusive practice of companies going behind our backs to “fill in the gaps” of the information we provide. </p>
<p>When you purchase a product or service from a company, fill out an online form, or sign up for a newsletter, you might provide only the necessary data such as your name, email, delivery address and/or payment information.</p>
<p>That company may then turn to other retailers or <a href="https://www.oracle.com/au/cx/advertising/data-enrichment-measurement/#data-enrichment">data brokers</a> to purchase or exchange extra data about you. This could include your age, family, health, habits and more. </p>
<p>This allows them to build a more detailed individual profile on you, which helps them predict your behaviour and more precisely target you with ads. </p>
<p>For almost ten years, there has been a law in Australia that makes this kind of data enrichment illegal if a company can “reasonably and practicably” request that information directly from the consumer. And at least <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/consultation/view_respondent?_b_index=60&uuId=926016195">one major data broker</a> has asked the government to “remove” this law. </p>
<p>The burning question is: why is there not a single published case of this law being enforced against companies “enriching” customer data for profiling and targeting purposes? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<h2>Data collection ‘only from the individual’</h2>
<p>The relevant law is Australian Privacy Principle 3.6 and is part of the federal <a href="https://www.legislation.gov.au/Details/C2022C00199">Privacy Act</a>. It applies to most organisations that operate businesses with annual revenues higher than A$3 million, and smaller data businesses. </p>
<p>The law says such organisations:</p>
<blockquote>
<p>must collect personal information about an individual only from the individual […] unless it is unreasonable or impracticable to do so.</p>
</blockquote>
<p>This “direct collection rule” protects individuals’ privacy by allowing them some control over information collected about them, and avoiding a combination of data sources that could reveal sensitive information about their vulnerabilities. </p>
<p>But this rule has received almost no attention. There’s only one published determination of the federal privacy regulator on it, and that was against the <a href="https://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/AICmr/2020/69.html">Australian Defence Force</a> in a different context.</p>
<p>According to Australian Privacy Principle 3.6, it’s only legal for an organisation to collect personal information from a third party if it would be “unreasonable or impracticable” to collect that information from the individual alone. </p>
<p>This exception was intended to apply to <a href="https://www.oaic.gov.au/privacy/australian-privacy-principles-guidelines/chapter-3-app-3-collection-of-solicited-personal-information#collecting-directly-from-the-individual">limited situations</a>, such as when:</p>
<ul>
<li>the individual is being investigated for some wrongdoing<br></li>
<li>the individual’s address needs to be updated for delivery of legal or official documents. </li>
</ul>
<p>The exception shouldn’t apply simply because a company wants to collect extra information for profiling and targeting, but realises the customer would probably refuse to provide it.</p>
<h2>Who’s bypassing customers for third-party data?</h2>
<p>Aside from data brokers, companies also exchange information with each other about their respective customers to get extra information on customers’ lives. This is often referred to as “data matching” or “data partnerships”.</p>
<p>Companies tend to be very vague about who they share information with, and who they get information from. So we don’t know for certain who’s buying data-enrichment services from data brokers, or “matching” customer data. </p>
<p>Major companies such as <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=202075050&ref_=footer_iba">Amazon Australia</a>, <a href="https://www.ebay.com.au/help/policies/member-behaviour-policies/user-privacy-notice-privacy-policy?id=4260&mkevt=1&mkcid=1&mkrid=705-53470-19255-0&campid=5337590774&customid=&toolid=10001#section4">eBay Australia</a>, <a href="https://www.facebook.com/privacy/policy/?subpage=1.subpage.4-InformationFromPartnersVendors">Meta</a> (Facebook), <a href="https://www.viacomcbsprivacy.com/en/policy">10Play Viacom</a> and <a href="https://twitter.com/en/privacy#twitter-privacy-1">Twitter</a> include terms in the fine print of their privacy policies that state they collect personal information from third parties, including demographic details and/or interests.</p>
<p><a href="https://policies.google.com/privacy?hl=en-US#infocollect">Google</a>, <a href="https://preferences.news.com.au/privacy">News Corp</a>, <a href="https://www.sevenwestmedia.com.au/privacy-policies/privacy">Seven</a>, <a href="https://login.nine.com.au/privacy?client_id=smh">Nine</a> and others also say they collect personal information from third parties, but are more vague about the nature of that information.</p>
<p>These privacy policies don’t explain why it would be unreasonable or impracticable to collect that information directly from customers. </p>
<h2>Consumer ‘consent’ is not an exception</h2>
<p>Some companies may try to justify going behind customers’ backs to collect data because there’s an obscure term in their privacy policy that mentions they collect personal information from third parties. Or because the company <em>disclosing</em> the data has a privacy policy term about sharing data with “trusted data partners”.</p>
<p>But even if this amounts to consumer “consent” under the relatively weak standards for consent in our current privacy law, this is not an exception to the direct collection rule. </p>
<p>The law allows a “consent” exception for government agencies under a separate part of the direct collection rule, but <em>not</em> for private organisations. </p>
<h2>Data enrichment involves personal information</h2>
<p>Many companies with third-party data collection terms in their privacy policies acknowledge this is personal information. But some may argue the collected data isn’t “personal information” under the Privacy Act, so the direct collection rule doesn’t apply.</p>
<p>Companies often exchange information about an individual without using the individual’s legal name or email. Instead they may use a unique advertising identifier for that individual, or <a href="https://help.abc.net.au/hc/en-us/articles/4402890310671">“hash” the email address</a> to turn it into a unique string of numbers and letters. </p>
<p>They essentially allocate a “code name” to the consumer. So the companies can exchange information that can be linked to the individual, yet say this information wasn’t connected to their actual name or email. </p>
<p>However, this information should still be treated as personal information because it can be linked back to the individual when combined with other <a href="https://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/FCAFC/2017/4.html">information about them</a>. </p>
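<p>The “code name” mechanism described above can be sketched in a few lines. This is a minimal illustration, assuming companies normalise and SHA-256-hash email addresses (one common approach); the email address, attributes and company records are invented for the example.</p>

```python
# Two companies that hash the same email the same way derive the same
# identifier, so their records stay linkable to one individual even though
# no plain-text email is exchanged between them.
import hashlib

def hashed_id(email: str) -> str:
    """Derive a stable pseudonymous identifier from an email address."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Invented example records held by two different companies
company_a = {hashed_id("jane@example.com"): {"purchases": ["shoes"]}}
company_b = {hashed_id(" Jane@Example.com "): {"age_bracket": "25-34"}}

# Same person, same hash: the profiles can be merged without either
# company ever handling the other's plain-text email list.
for key in company_a.keys() & company_b.keys():
    merged = {**company_a[key], **company_b[key]}
    print(key[:12], merged)
```

Because the identifier is deterministic, it still singles out one individual, which is why such “hashed” data can be linked back to a person and should be treated as personal information.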
<h2>At least one major data broker is against it</h2>
<p>Data broker <a href="https://www.experian.com.au/business/solutions/audience-targeting/digital-solutions-sell-side/digital-audiences-ss">Experian Australia</a> has asked the government to “remove” Australian Privacy Principle 3.6 “altogether”. In its <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/consultation/view_respondent?_b_index=60&uuId=926016195">submission</a> to the Privacy Act Review in January, Experian argued:</p>
<blockquote>
<p>It is outdated and does not fit well with modern data uses.</p>
</blockquote>
<p>Others who profit from data enrichment or data matching would probably agree, but prefer to let sleeping dogs lie.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A screenshot shows six different categories of consumer data offered by Experian." src="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=369&fit=crop&dpr=1 600w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=369&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=369&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=463&fit=crop&dpr=1 754w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=463&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/485485/original/file-20220920-14-p8l88p.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=463&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">On its website, Experian claims to offer a ‘combination of demographic, geographic, financial and market research data - both online and offline’.</span>
<span class="attribution"><span class="source">Screenshot/Experian</span></span>
</figcaption>
</figure>
<p>Experian argued the law favours large companies with direct access to lots of customers and opportunities to pool data collected from across their own corporate group. It said companies with access to fewer consumers and less data would be disadvantaged if they can’t purchase data from brokers. </p>
<p>But the fact that some digital platforms impose extensive personal data collection on customers supports the case for stronger privacy laws. It doesn’t mean there should be a data free-for-all. </p>
<h2>Our privacy regulator should take action</h2>
<p>It has been three years since the consumer watchdog recommended <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">major reforms</a> to our privacy laws to reduce the disadvantages consumers suffer from invasive data practices. These reforms are probably still years away, if they eventuate at all.</p>
<p>The direct collection rule is a very rare thing. It is an existing Australian privacy law that favours consumers. The privacy regulator should prioritise the enforcement of this law for the benefit of consumers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/amazon-just-took-over-a-primary-healthcare-company-for-a-lot-of-money-should-we-be-worried-187627">Amazon just took over a primary healthcare company for a lot of money. Should we be worried?</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/190758/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.</span></em></p>The terms of the Australian Privacy Principle 3.6 are quite clear. So why is there not a single published case of this law being enforced?Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1882792022-08-23T18:27:55Z2022-08-23T18:27:55ZA new US data privacy bill aims to give you more control over information collected about you – and make businesses change how they handle data<figure><img src="https://images.theconversation.com/files/480484/original/file-20220822-88277-9t0pw.jpg?ixlib=rb-1.1.0&rect=53%2C0%2C6000%2C3736&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The U.S. could soon catch up to the European Union in protecting people's data privacy.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/cybersecurity-data-security-and-data-access-must-be-royalty-free-image/1366362135">Teera Konakan/Moment via Getty Images</a></span></figcaption></figure><p>Data privacy in the U.S. is, in many ways, a legal void. While there are limited protections for health and financial data, the cradle of the world’s largest tech companies, like Apple, Amazon, Google, and Meta (Facebook), <a href="https://www.dli.tech.cornell.edu/post/us-data-privacy-law-federal-and-state-legislation-impact-and-risk-mitigation">lacks any comprehensive federal data privacy law</a>. This leaves U.S. 
citizens with minimal <a href="https://www.nytimes.com/2019/06/08/opinion/sunday/privacy-congress-facebook-google.html">data privacy</a> protections <a href="https://scholarship.law.edu/cgi/viewcontent.cgi?article=1061&context=jlt">compared with citizens of other nations</a>. But that may be about to change. </p>
<p>With rare <a href="https://www.jdsupra.com/legalnews/bipartisan-u-s-federal-privacy-bill-9169312/">bipartisan support</a>, the <a href="https://www.congress.gov/bill/117th-congress/house-bill/8152/actions">American Data and Privacy Protection Act</a> moved out of the U.S. House of Representatives Committee on Energy and Commerce <a href="https://www.natlawreview.com/article/house-committee-passes-comprehensive-federal-privacy-legislation">by a vote of 53-2</a> on July 20, 2022. The bill still needs to pass the full House and the Senate, and <a href="https://subscriber.politicopro.com/article/2022/06/lawmakers-reach-bipartisan-compromise-on-privacy-bill-with-preemption-right-to-sue-00036563">negotiations are ongoing</a>. Given the Biden administration’s <a href="https://www.csoonline.com/article/3664175/u-s-data-privacy-and-security-solutions-emerging-at-the-federal-level.html">responsible data practices strategy</a>, White House support is likely if a version of the bill passes.</p>
<p>As a legal scholar and attorney who <a href="https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=2643050">studies and practices technology and data privacy law</a>, I’ve been closely following the act, known as ADPPA. If passed, it will fundamentally alter U.S. data privacy law. </p>
<p>ADPPA fills the data privacy void, builds in federal preemption over some state data privacy laws, allows individuals to file suit over violations and substantially changes data privacy law enforcement. Like all big changes, ADPPA is getting mixed reviews from <a href="https://www.wired.com/story/american-data-privacy-protection-act-adppa/">media</a>, <a href="https://truthonthemarket.com/2022/06/22/adppa-mimics-gdprs-flaws-and-goes-further-still/">scholars</a> and <a href="https://www.cnbc.com/2022/06/09/bipartisan-privacy-proposal-is-unworkable-chamber-of-commerce-says.html">businesses</a>. But many see the bill as a triumph for U.S. data privacy that provides a needed national standard for data practices.</p>
<h2>Who and what will ADPPA regulate?</h2>
<p>ADPPA would apply to “covered” entities, meaning any entity collecting, processing or transferring covered data, including nonprofits and sole proprietors. It also regulates cellphone and internet providers and other <a href="https://www.law.cornell.edu/uscode/text/47/153">common carriers</a>, with <a href="https://iapp.org/news/a/advocates-concerned-with-telecom-data-oversight-in-proposed-adppa/">potentially concerning changes to federal communications regulation</a>. It does not apply to government entities.</p>
<p>ADPPA defines “covered” data as any information that identifies, or can reasonably be linked to, a person or device. It also protects biometric data, genetic data and geolocation information.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a city street view with a young woman looking down at her phone in focus while passersby are out of focus" src="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/480483/original/file-20220822-86766-bc8uno.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Protected data includes your location.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/woman-with-smartphone-royalty-free-image/657929972">Christoph Hetzmannseder/Moment via Getty Images</a></span>
</figcaption>
</figure>
<p>The bill excludes three big data categories: deidentified data, employee data and publicly available information. That last category includes social media accounts with privacy settings open to public viewing. While <a href="https://georgetownlawtechreview.org/re-identification-of-anonymized-data/GLTR-04-2017/">research</a> has repeatedly shown <a href="https://www.theregister.com/2021/09/16/anonymising_data_feature/">deidentified data can be easily reidentified</a>, the ADPPA attempts to address that by requiring covered entities to take “reasonable technical, administrative, and physical measures to ensure that the information cannot, at any point, be used to re-identify any individual or device.”</p>
<h2>How ADPPA protects your data</h2>
<p>The act would require data collection to be as minimal as possible. The bill allows covered entities to collect, use or share an individual’s data only when reasonably necessary and proportionate to a product or service the person requests or to respond to a communication the person initiates. It allows collection for authentication, security incidents, prevention of illegal activities or serious harm to persons, and compliance with legal obligations.</p>
<p>People would gain rights to access and have some control over their data. ADPPA gives users the right to correct inaccuracies and potentially delete their data held by covered entities.</p>
<p>The bill permits data collection as part of research for public good. It allows data collection for peer-reviewed research or research done in the public interest – for example, testing whether a website is unlawfully discriminating. This is important for researchers who might otherwise run afoul of site terms or hacking laws.</p>
<p>The ADPPA also has a provision that <a href="https://www.wired.com/story/american-data-privacy-protection-act-adppa/">tackles the service-conditioned-on-consent problem</a> – those annoying “I Agree” boxes that force people to accept a jumble of legal terms. When you click one of those boxes, you contractually waive your privacy rights as a condition of simply using a service, visiting a website or buying a product. The bill would prevent covered entities from using contract law to get around its protections.</p>
<h2>Looking to federal electronic surveillance law for guidance</h2>
<p>The U.S.’s <a href="https://www.law.cornell.edu/uscode/text/18/part-I/chapter-119">Electronic Communications Privacy Act</a> can provide federal lawmakers guidance in finalizing ADPPA. Like the ADPPA, the 1986 ECPA legislation involved a massive overhaul of U.S. electronic privacy law to address adverse effects on individual privacy and civil liberties posed by advancing surveillance and communication technologies. Once again, advances in surveillance and data technologies, such as artificial intelligence, are significantly affecting citizens’ rights.</p>
<p>ECPA, still in effect today, provides a baseline national standard for electronic surveillance protections. ECPA protects communications from interception unless one party to the communication consents. But ECPA does not preempt states from passing more protective laws, so states can choose to provide greater privacy rights. The end result: Roughly a quarter of U.S. states require consent of all parties to intercept a communication, thus providing their citizens increased privacy rights.</p>
<p>ECPA’s federal/state balance has worked for decades now, and ECPA has not overwhelmed the courts or destroyed commerce. </p>
<h2>National preemption</h2>
<p>As drafted, ADPPA preempts some state data privacy legislation. This affects <a href="https://oag.ca.gov/privacy/ccpa">California’s Consumer Privacy Act</a>, although it does not preempt the <a href="https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57">Illinois Biometric Information Privacy Act</a> or state laws specifically regulating facial recognition technology. The preemption provisions, however, are in flux as members of the House continue to negotiate the bill.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/S8D7I-FGKOM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The federal bill could end up preempting parts of California’s tougher state data privacy law.</span></figcaption>
</figure>
<p>ADPPA’s national standards provide uniform compliance requirements, serving economic efficiency, but its preemption of most state laws has <a href="https://teachprivacy.com/a-faustian-bargain-is-preemption-too-high-a-price-for-a-federal-privacy-law/">some scholars concerned</a>, and <a href="https://www.natlawreview.com/article/california-privacy-protection-agency-holds-public-meeting-to-formally-oppose-federal">California opposes its passage</a>. </p>
<p>If preemption stands, any final version of the ADPPA will be the law of the land, limiting states from more firmly protecting their citizens’ data privacy.</p>
<h2>Private right of action and enforcement</h2>
<p>ADPPA provides for a <a href="https://crsreports.congress.gov/product/pdf/LSB/LSB10776">private right of action</a>, allowing people to sue covered entities that violate their rights under ADPPA. That gives the bill’s enforcement mechanisms a big boost, although it has significant restrictions.</p>
<p>The <a href="https://www.cnbc.com/2022/06/09/bipartisan-privacy-proposal-is-unworkable-chamber-of-commerce-says.html">U.S. Chamber of Commerce</a> and the tech industry oppose a private right of action, preferring ADPPA enforcement be restricted to the Federal Trade Commission. But the FTC has a far smaller staff and far fewer resources than U.S. trial attorneys do.</p>
<p>ECPA, for comparison, has a private right of action. It has not overwhelmed courts or businesses, and entities likely comply with ECPA to avoid civil litigation. Plus, courts have honed ECPA’s terms, providing clear precedent and understandable compliance guidelines. </p>
<h2>How big are the changes?</h2>
<p>The changes to U.S. data privacy law are big, but ADPPA affords much-needed security and data protections to U.S. citizens, and I believe that it is workable with tweaks. </p>
<p>Given how the internet works, data routinely flows across international borders, so many U.S. companies have already built compliance with other nations’ laws into their systems. This includes the <a href="https://gdpr-info.eu/">E.U.’s General Data Protection Regulation</a> – a law similar to the ADPPA. Facebook, for example, provides E.U. citizens with GDPR’s protections, but it does not give U.S. citizens those protections, because it is not required to do so.</p>
<p>Congress has done little with data privacy, but ADPPA is poised to change that.</p><img src="https://counter.theconversation.com/content/188279/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Anne Toomey McKenna is affiliated faculty with Penn State University's Institute for Computational and Data Sciences, a Visiting Law Professor at University of Richmond's Law School, and she co-chairs IEEE-USA's AI Policy Subcommittee on Privacy, Equity, and Justice in AI. The views expressed herein are the author's own.</span></em></p>Data collection is big business in the US, but a bipartisan data privacy bill rapidly moving through Congress promises to affect the information websites, social media platforms and all other businesses collect.Anne Toomey McKenna, Visiting Professor of Law, University of RichmondLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1872742022-07-20T20:08:52Z2022-07-20T20:08:52ZWhat do TikTok, Bunnings, eBay and Netflix have in common? They’re all hyper-collectors<figure><img src="https://images.theconversation.com/files/474987/original/file-20220719-6978-2qdmfk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>You walk into a shopping centre to buy some groceries. Without your knowledge, an electronic scan of your face is taken by in-store surveillance cameras and stored in an online database. Each time you return to that store, your “faceprint” is compared with those of people wanted for shoplifting or violence.</p>
<p>This might sound like science fiction but it’s the reality for many of us. By failing to take our digital privacy seriously – as former human rights commissioner Ed Santow has warned – Australia is “<a href="https://www.theage.com.au/national/we-must-not-sleepwalk-into-mass-surveillance-20220630-p5ay0q.html">sleepwalking</a>” its way into mass surveillance.</p>
<h2>Privacy and the digital environment</h2>
<p>Of course, companies have been collecting personal information for decades. If you’ve ever signed up to a loyalty program like FlyBuys then you’ve performed what marketing agencies call a “<a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">value exchange</a>”. In return for benefits from the company (like discounted prices or special offers), you’ve handed over details of who you are, what you buy, and how often you buy it.</p>
<p>Consumer data is big business. In 2019, a <a href="https://www.webfx.com/blog/internet/what-are-data-brokers-and-what-is-your-data-worth-infographic/">report</a> from digital marketers WebFX showed that data from around 1,400 loyalty programs was routinely being traded across the globe as part of an industry <a href="https://clearcode.cc/blog/what-is-data-broker/">worth around US$200 billion</a>. That same year, the Australian Competition and Consumer Commission’s <a href="https://www.accc.gov.au/publications/customer-loyalty-schemes-final-report">review of loyalty schemes</a> revealed how many of these loyalty schemes lacked data transparency and even discriminated against vulnerable customers.</p>
<p>But the digital environment is making data collection even easier. When you <a href="https://onlinemasters.ohio.edu/blog/netflix-data/">watch Netflix</a>, for example, the company knows what you watch, when you watch it, and how long you watch it for. But they go further, also <a href="https://seleritysas.com/blog/2019/04/05/how-netflix-used-big-data-and-analytics-to-generate-billions/">capturing data</a> on which scenes or episodes you watch repeatedly, the ratings of your content, the number of searches you perform and what you search for.</p>
<h2>Hyper-collection: a new challenge to privacy</h2>
<p>Late last year, the controversial tech company ClearView AI was <a href="https://www.oaic.gov.au/updates/news-and-media/clearview-ai-breached-australians-privacy">ordered</a> by the Australian information commissioner to stop “scraping” social media for the pictures it was collecting in its massive facial recognition database. Just this month, the commissioner was investigating several retailers for <a href="https://www.abc.net.au/news/2022-07-13/bunnings-kmart-investigated-over-facial-recognition-technology/101233372">creating facial profiles</a> of the customers in their stores.</p>
<p>This new phenomenon – “hyper-collection” – represents a growing trend by large companies to collect, sort, analyse and use more information than they need, usually in covert or passive ways. In many cases, hyper-collection is not supported by a truly legitimate commercial or legal purpose.</p>
<h2>Digital privacy laws and hyper-collection</h2>
<p>Hyper-collection is a major problem in Australia for three reasons.</p>
<p>First, Australia’s privacy law wasn’t prepared for the likes of Netflix and TikTok. Despite <a href="https://www.oaic.gov.au/privacy/the-privacy-act/history-of-the-privacy-act">numerous amendments</a>, the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">Privacy Act</a> dates back to the late 1980s. Although former Attorney-General Christian Porter <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">announced a review</a> of the Act in late 2019, it has been held up by the recent change of government.</p>
<p>Second, Australian privacy laws are unlikely on their own to threaten the profit base of foreign companies, especially those located in China. The Information Commissioner has the power to order companies to take certain actions – like it <a href="https://www.afr.com/policy/foreign-affairs/australia-s-tiktok-data-vulnerable-to-access-by-china-staff-20220712-p5b10f">did with Uber in 2021</a> – and can enforce these through court orders. But the penalties aren’t really big enough to discourage companies with profits in the billions of dollars.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/83-of-australians-want-tougher-privacy-laws-nows-your-chance-to-tell-the-government-what-you-want-149535">83% of Australians want tougher privacy laws. Now’s your chance to tell the government what you want</a>
</strong>
</em>
</p>
<hr>
<p>Third, hyper-collection is often enabled by the vague consents we give to get access to the services these companies provide. Bunnings, for example, argued that its collection of your faceprint was allowed because <a href="https://ia.acs.org.au/article/2022/bunnings-doubles-down-on-facial-recognition.html">signs at the entry to their stores</a> told customers facial recognition might be used. Online marketplaces like eBay, Amazon, Kogan and Catch, meanwhile, supply “<a href="https://www.accc.gov.au/media-release/concerning-issues-for-consumers-and-sellers-on-online-marketplaces">bundled consents</a>” – basically, you have to consent to their privacy policies as a condition of using their services. No consent, no access.</p>
<h2>TikTok and hyper-collection</h2>
<p>TikTok (owned by Chinese company ByteDance) has largely replaced YouTube as a way of creating and sharing online videos. The app is powered by an algorithm that has already drawn <a href="https://theconversation.com/tiktoks-secret-algorithm-is-its-greatest-strength-and-could-also-be-its-undoing-176605">criticism</a> for routinely collecting data about users, as has ByteDance’s secretive approach to <a href="https://www.lowyinstitute.org/the-interpreter/unique-power-tiktok-s-algorithm">content moderation and censorship</a>.</p>
<p>For years, TikTok executives have been telling governments that <a href="https://www.aspistrategist.org.au/its-time-tiktok-australia-came-clean/">data isn’t stored in servers on the Chinese mainland</a>. But these promises might be hollow in the wake of recent allegations.</p>
<p>Cybersecurity experts now claim that not only does the TikTok app <a href="https://www.smartcompany.com.au/technology/tiktok-chinese-servers-aussie-cybersecurity/">routinely connect to Chinese servers</a>, but that users’ data is accessible by ByteDance employees, including the mysterious Beijing-based “Master Admin”, which has <a href="https://www.buzzfeednews.com/article/emilybakerwhite/tiktok-tapes-us-user-data-china-bytedance-access">access to every user’s personal information</a>.</p>
<p>Then, just this week, it was alleged the TikTok app can also access <a href="https://www.abc.net.au/news/2022-07-18/tiktok-users-warned-the-platform-is-harvesting-personal-data/13977370">almost all the data</a> contained on the phone it is installed on – including photos, calendars and emails.</p>
<p>Under China’s national security laws, the government can order tech companies to <a href="https://www.sbs.com.au/news/article/so-what-if-china-can-access-your-tiktok-data/mr1anx97k">pass on that information</a> to police or intelligence agencies.</p>
<h2>What options do we have?</h2>
<p>Unlike a physical store, we don’t get a lot of choice about consenting to digital companies’ privacy policies and how they collect our information.</p>
<p>One option – supported by encryption expert Vanessa Teague at ANU – is for consumers simply to delete offending apps until their creators are <a href="https://www.sbs.com.au/news/article/so-what-if-china-can-access-your-tiktok-data/mr1anx97k">willing to submit to greater data transparency</a>. Of course, this means locking ourselves out of those services, and it will only have a real impact on these companies if enough Australians join in.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facial-recognition-is-on-the-rise-but-the-law-is-lagging-a-long-way-behind-185510">Facial recognition is on the rise – but the law is lagging a long way behind</a>
</strong>
</em>
</p>
<hr>
<p>Another option is “opting-out” of intrusive data collection. We’ve done this before – when My Health records became mandatory in 2019, a record number of us <a href="https://www.yourlifechoices.com.au/health/my-health-record-an-expensive-white-elephant-critics-say/">opted out</a>. Though these opt-outs reduced the usefulness of that <a href="https://www.theguardian.com/commentisfree/2018/jul/20/there-is-no-social-license-for-my-health-record-australians-should-reject-it">digital health record program</a>, they did demonstrate that Australians can take their data privacy seriously. </p>
<p>But how exactly can Australians opt-out of a massive social app like TikTok? Right now, they can’t – perhaps the government needs to explore a solution as part of its review.</p>
<p>A further option being explored by the Privacy Act review is whether to create new laws that would allow individuals to <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review-terms-of-reference.pdf">sue companies for damages for breaches of privacy</a>. While lawsuits are expensive and time-consuming, they might just deliver the kind of financial damage to big companies that could change their behaviour.</p>
<p>No matter which option we take, Australians need to start getting more savvy with their data privacy. This might just mean actually reading those terms and conditions before agreeing, and being prepared to “vote with our feet” if companies won’t be honest about what they’re doing with our personal information.</p><img src="https://counter.theconversation.com/content/187274/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Brendan Walker-Munro receives funding from the Australian Government through Trusted Autonomous Systems, a Defence Cooperative Research Centre funded through the Next Generation Technologies Fund. </span></em></p>Australians – and Australian governments – need to get more savvy about data privacyBrendan Walker-Munro, Senior Research Fellow, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1707112021-10-27T04:22:07Z2021-10-27T04:22:07ZA new proposed privacy code promises tough rules and $10 million penalties for tech giants<figure><img src="https://images.theconversation.com/files/428675/original/file-20211027-21-chefvu.jpeg?ixlib=rb-1.1.0&rect=5%2C2%2C1991%2C1353&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>This week the federal government <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/">announced</a> proposed legislation to develop an online privacy code (or “OP Code”) setting tougher privacy standards for Facebook, Google, Amazon and many other online platforms. </p>
<p>These companies collect and use vast amounts of consumers’ personal data, much of it without their knowledge or real consent, and the code is intended to guard against privacy harms from these practices.</p>
<p>The higher standards would be backed by increased penalties for interference with privacy under the Privacy Act and greater enforcement powers for the federal privacy commissioner. Serious or repeated breaches of the code could carry penalties of up to A$10 million or 10% of turnover for companies.</p>
<p>However, relevant companies are likely to try to avoid obligations under the OP Code by drawing out the process for drafting and registering the code. They are also likely to try to exclude themselves from the code’s coverage, and argue about the definition of “personal information”.</p>
<p>The current definition of “personal information” under the Privacy Act does not clearly include technical data such as IP addresses and device identifiers. Updating this will be important to ensure the OP Code is effective.</p>
<h2>Which organisations would be covered and why?</h2>
<p>The code is intended to address some clear online privacy dangers, while we await broader changes from the <a href="https://consultations.ag.gov.au/rights-and-protections/privacy-act-review-discussion-paper/">current broader review of the Privacy Act</a> that would apply across all sectors.</p>
<p>The OP Code would target online platforms that “collect a high volume of personal information or trade in personal information”, including:</p>
<ul>
<li><p>social media networks such as Facebook; dating apps like Bumble; online blogging or forum sites like Reddit; gaming platforms; online messaging and videoconferencing services such as WhatsApp and Zoom</p></li>
<li><p><a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">data brokers</a> that trade in personal information, including Quantium, Acxiom, Experian and Nielsen Corporation</p></li>
<li><p>other large online platforms that collect personal information and have more than 2.5 million annual users in Australia, such as Amazon, Google and Apple.</p></li>
</ul>
<p>The OP Code would impose higher standards for these companies than otherwise apply under the Privacy Act.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-for-third-party-data-brokers-to-emerge-from-the-shadows-94298">It's time for third-party data brokers to emerge from the shadows</a>
</strong>
</em>
</p>
<hr>
<h2>Higher standards for consent - maybe</h2>
<p>The OP Code would set out details about how these organisations must meet obligations under the Privacy Act. This would include higher standards for what constitutes users’ “consent” for how their data are used.</p>
<p>The government’s <a href="https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-explanatory-paper.pdf">explanatory paper</a> says the OP Code would require consent to be “voluntary, informed, unambiguous, specific and current”. (Unfortunately, the draft legislation itself doesn’t actually say that, and will require some amendment to achieve this.)</p>
<p>This description draws on the definition of consent in the European Union’s <a href="https://gdpr.eu/what-is-gdpr/">General Data Protection Regulation</a>.</p>
<p>In the EU, for example, <a href="https://gdpr-info.eu/issues/consent/">“unambiguous” consent</a> means a person must take clear, affirmative action – for instance by ticking a box or clicking a button – to consent to a use of their information. </p>
<p>Consent must also be “specific”, so companies cannot, for example, require consumers to consent to unrelated uses (such as market research) when their data is only needed to process a specific purchase.</p>
<h2>Requests to stop using and disclosing personal information</h2>
<p>The ACCC recommended we should have a right to erase our personal data as a means of reducing the power imbalance between consumers and large platforms. In the EU, the “right to be forgotten” by search engines and the like is part of this erasure right. The government has not adopted this recommendation.</p>
<p>However, the OP Code would include an obligation for organisations to comply with a consumer’s reasonable request to stop using and disclosing their personal data. Companies would be allowed to charge a “non-excessive” fee for fulfilling these requests. This is a very weak version of the EU right to be forgotten.</p>
<p>For example, Amazon currently states in its <a href="https://www.amazon.com.au/gp/help/customer/display.html?nodeId=GX7NJQ4ZB8MHFRNJ#GUID-C3396B35-7018-45C5-999A-5989043DA870__SECTION_C877F3A6113249BF905B04840EFB3496">privacy policy</a> that it uses customers’ personal data in its advertising business and discloses the data to its vast Amazon.com corporate group. The proposed OP Code would mean Amazon would have to stop this, at a customer’s request, unless it had reasonable grounds for refusing.</p>
<p>Ideally, the code should also allow consumers to ask a company to stop <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905693">collecting their personal information from third parties</a>, as companies currently do, to build profiles on them.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-one-simple-rule-change-could-curb-online-retailers-snooping-on-you-166174">How one simple rule change could curb online retailers' snooping on you</a>
</strong>
</em>
</p>
<hr>
<h2>Increased protections for children and vulnerable groups</h2>
<p>The draft bill also includes a vague provision for the OP Code to add protections for kids and other vulnerable people who are not capable of making their own privacy decisions.</p>
<p>A more controversial proposal would require new consents and verification for kids using social media services such as Facebook and WhatsApp. These services would be required to:</p>
<ul>
<li><p>take reasonable steps to verify the age of social media users</p></li>
<li><p>obtain parental consent before collecting, using or disclosing personal information of a child under 16</p></li>
<li><p>ensure their data practices are “fair and reasonable in the circumstances”, with the best interests of the child as the primary consideration.</p></li>
</ul>
<h2>What is ‘personal information’?</h2>
<p>A key tactic companies will likely use to avoid the new rules is to claim that the information they use is not truly “personal”, since the OP Code and the Privacy Act only apply to “personal information”, as defined in the Act. </p>
<p>The companies may claim the data they collect is connected only to an individual device, or to an online identifier they have allocated to us, rather than to our legal name. However, the effect is the same: the data is used to build a more detailed profile of an individual and to affect that individual.</p>
<p>Australia needs to update the definition of “personal information” to clarify it includes data such as IP addresses, device identifiers, location data, and any other online identifiers that may be used to identify an individual or to interact with them on an individual basis. Data should only be de-identified if no individual is identifiable from that data. </p>
<h2>Increased penalties and upgraded enforcement</h2>
<p>The government has pledged to give tougher powers to the privacy commissioner, and to hit companies with tougher penalties for breaching their obligations once the code comes into effect.</p>
<p>The maximum civil penalty for a serious and/or repeated interference with privacy will be increased up to the equivalent penalties in the Australian Consumer Law. </p>
<p>For individuals, the maximum penalty will increase to more than A$500,000. For corporations, the maximum will be the greater of A$10 million, or three times the value of the benefit received from the breach, or (if this value cannot be determined) 10% of the company’s annual turnover.</p>
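The corporate maximum described above is a “greater of” calculation. A minimal sketch of that reading in Python, using entirely hypothetical figures for illustration (this is not legal advice, and the helper function is our own construction, not anything in the legislation):

```python
def max_corporate_penalty(benefit, annual_turnover):
    """Greater of A$10 million, three times the benefit received from
    the breach, or (if that benefit cannot be determined) 10% of the
    company's annual turnover. Illustrative sketch only."""
    base = 10_000_000
    if benefit is not None:
        return max(base, 3 * benefit)
    return max(base, 0.10 * annual_turnover)

# Hypothetical: a breach worth A$8 million to the company
print(max_corporate_penalty(8_000_000, None))        # 24000000
# Hypothetical: benefit unknown, annual turnover A$2 billion
print(max_corporate_penalty(None, 2_000_000_000))    # 200000000.0
```

In both hypothetical cases the formula exceeds the A$10 million floor, which only binds when the breach was small relative to the company.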
<p>The privacy commissioner could also issue infringement notices for failing to provide relevant information to an investigation. The maximum penalty will be A$2,644 for individuals or A$13,320 for companies.</p>
<p>Such civil penalty provisions will make it unnecessary for the commissioner to resort to prosecution of a criminal offence, or to civil litigation, in these cases. </p>
<h2>Don’t hold your breath</h2>
<p>Once legislation is passed, it will take around 12 months for the code to be developed and registered.</p>
<p>The tech giants will have plenty of opportunity to delay this process. Companies are likely to challenge the content of the code, and whether they should even be covered by it at all.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p><p class="fine-print"><em><span>Graham Greenleaf is a board member of the NGO, the Australian Privacy Foundation.</span></em></p>A proposed online privacy code would give consumers more control over how tech companies collect and use their dataKatharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW, UNSW SydneyGraham Greenleaf, Professor of Law and Information Systems, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1287752019-12-12T09:43:24Z2019-12-12T09:43:24ZThe federal government’s response to the ACCC’s Digital Platforms Inquiry is a let down<figure><img src="https://images.theconversation.com/files/306565/original/file-20191212-85386-1dmwhpo.jpg?ixlib=rb-1.1.0&rect=51%2C92%2C6818%2C4325&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Of the 23 recommendations made in the ACCC's final report, the government supported six in their entirety, ten "in principle", "noted" five and rejected two. </span> <span class="attribution"><span class="source">shutterstock</span></span></figcaption></figure><p>Today, the federal government <a href="http://www.treasury.gov.au/publication/p2019-41708">responded</a> to the recommendations of the Australian Competition and Consumer Commission’s (ACCC) “world-leading” <a href="http://www.treasury.gov.au/consultation/digital-platforms-inquiry">Digital Platforms Inquiry</a>. </p>
<p>The response, however, is a less-than world-leading roadmap for reform.</p>
<p>Few dispute the ACCC’s inquiry was groundbreaking: it held tech giants including Google and Facebook to account for the power they wield over media, advertising and consumers.</p>
<p>But the government’s plan for reform lags behind other major global jurisdictions, where greater privacy protections have been enacted.</p>
<h2>The recommendations, and the response</h2>
<p>The ACCC spent 18 months on the Digital Platforms Inquiry, publishing its hefty <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">final report</a> in July. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/consumer-watchdog-calls-for-new-measures-to-combat-facebook-and-googles-digital-dominance-120077">Consumer watchdog calls for new measures to combat Facebook and Google's digital dominance</a>
</strong>
</em>
</p>
<hr>
<p>The report made 23 recommendations, including wide-ranging reforms to consumer protection and privacy laws.</p>
<p>In today’s announcement, the <a href="http://ministers.treasury.gov.au/ministers/josh-frydenberg-2018/media-releases/response-digital-platforms-inquiry">government supported</a> six of these recommendations in their entirety and ten “in principle” (with plans for further reviews). </p>
<p>It “noted” five others, and rejected two. </p>
<h2>A lagging Privacy Act</h2>
<p>In its final report, the ACCC highlighted that Australia’s privacy laws do not give consumers the same protections granted in other comparable countries and the European Union. </p>
<p>For instance, the European Union’s General Data Protection Regulation (<a href="https://theconversation.com/social-media-doesnt-need-new-regulations-to-make-the-internet-safer-gdpr-can-do-the-job-111438">GDPR</a>) gives European consumers more choices and more complete information about how their personal data is used. </p>
<p>The <a href="https://www.oag.ca.gov/privacy/ccpa">California Consumer Privacy Act</a> also puts Californian consumers ahead of us, with rights to access and delete data. There are even moves to <a href="https://www.nytimes.com/2019/12/10/opinion/congress-privacy-bill.html">introduce nationwide legislation</a> to protect consumers online in the United States. </p>
<p>The ACCC’s recommendations on privacy sought to align Australia’s privacy laws with the GDPR. This included imposing higher standards for consumer consent and privacy notices, and introducing rights to erase personal data in certain situations. </p>
<p>The government does support higher penalties for breaches of the Privacy Act, and will act on that recommendation sometime next year. However, there will be another 18 months of inquiry into whether any other improvements to the Privacy Act are actually required. </p>
<p>The government also supported, in principle, a new statutory tort (a civil wrong actionable in court) of serious invasion of privacy, which was also recommended by the <a href="https://www.alrc.gov.au/publication/serious-invasions-of-privacy-in-the-digital-era-alrc-report-123/">Australian Law Reform Commission</a> in 2014. </p>
<p>Similarly, the ACCC’s report proposed a private right of action (which allows an individual to sue directly) on privacy matters. This right is currently only available to the regulator. The government supports this recommendation in principle, but it is “subject to consultation and design of specific measures”.</p>
<p>The reality is, privacy best practice standards are evolving rapidly around the world, while Australia lags behind. Australian web and app-based businesses have to design their services to comply with overseas legislation.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/media-files-accc-seeks-to-clip-wings-of-tech-giants-like-facebook-and-google-but-international-effort-is-required-121359">Media Files: ACCC seeks to clip wings of tech giants like Facebook and Google but international effort is required</a>
</strong>
</em>
</p>
<hr>
<h2>What has the government committed to?</h2>
<p>To its credit, the government made a commitment of A$26.9 million over four years to fund a new unit in the ACCC. </p>
<p>This unit will monitor and report on the state of competition and consumer protection in digital platform markets. </p>
<p>Its first task will be to conduct further inquiries into the advertising technology (ad-tech) sector, in line with an ACCC recommendation. Ad-tech facilitates personalised targeted advertisements, such as those presented by Facebook and Google.</p>
<p>The key issue, from the platforms’ perspective, was the ACCC’s proposal for a voluntary code of practice addressing the power imbalances between digital platforms and news media businesses. A voluntary code means the industry has the opportunity to develop the new rules itself.</p>
<p>The government has directed the ACCC to work with stakeholders to develop and implement voluntary codes to address this power imbalance, and the ACCC must provide a progress report on code negotiations in May next year. </p>
<p>The code must be finalised no later than November. If agreement isn’t reached, the government has reserved the right to impose a mandatory code.</p>
<p>The government will also work with the Australian Communications and Media Authority (ACMA), in a staged process to reform media regulation. </p>
<p>The objective is to have a “platform-neutral” regulatory framework covering both online and offline delivery of media content (for example, having common rules for Netflix and free-to-air television). </p>
<p>In practice, this “staging” is likely to lead to the first legislation being introduced into parliament in late 2020.</p>
<h2>What was left out?</h2>
<p>One of the recommendations the government rejected was the proposed mandatory ACMA take-down code, which would assist copyright enforcement on digital platforms. The rejection was on the basis that major copyright owners and users identified “unintended effects” of such a code.</p>
<p>The government also rejected the proposal that philanthropic funding of journalism should be tax deductible, mainly because it is in the process of implementing a deductibility framework, introduced in 2017.</p>
<p>The ACCC also recommended the prohibition of unfair contract terms. It repeated this proposal in the context of small businesses and <a href="https://www.accc.gov.au/focus-areas/market-studies/customer-loyalty-schemes-review">consumer loyalty schemes</a>.</p>
<p>The government noted this, and promised there will be consultation on a range of policy options to strengthen unfair contract term protections. </p>
<p>There was also a “noted” response to the recommendation to prohibit certain unfair trading practices. There is currently work underway through Consumer Affairs Australia and New Zealand exploring such a prohibition. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australian-media-regulators-face-the-challenge-of-dealing-with-global-platforms-google-and-facebook-121430">Australian media regulators face the challenge of dealing with global platforms Google and Facebook</a>
</strong>
</em>
</p>
<hr>
<p>The government deferred adopting a change to search engine and internet browser defaults. Instead, the ACCC will monitor and report back on Google’s roll-out of options in Europe. </p>
<p>The ACCC also proposed changes to merger law. These were intended to address what in Europe is called a “<a href="https://www.jdsupra.com/legalnews/eu-report-on-competition-policy-for-68460/">killer acquisition</a>”. </p>
<p>The government anticipates further public consultation on this proposal.</p>
<p>Unfortunately, the government’s plan for further legislative reviews offers only the mere possibility of improvement to consumer privacy protections. And even if these eventuate, they are more than 18 months away.</p>
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation.</span></em></p><p class="fine-print"><em><span>Rob Nicholls receives grant funding from the International Association of Privacy Professionals.</span></em></p>The ACCC’s inquiry was launched to address concerns about the market power of major digital platforms, such as Google and Facebook, and their impact on Australia’s businesses and media.Katharine Kemp, Senior Lecturer, Faculty of Law, UNSW, and Co-Leader, 'Data as a Source of Market Power' Research Stream of The Allens Hub for Technology, Law and Innovation, UNSW SydneyRob Nicholls, Senior lecturer in Business Law. Director of the UNSW Business School Cybersecurity and Data Governance Research Network, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1169992019-05-16T20:05:05Z2019-05-16T20:05:05ZYour credit report is a key part of your privacy – here’s how to find and check it<figure><img src="https://images.theconversation.com/files/274769/original/file-20190515-69192-5qp08u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The Privacy Act gives you the right to find out what’s in your credit report and change any incorrect information in your report.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/business-woman-hands-working-financial-plan-646040347?src=X3er8kj77M7s2B5Em_fOXA-1-76">from www.shutterstock.com</a></span></figcaption></figure><p>The Australian government encourages citizens to protect their privacy and personal information. </p>
<p>Most of the <a href="https://www.oaic.gov.au/individuals/privacy-fact-sheets/general/privacy-fact-sheet-8-ten-tips-to-protect-your-privacy">tips provided</a> by the Office of the Information Commissioner are pretty intuitive – know your rights, read privacy policies, use security software and more. </p>
<p>But you might be surprised to know “check your credit report” is also on the list of recommended actions. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/seven-ways-the-government-can-make-australians-safer-without-compromising-online-privacy-111091">Seven ways the government can make Australians safer – without compromising online privacy</a>
</strong>
</em>
</p>
<hr>
<p>Checking your credit report, preferably annually, is a good way to ensure incorrect information is not listed against you. Having the right information in place can protect you against <a href="https://www.moneysmart.gov.au/scams/identity-fraud">identity theft</a>, so it is an important component of privacy in this sense. </p>
<p>The <a href="https://www.oaic.gov.au/privacy-law/privacy-act/">Privacy Act 1988</a> is an Australian law which regulates the handling of personal information about individuals. The Privacy Act has very strict rules, reflected in 13 <a href="https://www.oaic.gov.au/privacy-law/privacy-act/australian-privacy-principles">Australian Privacy Principles</a>, that control the way information about you is accessed, used and corrected. </p>
<p>The Privacy Act gives you the right to find out what’s in your credit report and change any incorrect information in your report.</p>
<p>As well as stopping others from stealing your identity, having an accurate credit report is also crucial if you want to borrow money. For example, when applying for credit such as a home loan, the lender will obtain your credit report to assess your credit worthiness and also your ability to repay the loan. You really don’t want your application for a home loan to be knocked back because of errors in your credit report, do you? </p>
<h2>How to check your credit report</h2>
<p>The first step is getting a copy of your credit report. This can be obtained free from credit reporting agencies such as <a href="https://www.equifax.com.au/">Equifax</a>, <a href="https://www.illion.com.au/">illion</a> and <a href="http://www.experian.com.au/">Experian</a>. Tasmanians can also refer to the <a href="https://www.tascol.com.au/">Tasmanian Collection Service</a>. </p>
<p>Make sure you spend a bit of time looking carefully for this free option – it is there, but can sometimes be a little buried. </p>
<p>The report will be sent to you in about ten days. If you are in a hurry and need it faster, you can pay between A$30 and A$50 and the credit report will arrive in a day or two. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/another-day-another-data-breach-what-to-do-when-it-happens-to-you-99150">Another day, another data breach – what to do when it happens to you</a>
</strong>
</em>
</p>
<hr>
<h2>Look at the details</h2>
<p>Once you have your credit report, there are <a href="https://www.moneysmart.gov.au/media/400943/your-credit-report.pdf">certain things that you must check</a>. </p>
<p>First, as a minimum, check that your personal details such as name, date of birth, employment and driver’s licence or other identifying documents are correct. </p>
<p>Second, have a look at your credit history in the report. This will include details of all credit or loans you have applied for, any payments overdue by more than 60 days for which default action has been initiated, and any other credit infringements. Such infringements can stay on your credit report for five to seven years, depending on their severity. </p>
<p>Third, examine your repayment history to determine whether you missed any payments on due dates. </p>
<p>Last, check whether any recorded serious adverse credit activities such as bankruptcies, court judgements and debt agreements are correct and accurately reflect your circumstances.</p>
<h2>What happens if it’s wrong?</h2>
<p>You are entitled to request changes to any incorrect listing and this should be done free for you. </p>
<p>In the first instance, you can contact the credit reporting agency directly and they will be able to fix small errors immediately. For other errors originating from a credit provider such as a bank, they will sometimes even contact the bank on your behalf. </p>
<p>However, if you have to contact the credit provider yourself, do so and explain why the listing is incorrect. Most often, they will fix the mistake. If they refuse, you can then go to an independent dispute resolution scheme, such as the <a href="https://www.afca.org.au/">Australian Financial Complaints Authority</a>. </p>
<p>If all else fails, you can also contact the <a href="https://www.oaic.gov.au/">Office of the Australian Information Commissioner</a>, which will deal with your complaint provided it is less than a year old.</p>
<p>So, what are you waiting for? It really is in your best interest to check your credit report, and no one else can do it for you.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Checking your credit report is a good way to ensure that incorrect information is not listed against you, and can protect you against identity theft.Harjinder Singh, Senior lecturer, Curtin UniversityNigar Sultana, Senior Lecturer, Faculty of Business and Law, Curtin UniversityYeut Hong Tham, Lecturer, Curtin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/976952018-06-18T05:17:01Z2018-06-18T05:17:01ZThe privacy problem with camera traps: you don’t know who else could be watching<figure><img src="https://images.theconversation.com/files/221688/original/file-20180605-175438-skeqeo.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A spotted-tailed Quoll detected during a small mammal survey at Carrai Plateau, New South Wales.</span> <span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span></figcaption></figure><p>We use remotely activated cameras – known as camera traps – to study the ecology and population responses of wildlife and pest species in management programs across Australia.</p>
<p>These devices are used widely by scientists, researchers and managers to detect rare wildlife, monitor populations, study behaviour and measure long term wildlife population health. </p>
<p>But the lack of transparency surrounding how these images are transmitted, where they are stored, and who has access to them in transit, has scientists worried.</p>
<p>We’ve discovered that images captured by these devices may be accessed by more people than intended, posing potential privacy breaches and even poaching risks.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/publish-and-dont-perish-how-to-keep-rare-species-data-away-from-poachers-80239">Publish and don’t perish – how to keep rare species' data away from poachers</a>
</strong>
</em>
</p>
<hr>
<h2>A chance discovery</h2>
<p>It was an accidental discovery that our images can travel from the field to big overseas internet servers. We had not considered the transmission path of our images, and who may have access to them along the way.</p>
<p>Manufacturers have developed camera traps that are capable of transmitting image data using the telecommunications network (in Australia this is 3G and soon to move to 4G). </p>
<p>Most of these camera trap models can transmit images both via MMS (Multimedia Messaging Service), where the image is sent as a multimedia message to a smartphone, and via SMTP (Simple Mail Transfer Protocol), where the image is transmitted to an email address. </p>
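In SMTP mode the camera trap acts, in effect, as a tiny mail client. A minimal sketch of that path using Python’s standard library — every hostname, address and credential here is a hypothetical placeholder, not a real camera-trap configuration:

```python
import smtplib
from email.message import EmailMessage

# Sketch of the SMTP path a 3G camera trap follows: the image is attached
# to an email and handed to a relay server, which may forward it through
# further servers (possibly overseas) before it reaches the recipient.
# All hostnames, addresses and credentials are hypothetical placeholders.
msg = EmailMessage()
msg["From"] = "trap01@example-telco.net"
msg["To"] = "researcher@example.org"
msg["Subject"] = "Camera trap image"
image_bytes = b"\xff\xd8..."  # placeholder for the JPEG captured by the trap
msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                   filename="capture.jpg")

# The sender only controls (and can only encrypt) the first hop; every
# onward relay is chosen by the servers, not by the user.
with smtplib.SMTP("smtp.example-telco.net", 587) as server:
    server.starttls()
    server.login("trap01", "secret")
    server.send_message(msg)
```

The point of the sketch is the last block: after `send_message`, the image’s route is in the hands of whichever relay servers the provider uses.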
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=800&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=800&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=800&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1005&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1005&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1005&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A 3G camera trap set in the Strzelecki Desert and sending images to the authors email and phone.</span>
<span class="attribution"><span class="source">PM</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>In Australia, when you buy a 3G compatible camera trap you just need to add a SIM card from a service provider. The images will then be sent from the camera trap at a field site to your work or home in seconds. This process is made simple for users by manufacturers who set up default settings to assist you in programming the camera trap. </p>
<p>If, like most people, you don’t override the default settings, then your data will be managed for you. An attractive offer, especially for those who are not tech-savvy or who don’t have time to fiddle around with programming equipment. </p>
<p>But where are your images going? Who has the legal right to access and store them? How secure is each stage of the transmission path, and are your images being used without your knowledge?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/explainer-what-is-4g-9448">Explainer: what is 4G?</a>
</strong>
</em>
</p>
<hr>
<h2>An evaluation process</h2>
<p>Our research team has been evaluating the transmission of images via SMTP for a larger research project, aimed at developing camera trap transmission via satellite.</p>
<p>We have been testing and comparing several models of 3G camera trap, which includes evaluating the message structure and headers.</p>
<p>It was these investigations that revealed some alarming information, which poses several potential risks to camera trap users when a camera trap is set up using the default settings for SMTP transmission. </p>
<p>Each manufacturer uses different methods, but in essence, when an image is transferred through a 3G telecommunications service, it is sent to one or more web servers, where it may be stored, then forwarded to the recipient email address or phone. </p>
<p>These servers can be in any country. Our investigations of the five models we tested found that images are being sent via some large, well-known Asian and North American companies. The exact location of each server, and the full transmission pathway, cannot be fully known. </p>
<p>Exactly what happens to these images during transmission also remains unknown. Most practitioners we have spoken to have no idea their images could be going to servers overseas, which raises several concerns for users.</p>
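One way to see where an SMTP-delivered image has been is to read the “Received:” headers of the arriving email: each mail server in the chain prepends one. A small sketch with Python’s standard library, using a made-up message (the hostnames below are hypothetical examples, not the servers we observed):

```python
import email

# Each relay prepends a "Received:" header, so reading them newest-first
# reconstructs the path the message took. This raw message is a made-up
# example; a real camera-trap email would carry real server names here.
raw = b"""Received: from relay.example-asia.net (203.0.113.7) by mx.example.org
Received: from mms-gw.example-telco.net (198.51.100.2) by relay.example-asia.net
From: trap01@example-telco.net
To: researcher@example.org
Subject: Camera trap image

(image attachment would follow)
"""

msg = email.message_from_bytes(raw)
for hop in msg.get_all("Received", []):
    # the part before " by " names the server that handed the message on
    print(hop.split(" by ")[0])
```

Checking these headers on a few transmitted images is how a user could verify, for their own camera model, which countries the images pass through.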
<h2>A privacy concern</h2>
<p>One of our foremost concerns is how legal professionals would interpret ownership and distribution of images of people under privacy legislation. Camera traps deployed to detect wildlife often detect unsuspecting people walking past.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A harmless image of an unsuspecting person walking past a camera trap could end up in a court of law if the image is used without their permission.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>It’s a <a href="https://www.publish.csiro.au/book/7150">legal mine field</a> when a camera trap user potentially distributes an image of a person without their permission.</p>
<p>It was an issue raised back in 2012 when an unnamed Austrian politician was <a href="http://www.spiegel.de/international/europe/forest-sex-footage-sparks-debate-in-austria-a-838691.html">caught in a sexual encounter by a camera trap</a>. In that case the image wasn’t released publicly but it raised concerns over a potential breach of privacy.</p>
<p>In Australia, such an image belongs to the person who is photographed, irrespective of where the image was taken, so strictly speaking they could pursue legal action against anyone distributing it. </p>
<p>Clearly there would be extenuating circumstances, but whether or not there is a case to be answered is yet to be tested and would depend on the country and legislation involved.</p>
<p>Camera traps are also used for security purposes by authorities, farmers and members of the public, so legally and otherwise sensitive data could be distributed over the internet. As there is a lack of transparency surrounding the transmission pathway, storage and usage of the data, this could be a huge concern. </p>
<p>In Australia, this might constitute a breach under the <a href="https://www.oaic.gov.au/privacy-law/privacy-act/">Privacy Act 1988</a>, depending on whether any personal data is disclosed and the potential for serious harm that might result.</p>
<h2>All in the cloud</h2>
<p>The Australian government has <a href="https://www.oaic.gov.au/privacy-law/privacy-archive/privacy-speeches-archive/privacy-and-the-cloud">released policy and guidelines</a> concerning the protection of data privacy when using cloud services. </p>
<p>But these requirements might not extend, or have not been adopted, in the context of technological based ecology monitoring and so valuable data could currently be leaving Australian shores.</p>
<p>How this data is used is also largely unknown. It may serve many commercial purposes for companies, such as data mining, advertising, and machine learning and artificial intelligence development, to name but a few. Exactly what country, where and how securely the data is stored remains a mystery.</p>
<p>Of real concern for many international wildlife conservation groups is the potential misuse of wildlife images that could identify threatened species and locations. This information could be illegally accessed by poachers, or those looking to sell the data for profit. </p>
<p>Our disclaimer here is that we have no evidence to confirm or deny that such practices are occurring, but the potential exists and the lack of transparency is alarming.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/scientists-are-accidentally-helping-poachers-drive-rare-species-to-extinction-78342">Scientists are accidentally helping poachers drive rare species to extinction</a>
</strong>
</em>
</p>
<hr>
<h2>Reducing the risk</h2>
<p>Until recently we did not fully comprehend the risks we were taking by using 3G camera traps without taking some precautions. Like most, we accepted that our data was safe and controlled by Australian telecommunications systems, and had no concept that the images may be transmitted or stored by servers overseas.</p>
<p>We now know the risks, and that in many cases this image management protocol can be circumvented by overriding the camera’s default settings. Ideally, every user would know the full transmission pathway of each image and could take steps to make it as secure as practically possible. Since that is rarely feasible, we recommend that, where the camera allows it, users program camera traps to send images via SMTP directly to an email address they control. </p>
<p>It will take a little extra time to program the camera traps, but at least users will have more control over the path of their image from the field to any receiving device.</p>
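<p>As a rough illustration of what "direct to an email address you control" means in practice, the sketch below packages a captured image and sends it over an encrypted SMTP connection. The server address, account name and camera settings are invented placeholders; real camera traps expose equivalent fields in their firmware menus rather than running Python.</p>

```python
import smtplib
from email.message import EmailMessage

# Hypothetical settings -- substitute your own mail provider and account.
SMTP_HOST = "smtp.example.org"
SMTP_PORT = 587
USER = "camera-trap@example.org"

def build_report(image_bytes: bytes, camera_id: str, recipient: str) -> EmailMessage:
    """Package a captured image as an email addressed to an account we control."""
    msg = EmailMessage()
    msg["From"] = USER
    msg["To"] = recipient
    msg["Subject"] = f"Camera trap {camera_id}: new capture"
    msg.set_content(f"Image captured by {camera_id}.")
    msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                       filename=f"{camera_id}.jpg")
    return msg

def send_report(msg: EmailMessage, password: str) -> None:
    """Deliver over a STARTTLS-encrypted session to a server the user chose."""
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
        server.starttls()           # encrypt before authenticating
        server.login(USER, password)
        server.send_message(msg)
```

<p>The point of the design is that the image travels straight from the camera's network connection to a mailbox the researcher administers, rather than through a manufacturer's cloud relay whose location and retention policy are unknown.</p>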
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The right thing captured in the camera trap: a spotted-tailed Quoll.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure><img src="https://counter.theconversation.com/content/97695/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Paul D Meek receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia</span></em></p><p class="fine-print"><em><span>Greg Falzon receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia</span></em></p><p class="fine-print"><em><span>James Bishop receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia.
James Bishop receives PhD research funding from an Australian Postgraduate Award (APA).
</span></em></p>Remote cameras used to track wildlife in Australia could pose a privacy risk, especially if the images they capture fall into the wrong hands.Paul D Meek, Adjunct Lecturer in School of Environmental and Rural Science, University of New EnglandGreg Falzon, Lecturer in Computational Science, University of New EnglandJames Bishop, PhD candidate, software engineer, University of New EnglandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/955212018-05-01T03:50:02Z2018-05-01T03:50:02ZSoft terms like ‘open’ and ‘sharing’ don’t tell the true story of your data<figure><img src="https://images.theconversation.com/files/216798/original/file-20180430-135851-u9os0q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Advances in machine learning may allow data that is de-identified now to be re-dentified in the future. </span> <span class="attribution"><span class="source">from www.shutterstock.com </span></span></figcaption></figure><p>The Turnbull government today announced the creation of a new <a href="https://www.mhs.gov.au/media-releases/2018-05-01-government-response-productivity-commission-inquiry-data-availability-and-use">National Data Commissioner</a> to oversee the implementation of greater data access and “sharing” in Australia.</p>
<p>This follows the government’s announcement late last year of a “<a href="https://ministers.pmc.gov.au/taylor/2017/australians-own-their-own-banking-energy-phone-and-internet-data">consumer data right</a>” relating to banking, energy, phone and internet transactions. This has been promoted as a means for Australians:</p>
<blockquote>
<p>(…) to compare offers, get access to cheaper products and plans to help them “make the switch” and get greater value for money.</p>
</blockquote>
<p>But we argue that the choice of words like “openness” and “sharing” hides the true nature of a rushed and risky proposal for our data. </p>
<p>It’s time the government used more accurate language and less spin, so we can have a realistic debate about its plans <em>before</em> our personal information is irrevocably exposed.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australia-should-strengthen-its-privacy-laws-and-remove-exemptions-for-politicians-93717">Australia should strengthen its privacy laws and remove exemptions for politicians</a>
</strong>
</em>
</p>
<hr>
<h2>‘Open banking’ within 12 months</h2>
<p>For some years, the Australian government has pushed for <a href="http://sjm.ministers.treasury.gov.au/media-release/032-2016/">increased data disclosure and linking</a> in pursuit of efficiency and international competitiveness. It argues that access to more data will allow businesses to plan and adapt their offerings more efficiently, and that “big data” analytics will lead to increased innovation. </p>
<p>In 2017, the <a href="http://www.pc.gov.au/inquiries/completed/data-access/report">Productivity Commission</a> backed this proposal – referring to the need for increased “openness” and “access”. It recommended increased disclosure and use of data, including our personal and sensitive information.</p>
<p>The Commission does concede we, the public, might be wary of exposing our information. As a result, it has suggested that to gain necessary acceptance or “social licence”, the government should create a new “consumer data right” allowing us to transfer our data to providers to get better offers.</p>
<p>The government is currently considering the <a href="https://treasury.gov.au/consultation/c2018-t247313/">Final Report of the Review into Open Banking</a>, released in February. This recommends opening up data within 12 months for financial services, followed by other sectors.</p>
<p>In our opinion, this haste seems to be driven by FOMO (fear of missing out) – a sense that the world is talking big data and Australia shouldn’t be left behind. </p>
<h2>Inadequate privacy protection</h2>
<p>What should be more troubling is that Australia already lags behind on the basic privacy protections that could make the planned data disclosure safe (or at least less risky). </p>
<p>Unlike most comparable countries advocating open data (including the US, UK and NZ), Australians have no right to take anyone to court for a serious invasion of our privacy.</p>
<p>This is the case even though the <a href="https://www.alrc.gov.au/publications/serious-invasions-privacy-digital-era-alrc-report-123">Australian Law Reform Commission</a> recommended this back in 2014 (after a near-identical recommendation <a href="https://www.alrc.gov.au/sites/default/files/pdfs/108_vol1.pdf">in 2008</a>) and <a href="http://eresources.hcourt.gov.au/showCase/2001/HCA/63">the High Court</a> called for action in 2001.</p>
<p>What’s more, obligations under the Australian <a href="https://www.oaic.gov.au/privacy-law/privacy-act/">Privacy Act</a> don’t apply to the overwhelming majority of businesses – and <a href="https://theconversation.com/how-the-law-allows-governments-to-publish-your-private-information-74304">experts criticise</a> the weak enforcement of its already weak remedies. </p>
<p>In large part, the Privacy Act makes you responsible for protecting your own privacy. Under Australian law, if you continue to use a website after it has provided a link to its privacy policy, your consent is taken to be implied by that continued use. In this context, consent does not even require ticking a box. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-the-law-allows-governments-to-publish-your-private-information-74304">How the law allows governments to publish your private information</a>
</strong>
</em>
</p>
<hr>
<h2>Where’s the harm?</h2>
<p>While few of us have celebrity-level secrets that might make us obsess over protection from paparazzi, the reality is in future we could suffer from weak privacy protections far more than any celebrity or politician.</p>
<p>If open banking goes ahead under current law, here’s what’s likely. When you agree to transfer your banking information from your existing bank to another provider via an Application Programming Interface (API), that provider will require you to tick a box saying you agree to its terms and conditions.</p>
<p>Those terms will include a privacy policy saying you consent to the new provider storing your data, giving it to others, and using it for other things, including vague “marketing purposes”. Words in such policies typically state, for example: </p>
<blockquote>
<p>(…) we may collect your personal information for research, marketing, for efficiency purposes (…)</p>
</blockquote>
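<p>To make the consent step concrete, here is a minimal sketch of what agreeing to such terms can authorise. Every endpoint, scope and field name here is invented for illustration; no real open-banking API is being described. The point is that the single tick-box bundles the transfer the customer wanted with the broader uses buried in the policy.</p>

```python
def build_consent_request(customer_id: str, agreed_terms: bool) -> dict:
    """Sketch of the authorisation created when the terms box is ticked.

    The scopes are hypothetical: the first two cover the transfer the
    customer actually asked for; the rest ride along in the fine print.
    """
    if not agreed_terms:
        raise ValueError("no transfer without consent to the full terms")
    return {
        "customer": customer_id,
        "scopes": [
            "accounts:read",        # what the customer wanted to move
            "transactions:read",
            "marketing:use",        # ...and the uses buried in the policy
            "third-party:share",
        ],
    }
```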
<p>The new provider, and subsequent recipients, may combine that data with other personal information about you – collected from data aggregating giants like Acxiom, Facebook and Google – and use it to create a 360-degree, “<a href="https://www.fastcompany.com/40447841/you-are-being-exploited-by-the-opaque-algorithm-driven-economy">God-like view</a>” of you as an individual.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/academics-call-on-facebook-to-make-data-more-widely-available-for-research-95365">Academics call on Facebook to make data more widely available for research</a>
</strong>
</em>
</p>
<hr>
<p>This can be used to create scores, psychographic <a href="https://docs.house.gov/meetings/IF/IF17/20171129/106659/HHRG-115-IF17-Wstate-PasqualeF-20171129.pdf">profiles and predictions</a> based on your spending, friends, health, race, sexual orientation, political affiliation, and lifestyle choices.</p>
<p>Such aggregated data could potentially be used to exploit, manipulate or discriminate against you based on your needs and weaknesses. </p>
<p>The <a href="https://treasury.gov.au/consultation/c2018-t247313/">Final Report of the Review into Open Banking</a> accepted these plans would increase data security risks from hacking, improper disclosure and access. It recommended some improvements to consumer consent processes.</p>
<p>But it didn’t recommend <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3150138">the essential changes</a> to substantive privacy law: a right to sue, increased penalties for breaches, or a right to have our data deleted once it’s been used for its original purpose.</p>
<p>The <a href="http://www.pc.gov.au/inquiries/completed/data-access/report">Productivity Commission</a> proposed anonymisation or de-identification of your data to reduce risks. But advances in big data and machine learning for <a href="https://dl.acm.org/citation.cfm?doid=2835776.2835798">re-identification</a> overtake attempts to de-identify, so data previously thought safe to release later becomes unsafe.</p>
<p>Attending a recent blockchain conference in Sydney, we heard a computer scientist say that, given a choice, he wouldn’t agree to the release of his anonymised medical record because he’s sure it will be re-identified – as his record – within the decade. </p>
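<p>The re-identification concern is not hypothetical in mechanism: a classic linkage attack needs nothing more than matching "anonymised" records against a public dataset on quasi-identifiers such as postcode, birth year and sex. The records below are invented, but the join is exactly how such attacks work.</p>

```python
# "De-identified" health data keeps quasi-identifiers; a public roll keeps names.
deidentified_health = [
    {"postcode": "2031", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "2600", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]

public_roll = [
    {"name": "A. Citizen", "postcode": "2031", "birth_year": 1984, "sex": "F"},
]

def reidentify(health_rows, public_rows):
    """Join the two datasets on the quasi-identifier triple.

    Returns (name, diagnosis) pairs for every health record whose
    quasi-identifiers match exactly one person in the public data.
    """
    key = lambda r: (r["postcode"], r["birth_year"], r["sex"])
    names_by_key = {key(p): p["name"] for p in public_rows}
    return [(names_by_key[key(h)], h["diagnosis"])
            for h in health_rows if key(h) in names_by_key]
```

<p>With richer auxiliary data and machine learning, the matching gets easier, not harder, which is why de-identification judged safe today can fail later.</p>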
<h2>Not ‘openness’, not ‘sharing’</h2>
<p>It’s misleading to talk of these data practices as “openness” and “sharing”. These are just feel-good marketing terms to evoke positive emotions and hide reality. </p>
<p>The government’s proposal does not make data more open. It encourages us to consent to vast exposure of our personal information, including to those who may use it against us, for example, through vulnerability-based marketing. </p>
<p>The <a href="http://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=22271&LangID=E">UN’s Special Rapporteur on Privacy</a> has noted that open data originally referred to governments making information about <em>government</em> and “the world we live in” more accessible to citizens; but it’s now used to refer to governments and corporations releasing personal information about <em>citizens</em>.</p>
<p>It’s also misleading to call this sharing. “Sharing” suggests a safe relationship with someone you know and trust; a friendly interaction which ends with you taking back your book or your bike or your holiday photos.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-stop-haemorrhaging-data-on-facebook-94511">How to stop haemorrhaging data on Facebook</a>
</strong>
</em>
</p>
<hr>
<p>It does not reflect an irrevocable transfer of your personal information to an unknown corporation – which can keep it indefinitely, use it as they see fit, and give it to other countries and entities regardless of your interests. </p>
<p>Instead of talking about some undefined social licence for opening up data and sharing our personal information, the Australian government should start a more transparent discussion. It should use neutral words with practical meaning and known legal implications, like collection, use, storage, transfer and disclosure. The government should also highlight the risks of weak data protection. </p>
<p>This would be a real conversation about one stakeholder seeking to gain the trust of another, and what it would take for the trust-seeker to be viewed as trust-worthy.</p><img src="https://counter.theconversation.com/content/95521/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, the Centre for Law, Markets & Regulation and the Australian Privacy Foundation. </span></em></p><p class="fine-print"><em><span>David Vaile has previously conducted and supported research in areas related to privacy and/or open data funded in part or in whole by the Australian Research Council, ACCAN, auDA and by federal and state government bodies. He is a committee or board member of not-for-profit, industry and professional organisations including the Australian Privacy Foundation, Internet Australia, NSW Law Society, AUSTRAC, and the Association of Marketing and Social Research Organisations. The views expressed here are his alone.</span></em></p>Words matter – not just for building trust and understanding, but for weighing up legal issues. So maybe “open” and “shared” aren’t the right words to use when we refer to our data.Katharine Kemp, Lecturer, Faculty of Law, UNSW, and Co-Leader, 'Data as a Source of Market Power' Research Stream of The Allens Hub for Technology, Law and Innovation, UNSW SydneyDavid Vaile, Teacher of cyberspace law, and leader of the Data Protection and Surveillance stream of the Allens Hub for Technology Law and Innovation, UNSW Faculty of Law, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/946062018-04-10T10:36:50Z2018-04-10T10:36:50ZFragmented US privacy rules leave large data loopholes for Facebook and others<figure><img src="https://images.theconversation.com/files/213923/original/file-20180409-114112-1adcmfk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What are the rules governing who's watching you online?</span> <span class="attribution"><a class="source" 
href="https://www.shutterstock.com/image-vector/big-brother-watching-giant-hand-magnifying-345947360">Aleutie</a></span></figcaption></figure><p>Facebook CEO Mark Zuckerberg’s <a href="https://www.judiciary.senate.gov/press/rep/releases/senate-judiciary-and-commerce-committees-announce-joint-hearing-with-facebook-ceo">Congressional testimony</a> <a href="https://www.axios.com/read-mark-zuckerberg-testimony-for-congress-1523288674-4ec25015-b37c-4c9e-b367-fd55f9e227f4.html">will discuss</a> ways to keep people’s <a href="https://theconversation.com/is-there-such-a-thing-as-online-privacy-7-essential-reads-88849">online data private</a>, which I’m interested in as a privacy scholar. Facebook and other U.S. companies already follow more comprehensive privacy laws in other countries. But without comparable requirements at home, there’s little reason for them to protect U.S. consumers the same way.</p>
<h2>Inform customers and secure data</h2>
<p>U.S. privacy laws are mostly based on the <a href="https://www.ftc.gov/">Federal Trade Commission’s</a> <a href="https://en.wikipedia.org/wiki/FTC_fair_information_practice">Fair Information Practice Principles</a>, which recommend companies:</p>
<ul>
<li>tell customers their data practices,</li>
<li>give people some choice about additional uses,</li>
<li>provide people with access to information about them, and </li>
<li>ensure the security of the data collected. </li>
</ul>
<p>In some industries, there are regulations for handling what’s called “<a href="https://www.gsa.gov/reference/gsa-privacy-program/rules-and-policies-protecting-pii-privacy-act">personally identifiable information</a>.” Federal laws protect <a href="https://www.hhs.gov/hipaa/index.html">medical information</a>, <a href="https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/fair-credit-reporting-act">financial</a> <a href="https://www.ftc.gov/tips-advice/business-center/privacy-and-security/gramm-leach-bliley-act">data</a> and <a href="https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html">education-related</a> records.</p>
<p>Online services and apps are barely regulated, though they must <a href="https://www.ftc.gov/tips-advice/business-center/privacy-and-security/children%27s-privacy">protect children</a>, limit <a href="https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/can-spam-rule">unsolicited email marketing</a> and <a href="https://oag.ca.gov/privacy">tell the public</a> what they do with data they collect.</p>
<p>Online tracking and advertising are self-regulated: <a href="https://digitaladvertisingalliance.org/">industry associations</a> set rules for their members. Data collection by emerging technologies, such as smart speakers or self-driving cars, is mostly unregulated. The FTC does investigate if companies are “<a href="https://www.ftc.gov/about-ftc/what-we-do/enforcement-authority">unfair or deceptive</a>,” but firms that prominently disclose what they do may avoid trouble.</p>
<h2>Strong limits on data collection</h2>
<p>Europe, by contrast, generally prohibits collecting and using personal data. Its <a href="http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.119.01.0001.01.ENG&toc=OJ:L:2016:119:TOC">General Data Protection Regulation</a>, which takes effect on May 25, applies to all businesses and government agencies in European Union member countries – including U.S. companies offering services in Europe.</p>
<p>The GDPR gives <a href="https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/lawful-basis-for-processing/">six reasons</a> for collecting personal data. But even then, any analysis must be closely related to the purpose for which the data was collected. For example, a fitness-tracking company couldn’t <a href="http://theconversation.com/could-your-fitbit-data-be-used-to-deny-you-health-insurance-72565">sell users’ exercise data to a health insurance company</a> without additional consent. Companies that violate the GDPR may be fined <a href="https://iapp.org/news/a/top-10-operational-impacts-of-the-gdpr-part-4-cross-border-data-transfers/">up to 20 million euros</a>, or 4 percent of the firm’s worldwide annual revenue.</p>
<p>Building on the GDPR, Europe’s forthcoming <a href="https://ec.europa.eu/digital-single-market/en/proposal-eprivacy-regulation">ePrivacy Regulation</a> will likely <a href="https://www.marketingweek.com/2018/02/08/eprivacy-cookies-data-laws/">require explicit individual consent</a> before a company can track a person’s online activity.</p>
<p>Many other countries, including <a href="https://iapp.org/news/a/gdpr-matchup-mexicos-federal-data-protection-law-held-by-private-parties-and-its-regulations/">Mexico</a>, Switzerland and Russia, have adopted <a href="https://iapp.org/resources/article/the-general-data-protection-regulation-matchup-series/">comprehensive privacy regulations</a> like the EU’s. Canada also broadly regulates how <a href="https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-privacy-act/">government agencies</a> and <a href="https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/">private companies</a> use data.</p>
<p>The advantage of comprehensive privacy protections is that they’re consistent across services and industries, even as new technologies emerge.</p><img src="https://counter.theconversation.com/content/94606/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Florian Schaub conducts research on personal privacy. He has received funding from the National Science Foundation as part of the Usable Privacy Policy Project.</span></em></p>US privacy laws focus on informing consumers what’s happening with their data; other countries specifically restrict data collection and analysis.Florian Schaub, Assistant Professor of Information; Assistant Professor of Electrical Engineering and Computer Science, University of MichiganLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/312732014-09-04T05:06:09Z2014-09-04T05:06:09ZCivil action is the big stick needed to protect our privacy<figure><img src="https://images.theconversation.com/files/58221/original/bswfhq73-1409806944.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Who will keep our selfies safe?</span> <span class="attribution"><a class="source" href="http://www.flickr.com/photos/david_baxendale/14803587918">www.david baxendale.com/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>Never mind the <a href="https://theconversation.com/who-is-to-blame-when-icloud-is-hacked-you-or-apple-31215">celebrities</a>; let’s say you and I had naked photos of ourselves (selfie-steams) floating in Apple’s iCloud. If somehow those photos were exposed, we would have little recourse under Australia’s current legal regime.</p>
<p>If we were lucky the Privacy Commissioner may show some interest. It is unlikely the police would. Help, justice and satisfaction would be remote possibilities.</p>
<p>If the question “Why do so many young people have naked photos on their phone?” is running through your mind, keep it there. Posing it out loud denotes age, and a serious lack of connection with the iGeneration in the eyes of many (just ask comedian <a href="http://www.news.com.au/entertainment/celebrity-life/ricky-gervais-slammed-over-joke-about-nude-photo-hacking-scandal/story-fn907478-1227044608869">Ricky Gervais</a> who was criticised for saying the celebrities were to blame).</p>
<p>And, although closer understanding and educating regarding this phenomenon is needed, it is probably less helpful in solving the immediate problem.</p>
<p>Here is a snapshot of the current reality: many people take naked images of themselves using technology to share in private relationships. But there is a shortfall between their perceived control over their “own” private images and the nebulous state of rights, responsibilities and regulation of the technology they use to facilitate such behaviours.</p>
<p>The rapid uptake of mobile electronic devices as well as popular web applications (such as social networking sites) and cloud storage and services (where our data no longer resides on the device itself, but is pumped into and out of storage locations elsewhere) has fundamentally changed the privacy of Australians.</p>
<p>We generate inordinate amounts of data – both metadata and content – just by using the devices and services. We expose that data to a range of companies and individuals all of whom must maintain our trust that they will use that information wisely, within the law, and also protect it from exposure to unauthorised people. But they haven’t.</p>
<h2>The need for privacy reform</h2>
<p>The Australian Law Reform Commission final report on <a href="http://www.alrc.gov.au/publications/serious-invasions-privacy-digital-era-alrc-123-summary">Serious Invasions of Privacy in the Digital Era</a> was tabled in Parliament yesterday.</p>
<p>It was commissioned in June 2013 by the previous Attorney-General, the ALP’s <a href="http://www.aph.gov.au/Senators_and_Members/Parliamentarian?MPID=HWG">Mark Dreyfus</a>, to examine the adequacy of the <a href="http://www.comlaw.gov.au/Series/C2004A03712">Commonwealth Privacy Act</a>, particularly in light of digital technologies.</p>
<p>The report is a valuable contribution to the discussion that we need to be having in this digital age. Importantly, it examines the introduction of a <a href="http://www.sbs.com.au/news/article/2014/09/03/new-laws-urged-protect-privacy-aust">private right to civil action</a> in cases of serious breaches of privacy.</p>
<p>A right to civil action is important because it helps empower the individual (let’s put aside social and economic barriers to the likelihood of such action for the moment). It is also important because such a right will go some way to changing behaviour online. And change is needed.</p>
<h2>Bad behaviour encouraged by a lack of consequences</h2>
<p>I have long argued that much of the root cause of criminals abusing technologies (by stealing our personal and financial information, taking control of our accounts and so on), and of online businesses being somewhat cavalier in their protection and (mis)use of our data, has been a lack of consequences. That is, there is a perception that they will get away with it.</p>
<p>Bad actions that have no consequences quickly spread to others. Parents know that. Governments need to as well. Innovation by companies to protect our privacy has been slow while the plundering of our information by criminals and vexatious people has been anything but.</p>
<p>The prospect of individual civil actions to remedy wrongs may well prove effective where our public mechanisms have not. Or at least they may augment those public institutions and help bring about a more civil society online.</p>
<p>Companies will understandably not relish the prospect of civil action, but something needs to be done to bring the treatment of our private information online in line with our expectations of privacy. Those that recognise and respect their customers’ data and expectations of privacy will benefit.</p>
<p>As for the criminals and others who wantonly and recklessly abuse us online, several large US corporates continue to have good success by using civil means to <a href="http://www.microsoft.com/en-us/news/press/2013/jun13/06-05dcupr.aspx">pursue criminal groups</a> operating at scale online. Giving us the option to use that tool too will only hurt them further.</p>
<p>Many will argue that it is up to the individual to protect their own privacy, particularly by changing some behaviours and stopping others, such as taking nude selfies.</p>
<p>But it isn’t that simple. Even the dullest, most simple action online generates information that could be – and likely is – abused and misused by someone today.</p>
<p>Imagine what’s happening to our more interesting online actions. Do <a href="http://www.gotinder.com/">Tinder</a> users really think their data won’t be mined (or exposed) while they trawl the internet for instant relationships?</p>
<p>Are <a href="http://www.cio.com/article/2368602/social-media49680-15-Social-Media-Apps-You-May-Not/social-media/149680-15-Social-Media-Apps-You-May-Not-Know-About.html">anonymous messaging apps</a> really anonymous? Does your personal <a href="http://www.fitbit.com/">Fitbit</a> health and fitness data just stay with you? What would your health insurance or life insurance company think of your web searches?</p>
<p>What’s needed is a change on many fronts: end users, businesses, governments, criminals and other data abusers. The Law Reform Commission’s report won’t fix it all – even if the legislation it recommends comes to life – but it will go some way to empowering us end users to assert our rights. And that should be celebrated.</p>
<p>Until then, changing passwords and only taking photos you’re comfortable seeing on the front page may be the best options.</p>
<p><br>
<em><strong>Related reading: <a href="https://theconversation.com/its-time-for-privacy-invasion-to-be-a-legal-wrong-31288">It’s time for privacy invasion to be a legal wrong</a></strong></em></p><img src="https://counter.theconversation.com/content/31273/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alastair MacGibbon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Never mind the celebrities; let’s say you and I had naked photos of ourselves (selfie-steams) floating in Apple’s iCloud. If somehow those photos were exposed, we would have little recourse under Australia’s…Alastair MacGibbon, Director, Centre for Internet Safety, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/278562014-06-11T20:33:41Z2014-06-11T20:33:41ZSnowden report calls out Australia’s inadequate privacy law<p>The revelations of NSA whistleblower Edward Snowden have altered the way we think about accountability, transparency and the rule of law with regard to both the activities of security agencies and the value of privacy, according to a detailed <a href="http://www.privacysurgeon.org/blog/wp-content/uploads/2014/06/Snowden-final-report-for-publication.pdf">report</a> released this week.</p>
<p>But this change in thinking has not led to practical reform, according to the report:</p>
<blockquote>
<p>While there has been a notable volume of “activity” in the form of diplomatic representations, parliamentary inquiries, media coverage, campaign strategies, draft legislation and industry initiatives, there has – at the global level – been an insignificant number of tangible reforms adopted to address the concerns raised by the Snowden disclosures. Two thirds of legal professionals and technology experts from 29 countries surveyed for this study reported that they could recall no tangible measure taken by government.</p>
</blockquote>
<p>The Snowden revelations received wide coverage in the Australian media, which was not the case in all countries included in the study. </p>
<p>The report notes Australia’s inquiry into a comprehensive revision of the Telecommunications (Interception and Access) Act 1979, and how the Parliamentary Joint Committee on Intelligence and Security, in June 2013, decided not to endorse the then Attorney-General’s <a href="http://www.crikey.com.au/2013/10/03/revealed-attorney-generals-drive-for-data-retention-law/?wpmp_switcher=mobile">proposal</a> for data retention on an increased scale.</p>
<h2>Where Australia’s law falls short</h2>
<p>The key issue discussed in relation to Australia is, however, the soon to be decided fate of the proposed <a href="http://www.alrc.gov.au/inquiries/invasions-privacy">new tort of serious invasion of privacy</a>. In discussion papers released in <a href="http://www.alrc.gov.au/publications/invasions-privacy-ip43">October 2013</a> and in <a href="http://www.alrc.gov.au/publications/serious-invasions-privacy-dp-80">March 2014</a>, the Australian Law Reform Commission discussed the possibility of introducing such a tort to complement the recently amended Privacy Act.</p>
<p>The proposal has widespread support and would help bring Australian privacy law in line with that of many other countries. However, the Snowden report concludes that:</p>
<blockquote>
<p>The new conservative government seems unlikely to implement the proposed privacy tort or give the Privacy Commissioner adequate resources. It also seems uninterested in reining in powers or activities of intelligence and law enforcement agencies, or considering risks and harm to individuals, businesses or the public interest from erosion of trust in communications confidentiality, IT security and privacy.</p>
</blockquote>
<p>The consequences of a failure to introduce a tort of serious invasion of privacy go way beyond the matters brought to light by Snowden. As I have pointed out <a href="http://www.thefreelibrary.com/Lawsuits+against+privacy+breach+can+help+counter+sexting,+says+Oz...-a0311456606">elsewhere</a>, Australian privacy law contains serious gaps, for example in relation to “sexting” situations. A carefully drafted tort such as that discussed would help a great deal. </p>
<p>The Australian Law Reform Commission is due to deliver its final report to the Attorney-General this month - we really should not miss this opportunity to improve Australia’s law.</p>
<p>The Australian correspondent of the Snowden report, Dr Jenny Ng, concludes that the higher profile of privacy and surveillance issues may in the longer term lead to improvements in legal privacy protection. However, this conclusion is accompanied by the statement that “the outcome is by no means certain, with the capacity for inhibiting stronger privacy laws ever present”.</p>
<p>More generally, one of the report’s most interesting observations relates to the international political responses to the disclosures made by Edward Snowden:</p>
<blockquote>
<p>While, for example, President Obama declared an interest in providing some protections for non-US persons, the protections themselves were marginal at best, and have so far failed to materialise. Indeed the available evidence indicates that the US administration has engaged in a global campaign to neutralise attempts by some governments to create reform of international security relationships.</p>
</blockquote>
<p>But the reverberations from the Snowden revelations go beyond the big stage of international politics. The report also notes changes in corporate attitudes:</p>
<blockquote>
<p>A significant number of corporations have responded to the disclosures by introducing a range of accountability and security measures (transparency reports, end-to-end encryption etc). Nonetheless, while acknowledging that these reforms are “a promising start”, nearly sixty percent of legal and IT professionals surveyed for this report believe that they do not go far enough, with more than a third of respondents reporting that they felt the measures were “little more than window dressing” or of “little value” outside the US.</p>
</blockquote>
<p>The concerns are obvious; any time we decide to entrust our personal information to a foreign business, for example through one of the many cloud computing providers, we are exposing ourselves to three-letter agencies such as the NSA. And if we decide only to trust Australian businesses, the weakness of the Privacy Act’s regulation of data transfers overseas will mean we are still at risk. Indeed, our own government shares our personal information with those overseas three-letter agencies. </p>
<p>So is the conclusion that we must surrender our interest in privacy? I do not think so. Rights, like our fundamental human right to privacy, are always of the greatest importance where they are most difficult to uphold. The right to water, for example, means a lot more to us in a desert than amongst lakes and rivers of fresh water. In the same way, the right to privacy is made even more important in the privacy-hostile environment we operate in online. </p><img src="https://counter.theconversation.com/content/27856/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dan Jerker B. Svantesson is an ARC Future Fellow (project number FT120100583) and receives funding from the Australian Research Council. The views expressed herein are those of the author and are not necessarily those of the Australian Research Council.</span></em></p>The revelations of NSA whistleblower Edward Snowden have altered the way we think about accountability, transparency and the rule of law with regard to both the activities of security agencies and the…Dan Jerker B. Svantesson, Co-Director Centre for Commercial Law, Bond UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/229322014-03-12T00:03:13Z2014-03-12T00:03:13ZPrivacy law is toothless without greater transparency<figure><img src="https://images.theconversation.com/files/43153/original/gqtgmwhq-1393994063.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">More formal decisions are required to shine a light on how privacy law is applied.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com">Shutterstock</a></span></figcaption></figure><p><em>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian business and consumers in a world rethinking privacy.</em></p>
<hr>
<p>After more than a quarter of a century, Australia’s federal Privacy Act remains opaque. </p>
<p>We are used to reading complex legislation through the lens of court decisions that tell us what it really means. But only two slight cases illuminate the Privacy Act after 25 years: <a href="http://www.austlii.edu.au/au/journals/PrivLawPRpr/2004/18.html">one</a> confirming that the Act really did mean that “any other person” could seek an injunction, and <a href="http://www.austlii.edu.au/au/cases/cth/AATA/2004/1221.html">one</a> saying the Commissioner’s approach to assessment of damages was wrong.</p>
<p>So the Commissioner issues his lore (guidelines etc), and law firms and academics peddle their own, but no-one knows enough reliable law - transparency gap #1. Why this odd situation has arisen needs explanation.</p>
<p>The Privacy Commissioner is about to get more powers to enforce the Act – civil penalties (fines) for serious or repeated breaches, enforceable undertakings, and powers to enforce Commissioner-initiated investigations – all of which are needed. But expanded powers will be of limited value unless they are used and their use is transparent: visible to individuals who want to use the Act to protect their rights, to companies and agencies who must respond to complaints, and (most important) to the lawyers, NGOs, advisers and intermediaries. </p>
<p>Unless all interested parties know the real tariff for breaching the Act (not just the formal possibilities), and all of the enforcement toolkit of the Act actually gets used, then there are no effective feedback loops to discourage breaches, encourage complaints, and encourage compliance. If this is to be achieved after the April empowerment, the Commissioner will have to lift his game on the transparency front.</p>
<p>Until now, the Commissioner’s one significant power has been to make “determinations” (formal decisions) under section 52, including compensation payments where appropriate. When made, these formal decisions are accompanied by a reasonably lengthy explanation of how the Commissioner applied the law, and the respondent is named. The problem is that the Commissioner completes over 1500 complaints per year (2012/13 figures) but doesn’t make any formal decisions: one in 2012/13, none since then. </p>
<p>Since the private sector came under the Act in 2001, Commissioners have only made seven such decisions, a bit more than half a formal decision per year - transparency gap #2. Most complaints are resolved through a mediated settlement or by being discontinued on one of the many grounds available.</p>
<h2>Decisions in the dark</h2>
<p>In the absence of court decisions, or formal Commissioner’s decisions, summaries of important or illustrative complaints are the best information available on how the Act is interpreted and applied. </p>
<p>From 2001-2011 Commissioners published on average 20 complaint summaries per year, but in January 2012 they suddenly stopped, and the Commissioner has published only one since - transparency gap #3. Five reports of investigations initiated by the Commissioner have been published in that time.</p>
<p>Next month, for the first time, dissatisfied complainants and respondents will obtain a right of appeal on the merits of their complaint. They can appeal to the Administrative Appeals Tribunal, against the Commissioner’s formal decisions under section 52. If appeals occur, AAT and court decisions will at last shine some light into corners of the Act. </p>
<p>The problem, based on the track record of all Commissioners for over a decade, is that only 0.5 persons per year will be able to consider appealing, because there are almost no decisions to appeal against. But surely a party can now demand that the Commissioner make a decision in their complaint, if they don’t like the way it is going, so they can then haul him into court and have reason prevail? Unfortunately not: successive Commissioners have insisted they won’t make decisions just because complainants demand them, and will dismiss complaints if they think “the respondent has dealt adequately with the complaint”, even though the complainant disagrees. Transparency gap #4 awaits.</p>
<h2>Shining a light on compensation</h2>
<p>Lawyers and consumer advocates don’t believe that anyone gets financial compensation as a result of privacy complaints - transparency gap #5. </p>
<p>In fact, by laboriously combining figures from different tables in Annual Reports, and making some assumptions about averages, it is possible to conservatively estimate that A$131,000 was paid in compensation to 45 complainants in 2012/13, an average of $2,911. </p>
<p>Depending on the year, about 3-5% of complaints result in compensation payments. These figures are unspectacular, but valuable to anyone wanting to understand how privacy law actually works. It is commendable that they are provided (most Privacy Commissioners don’t do so), but they need to be consolidated and more visible. Compensation is the second most frequent remedy arising from complaints, after apologies.</p>
<p>It is therefore a matter of hope, not expectation, that the Privacy Commissioner will use his new powers, that he will do so transparently, and that he will open the door to privacy law appeals.</p><img src="https://counter.theconversation.com/content/22932/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Graham Greenleaf does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian…Graham Greenleaf, Professor of Law and Information Systems, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/230152014-03-10T19:52:05Z2014-03-10T19:52:05ZMobile phone tracking: it’s not personal<figure><img src="https://images.theconversation.com/files/42415/original/r928jfwm-1393292572.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Will the Privacy Act protect you from being identified by your mobile phone address?</span> <span class="attribution"><a class="source" href="http://www.flickr.com/photos/grimsanto/6190478735/sizes/l/">Santos "Grim Santo" Gonzalez/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p><em>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian business and consumers in a world rethinking privacy.</em></p>
<hr>
<p>Mobile phone tracking techniques are becoming more commonplace. <a href="http://arstechnica.com/security/2013/08/no-this-isnt-a-scene-from-minority-report-this-trash-can-is-stalking-you/">Waste bins</a> target ads. <a href="http://www.abc.net.au/news/2013-08-29/use-of-phone-tracking-tech-in-shopping-centres-set-to-increase/4923298">Shopping centres</a> follow customers. <a href="http://www.cbc.ca/news/politics/csec-used-airport-wi-fi-to-track-canadian-travellers-edward-snowden-documents-1.2517881">Spooks</a> follow airport passengers. Will the Privacy Act’s new definition of personal information provide enhanced protections against mobile phone tracking? Not really. Here’s why.</p>
<h2>Defining what’s personal</h2>
<p>The Privacy Act covers personal information. Any information that is not personal information is not covered by the Act. Under the new definition, personal information is information: </p>
<blockquote>
<p>about an identified individual, or an individual who is reasonably identifiable. </p>
</blockquote>
<p>Information can therefore be personal information in two ways. The first is where information directly identifies an individual – what we normally think of as our “personal information”. In other words, the unique identifiers needed for our lives. Our name. Our email address. Our credit card number. </p>
<p>The second is where information does not directly identify an individual, but can be combined with other information to identify that individual. A residential address is a good case in point. </p>
<p>An address does not identify an individual directly. 742 Evergreen Terrace is not a unique identifier. However, 742 Evergreen Terrace can be used as a means to combine different pieces of non-personal information together to reveal the identity of an individual. 742 Evergreen Terrace + Duff Beer drinker + doughnut consumer + balding head + nuclear power plant worker = Homer Simpson. </p>
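The combination step above can be pictured as successive filtering over a record set. A toy sketch in Python (the records and attribute names are invented for illustration, not real data):

```python
# Toy illustration: no single attribute below identifies anyone on its
# own, but intersecting attributes can narrow a population to one person.
people = [
    {"name": "Homer",  "street": "742 Evergreen Tce", "beer": "Duff", "job": "nuclear"},
    {"name": "Ned",    "street": "744 Evergreen Tce", "beer": None,   "job": "retail"},
    {"name": "Barney", "street": "Moe's couch",       "beer": "Duff", "job": None},
]

def filter_by(records, **criteria):
    """Keep only records matching every given attribute."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

duff_drinkers = filter_by(people, beer="Duff")    # 2 candidates remain
matches = filter_by(duff_drinkers, job="nuclear") # 1 candidate remains
print([r["name"] for r in matches])  # ['Homer']
```

Each extra attribute is a further filter; once only one record survives, the individual has been "singled out" even though no filter was personal on its own.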
<p>Check it out yourself. Next time you log on to Facebook, try the <a href="https://www.facebook.com/about/graphsearch">Facebook Graph Search</a>. Enter a range of different “Likes”. You may have to play around but you can generally go from millions of individual Facebook users to one user with a small number of combinations. It’s an example of <a href="http://randomwalker.info/">Arvind Narayanan’s</a> <a href="http://33bits.org/about/">33 Bits of Entropy</a>: roughly 33 bits of information – with each fact about a person contributing some of those bits – are enough to single out any one individual in the world’s population. It is this “singling out” type of harm that is <a href="https://elaw.murdoch.edu.au/index.php/elawmurdoch/article/viewFile/41/15">central</a> to the definition of personal information.</p>
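The arithmetic behind the "33 bits" figure can be checked in a few lines (the round population figure is an illustrative assumption):

```python
import math

world_population = 8_000_000_000  # illustrative round figure

# Each independent yes/no fact about a person carries up to one bit of
# identifying information, roughly halving the pool of candidates.
bits_needed = math.ceil(math.log2(world_population))
print(bits_needed)  # 33

# Richer attributes carry several bits at once: an attribute shared by
# only 10,000 people contributes log2(8e9 / 10,000) ≈ 19.6 bits.
suburb_bits = math.log2(world_population / 10_000)
print(round(suburb_bits, 1))  # 19.6
```

So a handful of "non-personal" attributes, each worth a few bits, quickly adds up to the roughly 33 bits needed to isolate one person.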
<h2>The reasonable part</h2>
<p>Does that mean any piece of information could be personal information?
Potentially, yes – and that is problematic, because the Privacy Act is not designed to apply to all information. The definition gets around this problem through its “reasonable” element. Information will only be personal information if an individual is “reasonably identifiable”. </p>
<p>A reasonable identification refers to an organisation’s ability to combine information to identify an individual through “moderate steps” that lead to actual identification. In other words, identification is doable without too much trouble. </p>
<p>What is too much trouble will vary between different organisations. For example, Google’s moderate steps would be vastly different to most other organisations which do not have Google’s resources, skills and capacities.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/42306/original/v29vzss7-1393212144.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">UK company Renew used hi-tech bins to deliver targeted advertising.</span>
<span class="attribution"><span class="source">Tal Cohen/EPA</span></span>
</figcaption>
</figure>
<h2>Waste bins & information about individuals</h2>
<p>The new definition, like the old one, refers to information about individuals. However, actions that threaten privacy no longer just concern information “about” us. They now more readily concern information that “relates” to us. Mobile phone tracking is a case in point. Let’s look at those waste bins to find out why. </p>
<p>In 2013, <a href="http://renewlondon.com/">Renew</a>, a UK company found itself embroiled in a privacy scandal. The City of London installed waste bins provided by the company that broadcast video adverts. </p>
<p>Renew then went one step further. It created a network of sensors called <a href="http://renewlondon.com/tag/presence-orb/">Presence Orb</a> paid for by retailers that recorded the details of when a mobile phone’s medium access control (MAC) address came within the range of a sensor. </p>
<p>The MAC address is unique to the phone’s wi-fi network card. A MAC address can be changed with <a href="http://www.wired.com/gadgetlab/2008/01/mac-address-spo/">a degree of technical know-how</a> but it is generally viewed as a unique identifier. When the phone passed one of those bins, the bin recognised the MAC address and broadcast a video advert for the retail company: targeted ads via mobile phone tracking.</p>
<p>Would this be a privacy infringement in Australia? It is possible the collection of MAC address details would have been an unfair collection. But it first depends whether a MAC address would be classed as personal information. </p>
<p>MAC addresses are device identifiers. They are information about devices rather than individuals. They do not directly identify individuals. The issue therefore is whether an individual is reasonably identifiable from a MAC address. As highlighted above, this issue is inherently contextual. It depends on the circumstances of use so it is difficult to determine an answer without a more rigorous analysis of Presence Orb’s sensor network. </p>
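Renew has not published how Presence Orb stored the addresses it collected, but a common engineering approach is to hash the MAC rather than keep it raw. A minimal sketch (the function name and salt are invented for illustration) shows why hashing alone does not remove the "singling out" concern: the output is still a stable per-device identifier.

```python
import hashlib

def pseudonymise_mac(mac: str, salt: bytes) -> str:
    """Hash a MAC address under a network-wide salt.

    The raw hardware address is hidden, but the token is stable:
    the same phone yields the same token at every sensor, so the
    device can still be tracked across the whole network.
    """
    # Normalise formatting so AA:BB:... and aa-bb-... hash identically.
    canonical = mac.strip().lower().replace("-", ":")
    return hashlib.sha256(salt + canonical.encode()).hexdigest()[:16]

SALT = b"sensor-network-secret"  # invented for illustration

t1 = pseudonymise_mac("AA:BB:CC:DD:EE:FF", SALT)
t2 = pseudonymise_mac("aa-bb-cc-dd-ee-ff", SALT)  # same phone, same token
t3 = pseudonymise_mac("AA:BB:CC:DD:EE:00", SALT)  # different phone
print(t1 == t2, t1 == t3)  # True False
```

Whether such a token makes an individual "reasonably identifiable" would still depend on what other information the operator can combine it with.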
<p>This example highlights the general problem with information about individuals as a basis for defining personal information. It does not automatically include information about our devices that relates to us, especially in the lives we live today. The EU’s Article 29 Data Protection Working Party best <a href="http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp185_en.pdf">summarised the issue</a>:</p>
<blockquote>
<p>Smart mobile devices are inextricably linked to natural persons. There is usually direct and indirect identifiability.</p>
</blockquote>
<p>In other words, the link between our mobile phone and ourselves is such that information that relates to us (e.g. a mobile’s MAC address) has to be seen as information about us. The extent to which mobile phone tracking will be covered by the Privacy Act is unclear. It will primarily depend on whether an individual is reasonably identifiable from the collection and use of MAC address information.</p>
<p>A definition of personal information that incorporates information “about” individuals and reasonable identifiability still provides protections. But the Privacy Act’s new definition of personal information provides less flexibility when considering the privacy consequences of smart tracking technologies that will become more prevalent with the onset of the <a href="http://cccs.uq.edu.au/sensor-society">sensor society</a>. </p>
<p>The new definition of personal information is consequently a missed opportunity and more legal guidance will be required to clearly outline how the inextricable link between us and our devices will operate under the auspices of the Privacy Act.</p><img src="https://counter.theconversation.com/content/23015/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Mark Burdon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian…Mark Burdon, Lecturer, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/233672014-03-07T00:31:43Z2014-03-07T00:31:43ZWhen data privacy goes missing, will the regulators hear it cry?<figure><img src="https://images.theconversation.com/files/42309/original/mwgmtty3-1393212631.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Data breaches often go undiscovered for years.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com">www.shutterstock.com</a></span></figcaption></figure><p><em>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian business and consumers in a world rethinking privacy.</em></p>
<hr>
<p>Reporting a data breach that carries a “real risk of serious harm” could soon be mandatory, should Australian data breach reporting legislation be implemented.*</p>
<p>The proposed law puts organisations on notice that data privacy breaches are to be taken very seriously, with stiff penalties for non-compliance.</p>
<p>Having a warning triggered on the misuse of personal data is a key control in helping to assure your privacy in cyberspace. Raising the alert immediately, while not preventing the event itself, may mitigate its propagation. </p>
<p>All well and good, in theory at least. How practically this can be achieved in our highly connected and rapidly changing digital world is altogether another matter.</p>
<h2>The power of stealth</h2>
<p>Managing data breaches is no trivial task. According to a 2013 <a href="http://www.verizonenterprise.com/DBIR/2013/">report</a>, data breaches are often not discovered for months — or even years. This presents a real challenge for organisations where the breach may have occurred and the perpetrator has long since moved on. </p>
<p>Of greater relevance to mandatory data breach reporting is that the majority, close to 70%, of breaches were reported not by the organisations themselves, but by an external party.</p>
<p>The stellar cast of data breaches is impressive and seemingly never ending:</p>
<ul>
<li>On February 14 this year, media group <a href="http://news.cnet.com/8301-1009_3-57618945-83/syrian-electronic-army-hacks-forbes-steals-user-data/">Forbes</a> had more than a million names, email addresses, usernames, and passwords stolen by the Syrian Electronic Army;</li>
<li>On February 8 this year, <a href="http://www.cnbc.com/id/101401500">Barclays Bank</a> had 27,000 customer files containing names, addresses, passport numbers, and national insurance numbers, as well as information regarding health issues, insurance policies, mortgages, savings, and earnings leaked;</li>
<li>On February 5 this year, a US healthcare provider, <a href="http://www.st-joseph.org/body.cfm?id=804">St. Joseph Health System</a>, had 405,000 patient names, US Social Security numbers, dates of birth, addresses, and medical details, as well as an unknown amount of bank account information held on their server accessed by hackers;</li>
<li>US retailer <a href="https://theconversation.com/easy-target-the-shadow-hanging-over-online-retail-22035">Target</a> has now seen the data of at least 70 million customers compromised, including names, phone numbers, and email and mailing addresses;</li>
<li>Even the <a href="http://krebsonsecurity.com/2014/01/dhs-alerts-contractors-to-bank-data-theft/">US Department of Homeland Security</a> had 520 private documents and financial information belonging to at least 114 organisations extracted by an unauthorised party. Interestingly, this incident occurred in September 2013 and was only reported in January 2014, some four months later. </li>
</ul>
<p>Data breaches seem to be a fact of life.</p>
<h2>Effective, on paper at least</h2>
<p>The effectiveness of any legislation is based on considerations such as the deterrence factor, the actual protections afforded under the law and the practicalities of enforcing the law. </p>
<p>In the face of sophisticated and persistent cyber attacks, the protection offered by the legislation is limited, especially if an organisation was not aware of the attack having occurred. If the organisation that suffered a breach had in fact implemented, and was operating with, best-of-breed security measures and technologies, it is <a href="http://www.oaic.gov.au/privacy/applying-privacy-law/app-guidelines/">unlikely to be prosecuted</a>. A great “Get Out of Jail Free” card.</p>
<p>However, if the organisation “did not take reasonable steps to protect the personal information from unauthorised access” it <a href="http://www.oaic.gov.au/privacy/applying-privacy-law/app-guidelines/">may be in breach</a> of the legislation. In such instances, the interpretation of what constitutes “reasonable steps” may not be a simple exercise. </p>
<p>Cybercrime is a sophisticated, well-funded big business, and a constant threat.</p>
<p>The new legislation also presents a unique challenge for organisations with existing cloud arrangements, in that they are, for the most part, at the mercy of their provider’s willingness or ability to meet these new legal requirements. In the face of the new legislation, it is prudent to reassess your cloud provider’s security measures.</p>
<p>Add to this mix the challenges facing those organisations at war with their own IT departments or IT vendors. Legacy systems, poorly architected IT services built on fragmented technologies, inflexible IT supply contracts, and substandard business leadership and technology management practices are all hindering organisations’ ability to respond rapidly to the new legislative demands. </p>
<p>Moreover, the pervasive phenomenon of “<a href="http://rob-livingstone.com/2013/06/shadow-it-in-broad-daylight/">shadow IT</a>” is also a factor: individuals, local departments or business units within organisations implement IT systems without the appropriate due diligence, contributing to the risk of a potential data breach. </p>
<p>Both shadow IT and cybercrime escalate the risks of, and the challenges associated with, protecting sensitive data.</p>
<h2>Room for improvement</h2>
<p>In an era of financial austerity, organisations are keen to cut all unnecessary costs, and the lure of cutting the ongoing investment in information security is a constant trade-off, especially where they have no history of data breaches. It’s akin to an airline gradually reducing the maintenance effort of its fleet of aircraft because it has never had an accident yet. The question is which airline is carrying your personal data, and when it crashes, will you hear the explosion or will it disappear silently and without a trace into the digital Bermuda Triangle?</p>
<p>*This article has been updated to reflect the fact that mandatory data breach legislation has not yet been enacted in Australia.</p><img src="https://counter.theconversation.com/content/23367/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Rob Livingstone has no financial interests in, or affiliations with any organisation mentioned in this article. Other than his role at UTS, he is also the owner and principal of an independent Sydney based IT advisory practice.</span></em></p>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian…Rob Livingstone, Fellow of the Faculty of Engineering and Information Technology, University of Technology SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/218912014-03-06T19:32:40Z2014-03-06T19:32:40ZRedefining privacy in the age of Edward Snowden<figure><img src="https://images.theconversation.com/files/40259/original/tzk6fdws-1391129573.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">In the age of Edward Snowden regulators and the public are rethinking privacy.</span> <span class="attribution"><span class="source">Abode of Chaos/Flickr</span></span></figcaption></figure><p><em>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian business and consumers in a world rethinking privacy.</em></p>
<hr>
<p>If you are trying to make sense of privacy in the age of Edward <a href="https://theconversation.com/the-internet-after-snowden-what-now-20775">Snowden</a>, Facebook, <a href="https://theconversation.com/see-change-is-google-glass-all-its-cracked-up-to-be-13268">Google Glass</a>, <a href="https://theconversation.com/amazons-eyes-in-the-sky-and-pig-farmers-might-fly-21000">drones</a>, <a href="https://theconversation.com/social-rejection-why-snapchat-turned-down-facebooks-offer-20354">Snapchat</a>, genetic <a href="http://blogs.crikey.com.au/croakey/2014/01/30/the-new-world-of-big-health-data-some-questions-about-profits-privacy-and-the-public-interest/">profiling</a> and the Personally Controlled Electronic Health Record you could be forgiven for being confused. In 2014 the confusion isn’t going to go away. </p>
<p>That’s because people have different views of privacy, different priorities and get mixed messages from an increasingly complex patchwork of Commonwealth, state and territory law. Law is about sending messages rather than just about punishment.</p>
<p>On March 12 the amended national Privacy Act 1988 comes into effect. The Act covers all Australians but is weakened by exceptions. In the words of one of my students, it has “more holes than Swiss cheese”. </p>
<h2>Limits of the law</h2>
<p>The Act covers information privacy – in essence the creation and use of computer files – rather than all privacy. It doesn’t, for example, cover the increasingly prevalent workplace drug testing, police strip searches and nastiness such as covert private videos of your bedroom. </p>
<p>It is administered by the Office of the Australian Information Commissioner (<a href="http://www.oaic.gov.au">OAIC</a>), an agency that is seriously <a href="http://www.peteraclarke.com.au/2013/09/13/delays-in-dealing-with-complaints-by-privacy-commissioner/">under-resourced</a>. There are questions about its expertise and apparent permissiveness in dealing with big business and big government agencies. The OAIC has been softer than its peers in Europe, which are increasingly sending a strong <a href="https://theconversation.com/google-slapped-hard-in-europe-over-data-handling-10270">legal message</a> about privacy invasions by Google, Facebook and the NSA.</p>
<p>The Act permits any collection of information or invasion of privacy that is lawful. In the absence of constitutionally protected human rights, that “lawfulness” simply means whatever the government of the day can get through the parliament. That is convenient but results in complexity, confusion and omission. </p>
<h2>In good company?</h2>
<p>The Act sits alongside over 500 other Acts and provisions dealing with privacy. </p>
<p>Some are benign, such as protection of the <a href="https://theconversation.com/big-brothers-little-helper-why-people-say-no-to-the-census-7648">census</a> and tax records. There is a strong social good in people providing information to government. We could not enjoy the benefits of the welfare system and of the electronic payments system without providing some information to a wide range of agencies and businesses. We do so on the basis of trust, which public and private sector actors are tempted to abuse.</p>
<p>The value of other Acts depends on your perspective. Some critics for example regard <em>any</em> data collection by intelligence agencies as utterly abhorrent. Others, such as this author, <a href="https://theconversation.com/i-spy-you-spy-we-all-spy-but-is-it-legal-20540">recognise</a> the appropriateness of surveillance in particular circumstances. </p>
<p>As media consumers and increasingly media creators we are habituated to practices that disrespect the privacy of other people or that facilitate the disregard of our own privacy. That complicity fosters visceral responses by the commercial media to inquiries by <a href="https://theconversation.com/leveson-inquiry-into-uk-press-the-experts-respond-11082">Leveson</a> and <a href="https://theconversation.com/self-regulation-and-a-media-we-can-trust-6466">Finkelstein</a>. Media executives have for example rationalised egregious privacy abuses through claims that freedom of speech is more important than privacy as a freedom from interference. Claims that the public have a “right to know” or that all publication is “in the public interest” confuse public curiosity with public interest. What’s good for Channel 7 or News Corp is not necessarily good for you or me.</p>
<h2>Basic rights, not assumed</h2>
<p>Non-interference is a deeply traditional value, inherent in common law since the Middle Ages and in notions that an Englishman’s home is his castle. Regrettably it is not a value that seems to be acknowledged by the federal government in rhetoric about winding back law that erodes <a href="http://inside.org.au/the-brandis-agenda/">traditional freedoms</a>. One freedom, disregarded by creeping surveillance law, is the freedom to be left alone if you are in a private space and causing no harm.</p>
<p>This year will see the <a href="http://www.alrc.gov.au/publications/invasions-privacy-ip43">report</a> by the Australian Law Reform Commission about establishment of a <a href="https://theconversation.com/far-from-sinister-privacy-laws-might-mean-media-does-its-job-better-3326">privacy tort</a>, with scope for action by individuals whose privacy has been unlawfully invaded. The report follows strong recommendations by other commissions and parliamentary committees for a tort that would fix holes in the privacy patchwork and deal with technological challenges such as <a href="https://theconversation.com/expect-more-spy-drones-if-ag-gag-laws-introduced-18194">drones</a> and Google Glass.</p>
<p>If we are thinking about principles we need to consider potentially conflicting rights. There is no simple answer and we cannot magic away policy dilemmas in the style of one academic who dismissed privacy as something for woolly-minded members of the public who believe in Santa and <a href="http://barnoldlaw.blogspot.com.au/2013/02/bigthink.html">unicorns</a>. </p>
<p>Do you have a right to be free of interference? Do you have a right to know, a right that covers celebrities and your neighbours and your children and the wife of the Indonesian president? Do we need watchdogs with teeth and a willingness to go out in stormy weather? Should we leave privacy to Mr Gates, Mr Brin, Mr Snowden and Mr Zuckerberg?</p>
<p>We need to look at principles and have an informed community discussion about social goods rather than being driven by personal or bureaucratic convenience.</p><img src="https://counter.theconversation.com/content/21891/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bruce Baer Arnold does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>What does privacy mean in an age of ongoing privacy breaches? With new privacy law coming online in Australia on March 12, our Privacy in Practice series explores the practical challenges facing Australian…Bruce Baer Arnold, Assistant Professor, School of Law, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/215962013-12-18T02:28:15Z2013-12-18T02:28:15ZValue your privacy? Few Australian websites do<figure><img src="https://images.theconversation.com/files/38057/original/48ww8swm-1387322742.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">New privacy rules are coming, and Australian websites will have to smarten up their act.</span> <span class="attribution"><span class="source">http://heatherbuckley.co.uk</span></span></figcaption></figure><p>On March 12 2014, the way Australian organisations will have to handle online privacy is going to <a href="http://www.comlaw.gov.au/Series/C2012A00197">change significantly</a>. We investigated whether these organisations are ready and found in most cases, they’re nowhere near.</p>
<p>The new Australian Privacy Principles (<a href="http://www.oaic.gov.au/privacy/privacy-resources/privacy-fact-sheets/other/privacy-fact-sheet-17-australian-privacy-principles">APPs</a>) replace the current National Privacy Principles and the Information Privacy Principles. They cover organisations with more than A$3 million turnover and some others such as health care providers, including government (Commonwealth and ACT). They will mandate how these organisations have to deal with sensitive private information collected in the course of their activities. </p>
<p>To determine the level of compliance with the new principles, we at the <a href="http://www.canberra.edu.au/cis">Centre for Internet Safety</a> at the University of Canberra produced the <a href="http://www.canberra.edu.au/cis/storage/AOPI_FINAL.pdf">2013 Australian Online Privacy Index</a>. It benchmarks the public-facing privacy practices of the websites most visited by Australians. The index is an Australian first.</p>
<p>The privacy index lets consumers and regulators assess the privacy implications of interacting with popular websites. Businesses can also compare themselves with peers in their own sector, and how their sector fares against others. </p>
<p>Organisations will need to take reasonable steps to ensure personal information collected is properly protected. The majority of privacy policies we reviewed do not adequately articulate this.</p>
<p>The introduction of privacy principles means organisations will have to update their privacy policies and risk management protocols. If they combine the principles with best practices for responding to a data breach, they’ll need to have a cultural rethink in the collection, storage, use and dissemination of information which personally identifies customers.</p>
<p>Privacy policies need to clearly state how an organisation collects, uses, discloses, transfers, and stores such customer information.</p>
<p>Our index measures two aspects of an online organisation. First, we derived a score from the number and duration of tracking <a href="http://en.wikipedia.org/wiki/HTTP_cookie">cookies</a> dropped onto a computer visiting the homepage of the website. We used this to determine how intrusive an organisation is.</p>
<p>Tracking cookies are used by organisations to gather detailed profiles on individuals who visit websites. Profiles can include a real name, address, phone number or other identifiable information such as machine identity. </p>
<p>Analytical tools use this information, extrapolating the types of websites visited over days, weeks, months and even years. This information is extremely valuable for targeted marketing based on gender, approximate age, marital status, location, work, hobbies, health issues, political leanings and education.</p>
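<p>The cookie half of the score can be sketched in code. The snippet below is illustrative only: the <code>Set-Cookie</code> headers, cookie names and dates are hypothetical, not measurements from the index, and a real audit would capture the headers from an actual homepage visit.</p>

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Hypothetical Set-Cookie headers captured during a homepage visit.
# (Cookie names and dates are illustrative, not real measurements.)
set_cookie_headers = [
    "_track_id=abc123; Expires=Fri, 01 Jan 2016 00:00:00 GMT; Path=/",
    "session=xyz; Path=/",  # session cookie: gone when the browser closes
    "_ad_ref=42; Expires=Tue, 01 Apr 2014 00:00:00 GMT; Path=/",
]

def cookie_lifetimes(headers, now):
    """Lifetime in days of each persistent (potentially tracking) cookie."""
    lifetimes = []
    for header in headers:
        for attr in header.split(";"):
            name, _, value = attr.strip().partition("=")
            if name.lower() == "expires":
                lifetimes.append((parsedate_to_datetime(value) - now).days)
    return lifetimes

now = datetime(2013, 12, 1, tzinfo=timezone.utc)
days = cookie_lifetimes(set_cookie_headers, now)
# Two of the three cookies persist; their count and average lifetime
# are the raw inputs an intrusiveness score could be built from.
print(len(days), sum(days) / len(days))
```

<p>On this toy data, two persistent cookies survive for an average of just over a year; a site like the Harvey Norman example below, with dozens of cookies averaging 706 days, would score far worse on any such measure.</p>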
<p>The second score is based on an assessment of the published privacy policy of the company, including how that privacy policy addresses the requirements of the upcoming privacy principles. We also included a measure for readability, with best practice aiming for comprehension by 14-year-olds. We used these data to measure the organisation’s stated privacy intention.</p>
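<p>The index does not publish its readability formula, but a common proxy for “comprehension by 14-year-olds” is a Flesch-Kincaid grade level of roughly 8 to 9. A minimal sketch, using a deliberately rough vowel-group syllable heuristic:</p>

```python
import re

def count_syllables(word):
    """Rough heuristic: count vowel groups, discounting a silent final 'e'."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid grade level; about 8-9 matches a 14-year-old reader."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z'\u2019]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

<p>Short sentences of short words score low; the long, polysyllabic sentences typical of privacy policies push the grade level well beyond a 14-year-old’s reading age, which is exactly what a readability measure like this is meant to flag.</p>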
<p>As a whole, government websites scored the highest and were the most respectful of privacy. Banking and finance sites were the next best. </p>
<p>The vast majority of privacy policies are not compliant with the new privacy principles. The sites which did poorly – such as those of internet service providers <a href="http://www.tpg.com.au/">TPG</a> and <a href="http://www.iinet.net.au/home/">iiNet</a> – failed basic tests. Their policies had not been updated recently enough, and do not sufficiently state what information they are collecting or what they do with customer-identifying data. </p>
<p>Because of this lack of disclosure, consumers interacting online with these organisations are not fully informed about the amount of personal information being collected, or the reasons for collecting it. </p>
<p>The majority of websites analysed also did not stipulate if personal information was disclosed to overseas entities or given to third-party marketing companies (both are requirements under the new Privacy Act).</p>
<p>Australian websites also contained a large number of tracking cookies which had long expiry dates. <a href="http://www.harveynorman.com.au">Harvey Norman</a> had 43 tracking cookies with an average expiry of 706 days. While there is no mandate on such practices, tracking a user for over two years seems a little excessive. </p>
<p>Changes to the <a href="http://www.oaic.gov.au/privacy/privacy-act/the-privacy-act">Privacy Act</a> will give the <a href="http://www.oaic.gov.au/">Australian Information Commissioner</a> enhanced powers, including the ability to accept enforceable undertakings and seek civil penalties from individuals and organisations for a serious or repeated breach of privacy.</p>
<p>It is now time for organisations to start taking privacy more seriously.</p><img src="https://counter.theconversation.com/content/21596/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nigel Phair does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>On March 12 2014, the way Australian organisations will have to handle online privacy is going to change significantly. We investigated whether these organisations are ready and found in most cases, they’re…Nigel Phair, Director, Centre for Internet Safety, University of CanberraLicensed as Creative Commons – attribution, no derivatives.