tag:theconversation.com,2011:/ca/topics/social-media-platforms-56534/articlesSocial media platforms – The Conversation2024-03-21T06:12:11Ztag:theconversation.com,2011:article/2260212024-03-21T06:12:11Z2024-03-21T06:12:11ZSocial media apps have billions of ‘active users’. But what does that really mean?<figure><img src="https://images.theconversation.com/files/583295/original/file-20240321-26-3vpdrd.jpg?ixlib=rb-1.1.0&rect=628%2C519%2C4539%2C2925&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/group-of-people-standing-on-brown-floor-HN6uXG7GzTE">Creative Christians/Unsplash</a></span></figcaption></figure><p>Our digital world is bigger and more connected than ever. Social media isn’t just a daily habit – <a href="https://wearesocial.com/au/blog/2024/01/digital-2024-5-billion-social-media-users/">with more than 5 billion users globally</a>, it’s woven into the very fabric of our existence.</p>
<p>These platforms offer entertainment, connection, information and support, but they’re also battlegrounds for misinformation and online harassment. </p>
<p>Platforms like Facebook, YouTube, Instagram and TikTok vie for our attention, each boasting user counts in the billions. But what do these numbers actually tell us, and should we care?</p>
<h2>What is an active user or a unique user?</h2>
<p>Behind the impressive statistics lies a complex reality. While the number of social media users worldwide has hit the 5 billion mark, representing <a href="https://datareportal.com/reports/digital-2024-global-overview-report">about 62% of the world’s population</a>, these figures mask the intricacies of online participation.</p>
<p>In Australia, the average person juggles <a href="https://www.genroe.com/blog/social-media-statistics-australia/13492">nearly seven social media accounts</a> across multiple platforms. This challenges the assumption that user counts equate to unique individuals.</p>
<p>It is also important to differentiate between accounts and active users. Not every account represents genuine engagement with the platform’s community.</p>
<p>An “active user” is typically someone who has logged into a platform within a specific timeframe, such as the past month, indicating engagement with the platform’s content and features. They’re measured with analytics tools provided by the platform itself, or with third-party software. </p>
<p>The tools track the number of unique users – that is, individual accounts – who have interacted with or been exposed to specific content, whether a post, story or advertising campaign. </p>
<p>Social media companies use these metrics to showcase the potential reach of their platform to marketers. It’s key to their business model, as advertising revenue is typically their main source of income. </p>
<p>However, the reliability of these statistics is debatable. Factors such as <a href="https://www.dw.com/en/fact-check-how-do-i-spot-fake-social-media-accounts-bots-and-trolls/a-60313035">bot accounts</a>, inactive accounts and duplicates can inflate numbers, offering a distorted view of a platform’s user base.</p>
<p>Moreover, the criteria for an “active user” vary across platforms. This makes it difficult to make comparisons between user bases and to truly understand online audiences.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A person holding up a smartphone at a busy nightclub." src="https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/583298/original/file-20240321-22-ifb91e.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Sheer user numbers can make a social media platform influential, but there’s nuance in how we measure impact.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/a-person-taking-a-picture-with-a-cell-phone-D4kALj_9CEE">Michael Effendy/Unsplash</a></span>
</figcaption>
</figure>
<h2>User count doesn’t always equal relevance</h2>
<p><a href="https://datareportal.com/reports/digital-2024-global-overview-report">TikTok boasts a staggering 1.5 billion users globally</a>. This doesn’t even include users on its Chinese counterpart, Douyin. It is also often at the centre of <a href="https://theconversation.com/tiktok-has-a-startling-amount-of-sexual-content-and-its-way-too-easy-for-children-to-access-216114">controversies</a> and <a href="https://medium.com/datasociety-points/the-politics-and-optioncs-of-the-tiktok-ban-d88bdcb532d">geopolitical tensions</a>.</p>
<p>For example, <a href="https://theconversation.com/attempts-to-ban-tiktok-reveal-the-hypocrisy-of-politicians-already-struggling-to-relate-to-voters-225870">TikTok has repeatedly faced threats of bans</a> in significant markets such as the United States, raising questions about future access. But with such a vast user base, TikTok’s impact on culture and trends – particularly among young people – is clear and far-reaching.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/if-tiktok-is-banned-in-the-us-or-australia-how-might-the-company-or-china-respond-225889">If TikTok is banned in the US or Australia, how might the company – or China – respond?</a>
</strong>
</em>
</p>
<hr>
<p>However, the true impact of platforms is further muddied by algorithms – the complex formulas that dictate the content we see and engage with. Designed to keep us scrolling and interacting, they significantly shape our online experiences.</p>
<p>They also complicate how “active” a user might appear. Someone could seem more engaged simply because the algorithm promotes content they interact with more often.</p>
<p>So, while a high active-user count might indicate a platform’s popularity and reach, it doesn’t fully capture its influence or social relevance. True engagement goes beyond numbers, delving into the depth of user interaction, the quality of the content, and the cultural impact these platforms wield.</p>
<h2>Different strokes for different ages</h2>
<p>When we look at the users’ demographics, we see <a href="https://wearesocial.com/au/blog/2024/01/digital-2024-5-billion-social-media-users/">distinct preferences across age groups</a>. </p>
<p>Among the younger crowd, specifically Gen Z, <a href="https://wearesocial.com/au/blog/2024/01/digital-2024-5-billion-social-media-users/">TikTok vastly outpaces Instagram</a>, with <a href="https://explodingtopics.com/blog/tiktok-demographics">one in four of its users under the age of 20</a>.</p>
<p>Meanwhile, <a href="https://sproutsocial.com/insights/new-social-media-demographics/">Snapchat and Instagram</a> are the preferred platforms for people aged 18–29. </p>
<p>Facebook, with its massive user base of more than 3 billion and a <a href="https://datareportal.com/essential-facebook-stats">median user age of 32</a>, is the platform of choice for millennials, Gen X and boomers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ok-boomer-how-a-tiktok-meme-traces-the-rise-of-gen-z-political-consciousness-165811">'OK Boomer': how a TikTok meme traces the rise of Gen Z political consciousness</a>
</strong>
</em>
</p>
<hr>
<p>People in their 30s and older <a href="https://datareportal.com/reports/digital-2024-global-overview-report">tend to use LinkedIn</a> and X (formerly Twitter) more than platforms like Snapchat.</p>
<p>But these platforms vary in their primary focus, from news and professional connections (like LinkedIn) to entertainment (like TikTok).</p>
<p>This means demographic trends also reveal how each platform impacts users differently, catering to varied content preferences – whether it’s for entertainment, staying updated on news and events, or connecting with friends and family. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A group of women at a nice restaurant taking a selfie together." src="https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/583296/original/file-20240321-30-s182sm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ultimately, social media really is about community, not global relevance.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/3-women-smiling-and-standing-near-table-_3Pyr85zcE8">Rendy Novantino/Unsplash</a></span>
</figcaption>
</figure>
<h2>User count isn’t what matters</h2>
<p>For content creators and news media, delving into user statistics is crucial if they want to reach their target audiences.</p>
<p>However, despite headlines often focusing on vast user numbers, do these figures actually matter to the everyday social media user? <a href="https://apo.org.au/node/322860">Research I’ve done with colleagues</a> suggests they don’t.</p>
<p>For individuals navigating these digital spaces, it’s not about which platform boasts the highest user count and is therefore deemed “important”.</p>
<p>Instead, the focus is on maintaining connections within their social circles. This preference is rooted in cultural practices, meaning it aligns with the habits, preferences and values of their own community or cultural group.</p>
<p>In other words, people are drawn to social media platforms that are popular or widely accepted among their family, friends, social allies and broader cultural community. This suggests the essence of social media lies in the quality of interactions rather than the platform’s global standing.</p>
<p>Whether for staying informed, being entertained, or nurturing relationships, people gravitate to spaces where their community or “tribe” gathers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-hard-to-imagine-better-social-media-alternatives-but-scuttlebutt-shows-change-is-possible-190351">It's hard to imagine better social media alternatives, but Scuttlebutt shows change is possible</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Milovan Savic receives funding from Australian Research Council </span></em></p>Platforms like Facebook, Instagram and TikTok vie for our attention and boast billions of users. Ultimately, what matters is connection.Milovan Savic, Research Fellow, ARC Centre of Excellence for Automated Decision-Making and Society, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2177602023-11-27T13:41:34Z2023-11-27T13:41:34ZSupreme Court to consider giving First Amendment protections to social media posts<figure><img src="https://images.theconversation.com/files/560784/original/file-20231121-4426-i5zrwh.jpg?ixlib=rb-1.1.0&rect=0%2C22%2C3706%2C3084&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Citizens have sometimes been surprised to find public officials blocking people from viewing their social media feeds.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/businessman-standing-in-front-of-a-big-smart-royalty-free-illustration/1025098142">alashi/DigitalVision Vectors via Getty Images</a></span></figcaption></figure><p>The First Amendment does not protect messages posted on social media platforms. </p>
<p>The companies that own the platforms can – and do – remove, promote or limit the distribution of any posts <a href="https://www.freedomforum.org/free-speech-on-social-media/">according to corporate policies</a>. But all that might soon change.</p>
<p>The Supreme Court has agreed to <a href="https://www.nytimes.com/2023/10/31/opinion/social-media-supreme-court-democracy.html">hear five cases</a> during its current term, which ends in June 2024, that collectively give the court the opportunity to reexamine the nature of content moderation – the rules governing discussions on social media platforms such as Facebook and X, formerly known as Twitter – and the constitutional limitations on the government to affect speech on the platforms.</p>
<p>Content moderation, whether done manually by company employees or automatically by a platform’s software and algorithms, affects what viewers can see on a digital media page. Messages that are promoted garner greater viewership and greater interaction; those that are deprioritized or removed will obviously receive less attention. Content moderation policies reflect decisions by digital platforms about the relative value of posted messages.</p>
<p>As an attorney, <a href="https://lynngreenky.com/">professor</a> and author of a book about the <a href="https://press.uchicago.edu/ucp/books/book/distributed/W/bo156864042.html">boundaries of the First Amendment</a>, I believe that the constitutional challenges presented by these cases will give the court the occasion to advise government, corporations and users of interactive technologies what their rights and responsibilities are as communications technologies continue to evolve.</p>
<h2>Public forums</h2>
<p>In late October 2023, the Supreme Court heard oral arguments on two related cases in which both sets of plaintiffs argued that elected officials who use their social media accounts either exclusively or partially to promote their politics and policies <a href="https://www.nytimes.com/2023/04/24/us/elected-officials-social-media-supreme-court.html">cannot constitutionally block constituents</a> from posting comments on the officials’ pages.</p>
<p>In one of those cases, <a href="https://www.oyez.org/cases/2023/22-324">O’Connor-Radcliff v. Garnier</a>, two school board members from the Poway Unified School District in California blocked a set of parents – who frequently posted repetitive and critical comments on the board members’ Facebook and Twitter accounts – from viewing the board members’ accounts. </p>
<p>In the other case heard in October, <a href="https://www.oyez.org/cases/2023/22-611">Lindke v. Freed</a>, the city manager of Port Huron, Michigan, apparently angered by critical comments about a posted picture, blocked a constituent from viewing or posting on the manager’s Facebook page. </p>
<p>Courts have long held that public spaces, like parks and sidewalks, are public forums, which must <a href="https://www.oyez.org/cases/1900-1940/307us496">remain open to free and robust conversation and debate</a>, subject only to neutral rules <a href="https://firstamendment.mtsu.edu/article/time-place-and-manner-restrictions/">unrelated to the content of the speech expressed</a>. The silenced constituents in the current cases insisted that in a world where a lot of public discussion is conducted in interactive social media, digital spaces used by government representatives for <a href="https://www.nytimes.com/2023/04/24/us/elected-officials-social-media-supreme-court.html">communicating with their constituents</a> are also public forums and should be subject to the same First Amendment rules as their physical counterparts.</p>
<p>If the Supreme Court rules that public forums can be both physical and virtual, government officials will not be able to arbitrarily block users from viewing and responding to their content or remove constituent comments with which they disagree. On the other hand, if the Supreme Court rejects the plaintiffs’ argument, the only recourse for frustrated constituents will be to create competing social media spaces where they can criticize and argue at will.</p>
<h2>Content moderation as editorial choices</h2>
<p>Two other cases – <a href="https://www.oyez.org/cases/2023/22-555">NetChoice LLC v. Paxton</a> and <a href="https://www.oyez.org/cases/2023/22-277">Moody v. NetChoice LLC</a> – also relate to the question of how the government should regulate online discussions. <a href="https://perma.cc/YHK2-WVWS">Florida</a> and <a href="https://perma.cc/B2WU-M3CK">Texas</a> have both passed laws that modify the internal policies and algorithms of large social media platforms by regulating how the platforms can promote, demote or remove posts.</p>
<p>NetChoice, a tech industry trade group representing a <a href="https://netchoice.org/about/#association-members">wide range of social media platforms</a> and online businesses, including Meta, Amazon, Airbnb and TikTok, contends that the platforms are not public forums. The group says that the Florida and Texas legislation unconstitutionally restricts the social media companies’ First Amendment right to make their own <a href="https://www.oyez.org/cases/1973/73-797">editorial choices</a> about what appears on their sites.</p>
<p>In addition, NetChoice alleges that by limiting Facebook’s or X’s ability to rank, repress or even remove speech – whether manually or with algorithms – the Texas and Florida laws amount to government requirements that the <a href="https://www.oyez.org/cases/1994/94-749">platforms host speech they didn’t want to</a>, which is also unconstitutional. </p>
<p>NetChoice is asking the Supreme Court to rule the laws unconstitutional so that the platforms remain free to make their own independent choices regarding when, how and whether posts will remain available for view and comment.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man in a military uniform stands at a lectern looking out at a group of people sitting in chairs." src="https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/560786/original/file-20231121-15-1e40j1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 2021, U.S. Surgeon General Vivek Murthy declared misinformation on social media, especially about COVID-19 and vaccines, to be a public health threat.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/surgeon-general-vivek-murthy-and-white-house-press-news-photo/1328901388">Chip Somodevilla/Getty Images</a></span>
</figcaption>
</figure>
<h2>Censorship</h2>
<p>In an effort to reduce harmful speech that proliferates across the internet – speech that supports criminal and terrorist activity as well as misinformation and disinformation – the federal government has engaged in wide-ranging discussions with internet companies about their <a href="https://www.nytimes.com/2023/07/04/business/federal-judge-biden-social-media.html">content moderation policies</a>.</p>
<p>To that end, the Biden administration has regularly advised – <a href="https://www.nytimes.com/2023/07/04/business/federal-judge-biden-social-media.html">some say strong-armed</a> – social media platforms to deprioritize or remove posts the government had flagged as misleading, false or harmful. Some of the posts <a href="https://www.nytimes.com/2023/07/04/business/federal-judge-biden-social-media.html">related to misinformation</a> about COVID-19 vaccines or promoted human trafficking. On several occasions, officials suggested that platforms ban users who posted such material from making further posts. Sometimes, the corporate representatives themselves would ask the government what to do with a particular post.</p>
<p>While the public might be generally aware that content moderation policies exist, people are not always aware of how those policies affect the information to which they are exposed. Specifically, audiences have no way to measure how content moderation policies affect the marketplace of ideas or influence debate and discussion about public issues.</p>
<p>In <a href="https://www.scotusblog.com/case-files/cases/missouri-v-biden/">Missouri v. Biden</a>, the plaintiffs argue that government efforts to persuade social media platforms to publish or remove posts were so relentless and invasive that the moderation policies no longer reflected the companies’ own editorial choices. Rather, they argue, the policies were in reality government directives that effectively silenced – <a href="https://www.oyez.org/cases/1970/1873">and unconstitutionally censored</a> – speakers with whom the government disagreed. </p>
<p>The court’s decision in this case could have wide-ranging effects on the manner and methods of government efforts to influence the information that guides the public’s debates and decisions.</p>
<p class="fine-print"><em><span>Lynn Greenky does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The Supreme Court will hear five cases this term that will examine the nature of online discussion spaces run by social media platforms.Lynn Greenky, Professor Emeritus of Communication and Rhetorical Studies, Syracuse UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2166472023-11-02T14:23:30Z2023-11-02T14:23:30ZSocial media content in times of war: an expert guide on how to keep violence off your feeds<figure><img src="https://images.theconversation.com/files/556623/original/file-20231030-25-2np8f3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">There are some practical ways to filter the amount of violent and graphic content you see on social media.</span> <span class="attribution"><span class="source">bubaone</span></span></figcaption></figure><p>Social media platforms are a great source of information and entertainment. They also help us to maintain contact with friends and family. But social media can also – <a href="https://theconversation.com/mounting-research-documents-the-harmful-effects-of-social-media-use-on-mental-health-including-body-image-and-development-of-eating-disorders-206170">and has</a>, <a href="https://doi.org/10.1093/joc/jqab034">often</a> – become a toxic environment for spreading disinformation, hatred and conflict. </p>
<p>Most people can’t or don’t want to opt out of social media. Efforts by courts and <a href="https://foreignpolicy.com/2022/04/25/the-real-threat-to-social-media-is-europe/">state bodies</a> to regulate or control it are slowly catching up, but have so far been unsuccessful. And social media companies have a record of <a href="https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/">prioritising engagement</a> over social benefit.</p>
<p>Users are left with a dilemma: how to benefit from social media without exposing themselves to distressing, damaging or illegal content. This becomes even more of an issue in times of heightened global tension and conflict. Both the conflict in Ukraine and now the Gaza War have increased the risk of seeing <a href="https://www.npr.org/2023/10/24/1208165068/graphic-videos-and-images-of-the-israel-hamas-war-are-flooding-social-media">horrifying and damaging images</a> on one’s feed. </p>
<p>This article, based on <a href="https://orcid.org/0000-0001-5171-663X">my research</a> on news on social media, is a guide to curating and editing your social media feeds to ensure that the content you see is suited to your needs and is not offensive or disturbing. </p>
<p>It is organised into the broadest social media categories. I’m not covering newer services such as <a href="https://www.threads.net/login">Threads</a>, <a href="https://mastodon.social/explore">Mastodon</a>, <a href="https://post.news/feed">Post</a> and <a href="https://bsky.app/">Bluesky</a>, although the principles are generally applicable. I have focused on using these apps on a mobile phone, because that’s what <a href="https://www.pewresearch.org/global/2022/12/06/internet-smartphone-and-social-media-use-in-advanced-economies-2022/">the majority of users</a> do, rather than using them on a web browser. I am concentrating mostly on video content.</p>
<p>Social media can be a powerful tool for information and learning, but it is a flawed one. Whatever approach you take to managing your feeds, remain cautious and sceptical. Pay attention to updates to policies and user agreements and consider carefully who you trust and follow. </p>
<h2>Your choice or theirs?</h2>
<p>Many social networks offer an algorithmically selected feed as your first point of contact. The specifics of the algorithms are not publicly known and the companies refine them constantly. The feed is largely based on your location and the topics and people you have expressed an interest in previously (whether following, or simply having watched or interacted with the content). It may also include other information such as your age and gender, which you may have previously given the service. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/algorithms-are-moulding-and-shaping-our-politics-heres-how-to-avoid-being-gamed-201402">Algorithms are moulding and shaping our politics. Here's how to avoid being gamed</a>
</strong>
</em>
</p>
<hr>
<p>Organisations and individuals invest money and time in ensuring that their content will be seen. Advertisers will also pay to have their content shown to customers who meet their criteria. It is also important to remember that paid content does not just promote goods and services for sale; it may push a political or social agenda – often a hidden one. This is the basis of <a href="https://link.springer.com/article/10.1007/s13278-023-01028-5">fake news and deliberate misinformation</a>.</p>
<p>Here are a few ways to manage your social media feeds.</p>
<h2>Be careful who you follow</h2>
<p>On all networks except TikTok, the key is carefully selecting the people you follow.</p>
<p>On Twitter (X) the best option is to move away from the “for you” page (which is the default view) and focus on the “following” page. You can’t remove the “for you” page entirely. The “following” feed includes everyone you follow, their tweets and their retweets. </p>
<p>If an account you follow is posting content you don’t want to see, you can unfollow, block or mute it.</p>
<p>The simplest way to clean up your Facebook news feed is to “unfriend” accounts. Another option is to “unfollow” someone: you remain friends, they can see your content and engage with it, but their posts won’t appear in your feed unless they mention you or you seek it out. Or you can “take a break” from someone, which is a kind of temporary block. Blocking is the most extreme option. It will remove them and all of their content and hide all of yours from them.</p>
<p>Instagram offers similar options to unfollow and mute (similar to Facebook’s “take a break” option).</p>
<p>TikTok has only limited options for users to filter or curate their feeds. The “following” page only shows creators you are following (and ads), and it can’t be set as the default view.</p>
<p>The “for you” page is entirely algorithm driven. Tapping on a creator in the feed only allows you to follow them, not to hide or block them. You can, however, block specific users from their profile: tap the share icon, and “report” and “block” appear below the various share options. Blocking removes their content, but not other users’ content that features them.</p>
<h2>Explore your settings</h2>
<p>Many platforms have options for limiting violent or graphic content. On Facebook this is buried in the Settings menu. From there, click on News Feed, then Reduce. You can’t remove this content, but you can move it down in your feed. </p>
<p>On TikTok, long pressing on the screen brings up the options panel. From there you can report a video; there’s also a “not interested” option to remove that video and others with similar hashtags from your feed. If you click on “details” to see which hashtags will be filtered, you can select specific ones to block. It’s not clear how reliable this is, however – hashtags change over time. A number of hashtags apparently can’t be filtered, but it’s not clear what these are or why they can’t be filtered.</p>
<p>The “content preferences” option under “settings” allows you to filter video keywords. That removes them from your “for you” page, your “following” page, or both.</p>
<p>You can also set TikTok to “restricted mode”. This limits access to “unsuitable content” – an opaque description.</p>
<h2>User beware</h2>
<p>This is not a perfect guide, since social media is not designed to be controlled by the user. These companies’ business models are built on engagement: the more time you spend on their app, the more money they make. They’re not particularly interested in ensuring the content is helpful or accurate.</p>
<p class="fine-print"><em><span>Megan Knight does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Whatever approach you take to managing your feeds, remain cautious and sceptical.
Megan Knight, Associate Dean, University of Hertfordshire
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/2092202023-07-07T03:26:31Z2023-07-07T03:26:31ZWhy Meta’s Threads app is the biggest threat to Twitter yet<figure><img src="https://images.theconversation.com/files/536220/original/file-20230707-16210-rwkct.jpeg?ixlib=rb-1.1.0&rect=12%2C0%2C4007%2C3017&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The launch of social media app <a href="https://about.fb.com/news/2023/07/introducing-threads-new-app-text-sharing/">Threads</a> as a competitor to Twitter is a game-changer.</p>
<p>Meta, which also owns Facebook and Instagram, launched the new platform yesterday, ahead of schedule. Threads was welcomed almost immediately – especially by hordes of Twitter users who have watched in dismay as their beloved platform <a href="https://www.nbcnews.com/tech/tech-news/twitter-changes-tweetdeck-rate-limit-rcna92369">crumbles in the hands</a> of Elon Musk.</p>
<p>In less than 24 hours, Threads attracted some 30 million users. And with more than two billion Instagram users able to link their accounts to it directly, Threads’ user base will grow fast.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Post by @zuck saying 'Wow, 30 million sign ups as of this morning. Feels like the beginning of something special, but we've got a lot of work ahead to build out the app." src="https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=347&fit=crop&dpr=1 600w, https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=347&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=347&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=436&fit=crop&dpr=1 754w, https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=436&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/536201/original/file-20230707-19241-bpdfus.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=436&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Mark Zuckerberg posted on Threads to celebrate its 30 million new users.</span>
<span class="attribution"><span class="source">Threads</span></span>
</figcaption>
</figure>
<p>With its simple black and white feed, and features that let you reply, love, quote and comment on other people’s “threads”, the similarities between Threads and Twitter are obvious. </p>
<p>The question now is: will Threads be the one that finally unseats Twitter? </p>
<h2>We’ve been here before</h2>
<p>In October of last year, Twitter users looked on helplessly as Elon Musk became CEO. Mastodon was the first “escape plan”. But many found its decentralised servers <a href="https://www.makeuseof.com/why-people-leaving-mastodon">difficult</a> and <a href="https://www.newyorker.com/culture/infinite-scroll/what-fleeing-twitter-users-will-and-wont-find-on-mastodon">confusing to use</a>, with each one having very different content rules and communities. </p>
<p>Many Twitter fans created “backup” Mastodon accounts in case Twitter crashed, and waited to see what Musk would do next. The wait wasn’t long. Platform instability and outages became common as Musk started laying off Twitter staff (he has now fired about 80% of Twitter’s original workforce). </p>
<p>Shortly after, Musk horrified users and made headlines by upending Twitter’s verification system and forcing “blue tick” holders to pay for the privilege of authentication. This opened the door for account impersonations and the sharing of misinformation at scale. Some large corporate brands left the platform, taking their <a href="https://www.nytimes.com/2023/06/05/technology/twitter-ad-sales-musk.html">advertising dollars with them</a>. </p>
<p>Musk also labelled trusted news organisations such as the BBC as “state-owned” media, until public backlash forced him to retreat. More recently, he started limiting how many tweets users can view and announced that TweetDeck (a management tool for scheduling tweets) would be limited to paid accounts.</p>
<p>Twitter users have tried several alternatives, including Spoutible and Post. Bluesky, which came from Twitter co-founder Jack Dorsey, is gaining ground – but its growth has been limited due to its invitation-only registration process.</p>
<p>Nothing has quite captured the imagination of Twitter followers … until now.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Andrews: Everyone right to go? Albanese: Ready over here..." src="https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=425&fit=crop&dpr=1 600w, https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=425&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=425&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=534&fit=crop&dpr=1 754w, https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=534&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/536196/original/file-20230707-23-kmsj5c.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=534&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Threads has been joined by a number of popular figures, including Prime Minister Anthony Albanese, Oprah Winfrey, the Dalai Lama, Shakira, Gordon Ramsay and Ellen DeGeneres.</span>
<span class="attribution"><span class="source">Threads</span></span>
</figcaption>
</figure>
<h2>Community is the key to success</h2>
<p>Before Musk’s reign, Twitter enjoyed many years of success. It had long been a home for journalists, governments, academics and the public to share information on the key issues of the day. In emergencies, Twitter offered real-time support. During some of the worst disasters, users have shared information and <a href="https://www.washingtonpost.com/nation/2022/11/19/twitter-emergencies/">made life-saving decisions</a>. </p>
<p>While not without flaws – such as trolls, <a href="https://theconversation.com/bushfires-bots-and-arson-claims-australia-flung-in-the-global-disinformation-spotlight-129556">bots</a> and online abuse – Twitter’s verification process and the ability to block and report inappropriate content were central to its success in building a thriving community. </p>
<p>This is also what sets Threads apart from competitors. By linking Threads to Instagram, Meta has given itself a significant head-start towards reaching the critical mass of users needed to establish itself as a leading platform (a privilege Mastodon didn’t enjoy).</p>
<p>Not only can Threads users retain their usernames, they can also bring their Instagram followers with them. The ability to retain community in an app that provides a similar experience to Twitter is what makes Threads the biggest threat yet.</p>
<p>My research shows that people crave authority, authenticity and community the most when they engage with online information. In our <a href="https://books.emeraldinsight.com/book/detail/looking-for-information/?k=9781803824246">new book</a>, my co-authors Donald O. Case, Rebekah Willson and I explain how users search for information from sources they know and trust.</p>
<p>Twitter fans want an alternative platform with similar functionality, but most importantly they want to quickly find “their people”. They don’t want to have to rebuild their communities. This is likely why so many have stayed on Twitter, even as Musk seemingly does his best to run it into the ground.</p>
<h2>Challenges ahead</h2>
<p>Of course, Twitter users may also be concerned about jumping from the frying pan into the fire. Signing up to yet another Meta app comes with its own concerns.</p>
<p>New Threads users who read the fine print will note that their information will be used to “personalize ads and other experiences” across both platforms. And users have pointed out that you can only delete your Threads account by deleting your Instagram account. </p>
<p>This kind of entrenchment could be off-putting for some.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1676779150235803649&quot;}"></div></p>
<p>Moreover, Meta decided not to launch Threads anywhere in the European Union yesterday due to regulatory concerns. The EU’s new Digital Markets Act could raise challenges for Threads. </p>
<p>For example, the act sets out businesses can’t “track end users outside of [their] core platform service for the purpose of targeted advertising, without effective consent having been granted”. This may be in conflict with Threads’ <a href="https://help.instagram.com/515230437301944">privacy policy</a>. </p>
<p>Meta has also <a href="https://techcrunch.com/2023/07/05/adam-mosseri-says-metas-threads-app-wont-have-activitypub-support-at-launch/">announced plans</a> to eventually move Threads towards a decentralised infrastructure. In the app’s “How Threads Works” details, it says “future versions of Threads will work with the <a href="https://help.instagram.com/169559812696339">fediverse</a>”, enabling “people to follow and interact with each other on different platforms, including Mastodon”.</p>
<p>This means people will be able to view and interact with Threads content from non-Meta accounts, without needing to sign up to Threads. Using the ActivityPub standard (which enables decentralised interoperability between platforms), Threads could then function the same way as WordPress, Mastodon and email servers – wherein users of one server can interact with others. </p>
<p>When and how Threads achieves this plan for decentralised engagement – and how this might impact users’ experience – is unclear.</p>
<h2>Did Meta steal ‘trade secrets’?</h2>
<p>As for Musk, he’s not going down without a fight. Just hours after Threads’ release, Twitter’s lawyer Alex Spiro released a letter accusing Meta of “systematic” and “unlawful misappropriation” of trade secrets. </p>
<p>The <a href="https://cdn.sanity.io/files/ifn0l6bs/production/27109f01431939c8177d408d3c9848c3b46632cd.pdf">letter</a> alleges former Twitter employees hired by Meta were “deliberately assigned” to “develop, in a matter of months, Meta’s copycat ‘Threads’ app”. Meta has disputed these claims, <a href="https://www.cnbc.com/2023/07/06/twitter-accuses-meta-of-stealing-trade-secrets-for-its-new-threads-app.html">according to reports</a>, but the rivalry between the two companies seems far from over.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/thinking-of-breaking-up-with-twitter-heres-the-right-way-to-do-it-195002">Thinking of breaking up with Twitter? Here’s the right way to do it</a>
</strong>
</em>
</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=115&fit=crop&dpr=1 600w, https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=115&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=115&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=144&fit=crop&dpr=1 754w, https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=144&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/498128/original/file-20221129-22-imtnz0.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=144&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
</figcaption>
</figure>
<p><em>The Conversation is commissioning articles by academics across the world who are researching how society is being shaped by our digital interactions with each other. <a href="https://theconversation.com/uk/topics/social-media-and-society-125586">Read more here</a></em></p>
<p class="fine-print"><em><span>Lisa M. Given is a Fellow of the Academy of the Social Sciences in Australia. She receives funding from the Australian Research Council and the Social Sciences and Humanities Research Council of Canada.</span></em></p>
In the battle for Twitter’s followers, this may be the end game.
Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/2018412023-04-20T12:41:50Z2023-04-20T12:41:50ZAs digital activists, teens of color turn to social media to fight for a more just world<figure><img src="https://images.theconversation.com/files/521711/original/file-20230418-2610-d6yi11.jpg?ixlib=rb-1.1.0&rect=7%2C45%2C780%2C518&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Self-expression and storytelling are among the primary objectives that young aspiring activists seek to achieve online.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/friends-taking-selfies-or-filming-on-the-mobile-royalty-free-image/1387529364?phrase=black%20teens%20online&adppopup=true">FG Trade via Getty Images</a></span></figcaption></figure><p>When it comes to social media use among young people, very often the concern is about potential harm.</p>
<p>Parents, policymakers and others worry that online platforms like Instagram and TikTok may <a href="https://www.nbcnews.com/pop-culture/influencers-parents-posting-kids-online-privacy-security-concerns-rcna55318">compromise children’s privacy, threaten their safety</a>, <a href="https://doi.org/10.1080/02673843.2019.1590851">undermine their mental health</a> and make them susceptible to <a href="https://doi.org/10.1080/23727810.2020.1835420">social media addiction and cyberbullying</a>, among other problems.</p>
<p>Then there are the seemingly never-ending series of <a href="https://www.healthychildren.org/English/family-life/Media/Pages/Dangerous-Internet-Challenges.aspx">dangerous and deadly internet “challenges</a>” – such as the “<a href="https://www.womenshealthmag.com/health/a38603617/blackout-challenge-tiktok-2021/">blackout challenge</a>” and the “<a href="https://time.com/5189584/choking-game-pass-out-challenge/">choking game</a>” – that encourage kids and teens to record themselves performing perilous acts online.</p>
<p>While concerns about the potential pitfalls of social media platforms are valid and should be taken seriously, they can also overshadow some of the more positive ways that young people in general – and young people of color in particular – are using social media. As I found in my dissertation – “<a href="https://www.proquest.com/openview/af75dbf19e4903207be29025afacce5f/1?pq-origsite=gscholar&cbl=18750&diss=y">#OnlineLiteraciesMatter</a>” – some young people are using social media to develop their identities as activists and to push for a more just society. In short, they are using social media platforms to engage in what I refer to as “digitized activism,” taking on issues such as systemic racism and seeking racial justice.</p>
<p>My study adds to a growing body of research that has found young people of color can bring about change when they <a href="https://www.jstor.org/stable/26492573">learn to use digital tools to explore social issues</a> and use those tools to <a href="https://doi.org/10.1002/jaal.474">stand up for their beliefs</a>.</p>
<h2>Fighting online for social justice</h2>
<p>For my study, I followed six young activists between the ages of 14 and 18 across the United States. I recruited them online: I searched various hashtags to find them, then sent direct messages or left comments on their posts to engage with them.</p>
<p>Four of the teens identified as Black and two identified as Latina. I looked at their activism on platforms such as YouTube, Instagram, Twitter and TikTok. All of the young activists used at least one of those social media platforms for various lengths of time – from one to six years.</p>
<p>Each young person in my research represented a case study. I interviewed each one. I also created my own social media accounts to observe their social media posts and engage with them in the same online spaces. I examined their social media posts over a period of three months.</p>
<p>They often reacted to what was going on at the time of the study, which I conducted in 2021 after the <a href="https://www.nytimes.com/interactive/2020/07/03/us/george-floyd-protests-crowd-size.html">takeoff of the Black Lives Matter movement</a> in 2020. As a result, they were concerned with social justice, civil unrest, police brutality and a global pandemic. They were also concerned with increased hardships experienced by culturally and linguistically diverse communities, which often are disproportionately affected by these issues.</p>
<p>The young people in my study addressed a variety of subjects. Some of the subjects they took on could be seen through the hashtags they used, such as #systemicracism, #climatejustice and #mentalhealth.</p>
<h2>New narratives</h2>
<p>They also used social media to educate others through self-expression and to challenge what they saw as society’s negative views of young people. They placed a major emphasis on storytelling, as evidenced in hashtags such as #blackstoriesmatter, #teenwriter and #blackwriter. An overarching theme was a push for change. Their identities were reflected in hashtags such as #blackyouthvisionaries and #changemakers. They made clear that they see social media as a way to represent their values. </p>
<p>“Everything I do online is a reflection of the person I am, and I always want that image to be true to myself,” 18-year-old Laura told me in an interview. I used pseudonyms for all of the young people in my study. “Anyone who has been in a classroom or organization with me knows that I am outspoken and I always need to offer perspectives that I think are crucial to a discussion relating to social justice and I do the same online. Everything I post is a show of my values.”</p>
<p>Higher education appeared regularly in the young people’s self-expression and activism.</p>
<p>For instance, Samirah X., age 14, told me how she was inspired by the protests that followed the police killing of George Floyd to write a script for a movie called “You Change.”</p>
<p>“I take acting very seriously and enrolled in classes at a local community college – Introduction to Filmmaking, where I studied directors, and Screenwriting, where I learned basic screenwriting skills like formatting, developing characters, and their motives,” Samirah told me.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A young African American girl looks toward the camera as she sits at a laptop wearing a pair of blue headphones and a green headband." src="https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=429&fit=crop&dpr=1 600w, https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=429&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=429&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=539&fit=crop&dpr=1 754w, https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=539&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/521935/original/file-20230419-22-wsh7h2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=539&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Teens often turn to social media for creativity and self-expression.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/afro-american-girls-using-laptop-to-connect-with-royalty-free-image/1220557346?phrase=black%20girls%20social%20media&adppopup=true">marieclaudelemay via Getty Images</a></span>
</figcaption>
</figure>
<p>Laura, the 18-year-old, tweeted about how her posts about her college classes “are pretty insightful and really push my classmates to challenge their current ways of thinking and I’m really proud of myself for that.”</p>
<p>As young people of color, they stressed the need to infuse their concerns into broader causes that don’t always take communities of color into account.</p>
<p>“The climate justice movement cannot just be advocating for preservation of parks and saving endangered species. It must be Intersectional,” Laura wrote in an Instagram post. “We have to recognize that Black and brown communities worldwide are being disproportionately disadvantaged because of air and water pollution, food insecurity, and more.”</p>
<h2>What matters most</h2>
<p>Sometimes, they used simple statements to call attention to the issues they see as being of paramount concern.</p>
<p>One of the teens in my study wrote simply:</p>
<blockquote>
<p>My mental health matters</p>
<p>My representation matters</p>
<p>My music matters</p>
<p>My joy matters</p>
<p>My art matters</p>
<p>My future matters.</p>
</blockquote>
<p>The teens made clear that they believe in the urgency of taking action now.</p>
<p>“With this generation, we are not going to wait, if we are tired, we are going to work for it, if we want something to happen we will work on it,” 16-year-old Dakari wrote in a post on YouTube and Instagram. “Stubborn, we don’t want to wait until we are older to do stuff.”</p>
<p class="fine-print"><em><span>Dominique Skye McDaniel does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
For some teens on social media, TikTok and Twitter aren’t all about selfies or the latest craze in online “challenges.” Some teens are using social media to advocate for social justice.
Dominique Skye McDaniel, Assistant Professor of English Education, Kennesaw State University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/1948952022-11-21T21:43:38Z2022-11-21T21:43:38ZWhat Elon Musk’s destruction of Twitter tells us about the future of social media<figure><img src="https://images.theconversation.com/files/496484/original/file-20221121-14-5wan9c.jpg?ixlib=rb-1.1.0&rect=0%2C44%2C6000%2C3943&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Since its beginnings in 2006, Twitter has grown into one of the most important social networks in the world.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure>
<p>Elon Musk’s purchase of Twitter has been <a href="https://www.nbcnews.com/business/business-news/twitter-elon-musk-timeline-what-happened-so-far-rcna57532">a fast-moving disaster</a>. It has also created a tangible problem for journalists, politicians, activists and academic scholars: Where do we talk to each other if or when Twitter finally collapses or becomes unusable? </p>
<p>It’s a useful question. Contemplating life without Twitter pushes us to look beyond Twitter’s odious underbelly to consider what we liked about it. In doing so, it can help us understand better what social media is, for better and worse, and to consider what we want it to be. </p>
<h2>Twitter communities</h2>
<p>What I will miss about Twitter is its large scale and reach. It has become the default way for so many groups to communicate with each other and, because it’s basically just one big message board, across groups. </p>
<p>Social media companies regularly argue that this scale is why there is so much <a href="https://unesdoc.unesco.org/ark:/48223/pf0000379177">hate speech and disinformation on their networks</a>. As harmful as this speech may be, Twitter’s reach has nonetheless been a boon for, say, emerging researchers wanting to easily reach the largest number of their peers.</p>
<p>Smaller online communities are fantastic for any number of reasons. They allow members to share their interests and knowledge. Their smaller size makes them easier to moderate effectively. However, their smallness can also inhibit the serendipity of running into ideas that you wouldn’t otherwise see. </p>
<p>Furthermore, smaller online communities still depend on the benevolence of whoever happens to be in charge of the server. Twitter’s open design somewhat mitigates the formation of strict hierarchies among groups on the platform, although, as we’re learning, commercial social media still leaves us <a href="https://economictimes.indiatimes.com/magazines/panache/mass-firings-at-twitter-better-com-show-us-the-dark-side-of-digital-layoffs/articleshow/95400833.cms">subject to the owner’s whims</a>. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1593486310580555776&quot;}"></div></p>
<h2>The end of Twitter</h2>
<p>Thinking about where to go after Twitter also highlights that social media networks are not substitutes for each other. Well, they are for advertisers, who will go wherever the audience is. But people use different social media for different purposes. </p>
<p>As an academic, I find TikTok has nothing to offer me in terms of creating and sharing knowledge with my peers. The Twitter-like Mastodon may allow for <a href="https://fediscience.org/server-list.html">easier communication among colleagues</a>, but it lacks Twitter’s out-of-community reach.</p>
<p>That there is no equivalent substitute for Twitter highlights that there is a strong public interest in fostering public social media, to provide communities with stable communication infrastructure.</p>
<p>Relatedly, this debacle also confirms that advertising does not provide a sustainable business model for socially responsible social media. Twitter has <a href="https://www.barrons.com/news/can-twitter-become-more-profitable-under-elon-musk-01650998108">only turned a profit in two of its 16 years</a>. <a href="https://www.bloomberg.com/news/articles/2022-11-14/elon-musk-twitter-loses-balenciaga-as-advertisers-quit">Advertisers are currently abandoning Twitter</a> in the face of <a href="https://www.wired.com/story/twitters-moderation-system-is-in-tatters/">Musk’s content-moderation follies</a>, which, combined with his incompetence, could drive the company into bankruptcy. </p>
<p>Most important, however, its ad-based business model is based on the viral spread of content designed to engage our attention at any cost, be it bullying, harassment or hate speech. As journalism professor Yumi Wilson notes, “<a href="https://www.sfchronicle.com/bayarea/justinphillips/article/Elon-Musk-Twitter-17575946.php">Twitter was a scary place even before Elon</a>.”</p>
<h2>Life after Twitter</h2>
<p>All this suggests that we need to think seriously about how to move beyond ad-funded social media. Mastodon on its own <a href="https://theconversation.com/citizens-social-media-like-mastodon-can-provide-an-antidote-to-propaganda-and-disinformation-192491">offers a decentralized, community-based paradigm</a>. However, depending on the long-term commitment of volunteers and small operators is itself a recipe for instability.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/people-are-leaving-twitter-for-mastodon-but-are-they-ready-for-democratic-social-media-194220">People are leaving Twitter for Mastodon, but are they ready for democratic social media?</a>
</strong>
</em>
</p>
<hr>
<p>Much more interesting is the proposal that Mastodon-based services could be used by an arm’s length public agency like the CBC to <a href="https://theconversation.com/canadas-public-broadcaster-should-use-mastodon-to-provide-a-social-media-service-194116">publicly fund stable, well-run social media</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a phone screen showing the twitter logo of a silhouette of a white bird on a blue background sits atop a pile of money and a photograph of elon musk" src="https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/496482/original/file-20221121-18-u6ygc0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Social media platforms, like Twitter, rely on the attention economy to make profits.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Searchability</h2>
<p>Finally, we need to talk about search engines. Twitter is valuable in part because it allows individuals to broadcast easily to a large audience. Without large-scale social media, we’re <a href="https://www.theatlantic.com/technology/archive/2022/11/twitter-facebook-social-media-decline/672074/">back to the problem of how to discover other people’s work</a> and how to get your work in front of an audience.</p>
<p>Search engines have flown under the radar in our discussions about how platforms should be governed. If we want to reduce online platform power and make the best information easily locatable, we need to reconsider whether our current search engines are good enough. </p>
<p>There is cause for concern: <a href="https://www.fastcompany.com/90673924/its-not-just-you-google-search-really-is-getting-worse">Google’s gold-standard search engine has been “getting worse,”</a> in large part because the company has been clogging its results with advertising that makes it more difficult for users to find relevant information. Given that the big online <a href="https://www.cigionline.org/articles/platform-assumptions-are-a-choice-not-a-given/">platforms</a> continue to rely heavily on advertising revenues, this is a problem that will worsen.</p>
<p>Let’s not glorify Twitter. It is, in <a href="https://www.cbsnews.com/news/twitter-bad-news-spreads-study/">many ways</a> and for many people, a malevolent force. Even pre-Musk, it was a breeding ground for <a href="https://www.amnestyusa.org/press-releases/shocking-scale-of-abuse-on-twitter-against-women-politicians-in-india/">harassment</a>, particularly of women and individuals from marginalized groups. It can enable often life-ruining <a href="https://scott.mn/2022/10/29/twitter_features_mastodon_is_better_without/">bullying</a> and disproportionate <a href="https://yalereview.org/article/online-shaming-twitter-culture-tyson">public shaming</a> of otherwise private individuals, particularly through the <a href="https://mastodon.social/@Gargron/99662106175542726">quote-tweet function</a>. </p>
<p>Twitter has had a negative effect on the quality of our social discourse, serving as a <a href="https://www.scientificamerican.com/article/experts-grade-facebook-tiktok-twitter-youtube-on-readiness-to-handle-election-misinformation1/">conduit for mis-</a> and <a href="https://www.washingtonpost.com/technology/2022/10/27/civil-rights-2022-midterms/">disinformation</a>, designed to <a href="https://www.fastcompany.com/90665826/yale-researchers-say-social-medias-outrage-machine-has-the-biggest-influence-on-moderate-groups">encourage outrage</a> rather than substantive conversation. </p>
<p>As bad as it was — and is — <a href="https://youtu.be/2595abcvh2M">you don’t know what you got till it’s gone</a>. Twitter pre-Musk was no paradise, but Musk’s rampage allows us to see both the good and bad in social media as it currently exists. And, as a result, to consider what we want (and need) social media to be.</p>
<p><a href="https://theconversation.com/topics/social-media-and-society-125586" target="_blank"><img src="https://images.theconversation.com/files/479539/original/file-20220817-20-g5jxhm.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=144&fit=crop&dpr=1" width="100%"></a></p>
<p class="fine-print"><em><span>Blayne Haggart receives funding from the Social Sciences and Humanities Research Council of Canada. He is a Senior Fellow with the Centre for International Governance Innovation.</span></em></p>Elon Musk’s chaotic takeover of Twitter reveals what we like and need from social media.Blayne Haggart, Associate Professor of Political Science, Brock UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1940592022-11-08T02:50:06Z2022-11-08T02:50:06ZWhat is Mastodon, the ‘Twitter alternative’ people are flocking to? Here’s everything you need to know<figure><img src="https://images.theconversation.com/files/494017/original/file-20221108-25-bwjxqv.jpg?ixlib=rb-1.1.0&rect=100%2C271%2C5075%2C2956&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Koshiro K/Shutterstock</span></span></figcaption></figure><p>After Elon Musk closed his deal to buy Twitter on October 27 and fired the company’s top management soon after, <a href="https://theconversation.com/why-elon-musks-first-week-as-twitter-owner-has-users-flocking-elsewhere-193857">users have been reconsidering the platform</a>.</p>
<p>Hashtags #TwitterMigration and #TwitterExodus are gaining popularity, and the name most often mentioned alongside them is <a href="https://joinmastodon.org/">Mastodon</a> – the new home for fleeing tweeters.</p>
<p>In fact, Mastodon is not that new. It was launched in <a href="https://en.wikipedia.org/wiki/Mastodon_(software)">October 2016</a> by German software developer <a href="https://time.com/6229230/mastodon-eugen-rochko-interview/">Eugen Rochko</a>, spurred on by <a href="https://www.forbes.com/sites/rashishrivastava/2022/11/04/mastodon-isnt-a-replacement-for-twitterbut-it-has-rewards-of-its-own/?sh=434bcb7aa6eb">his dissatisfaction with Twitter</a> and his concerns over the platform’s centralised control. </p>
<p>After its <a href="https://www.theverge.com/2017/5/22/15535374/mastodon-spotlight-aftermath">15 minutes of fame in early 2017</a>, Mastodon’s growth slowed to a crawl.</p>
<p>Now, it’s on the upswing again – <a href="https://twitter.com/joinmastodon/status/1586525904997863427">more than 70,000</a> users joined the network the day after Musk’s Twitter deal was announced. At the time of writing, Mastodon has reached more than a <a href="https://mastodon.social/@Gargron/109300967725833789">million active users</a>, with almost half a million new users since October 27.</p>
<p>Meanwhile, Twitter was <a href="https://www.reuters.com/technology/exclusive-where-did-tweeters-go-twitter-is-losing-its-most-active-users-internal-2022-10-25/">losing its most active users</a> from its 238-million-strong user base even before Musk acquired the platform.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-twitters-blue-tick-a-status-symbol-or-id-badge-and-what-will-happen-if-anyone-can-buy-one-193856">Is Twitter's 'blue tick' a status symbol or ID badge? And what will happen if anyone can buy one?</a>
</strong>
</em>
</p>
<hr>
<h2>How difficult is it to sign up to Mastodon?</h2>
<p>Registering on the network takes a few minutes, just like any other social media app. However, Mastodon is not a Twitter clone – you need <a href="https://joinmastodon.org/servers">to choose a server to join</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Mastodon is not a single website. To use it, you need to make an account with a provider—we call them servers—that lets you connect with other people across Mastodon." src="https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=258&fit=crop&dpr=1 600w, https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=258&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=258&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=324&fit=crop&dpr=1 754w, https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=324&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/494007/original/file-20221108-17-n12318.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=324&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Screenshot of the Mastodon server selection page.</span>
<span class="attribution"><a class="source" href="https://joinmastodon.org/servers">Mastodon</a></span>
</figcaption>
</figure>
<p>Servers are grouped by topic and location, and are supposed to bring users together by common interest. The server is also where your account lives, so your account name will be nickname@server-name (more on this later).</p>
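<p>For the technically curious: Mastodon resolves these nickname@server handles using the standard WebFinger protocol, which maps a handle to a lookup URL on that server. The Python sketch below (with illustrative handle names, and simplified from what Mastodon actually does) shows the mapping:</p>

```python
def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a federated handle
    such as '@alice@mastodon.social' (an illustrative account)."""
    # Strip any leading '@', then split into nickname and server parts.
    nickname, server = handle.lstrip("@").split("@")
    return (f"https://{server}/.well-known/webfinger"
            f"?resource=acct:{nickname}@{server}")
```

<p>Calling <code>webfinger_url("@alice@mastodon.social")</code> yields the well-known WebFinger endpoint on mastodon.social, which other servers query to find the account.</p>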
<p>There are currently <a href="https://fediverse.party/en/mastodon/">just over 4,000 servers</a> to choose from. Some are closed for registration as they have reached capacity or simply prefer to keep their communities smaller. For example, <a href="https://www.theverge.com/2017/4/7/15183128/mastodon-open-source-twitter-clone-how-to-use">Mastodon’s flagship server</a> mastodon.social is not currently accepting new members.</p>
<p>After you register by joining your chosen server, the interface looks somewhat similar to Twitter, with short posts (up to 500 characters by default) called “toots” instead of “tweets”. Given the recent spike in popularity, the app can be slow to respond, as some servers are experiencing heavy loads.</p>
<p>For those looking for a relatively seamless transition without losing their online community, there is a Twitter migration toolkit for <a href="https://fedifinder.glitch.me/">finding your followers and follows</a> on Mastodon.</p>
<p>There is also a tool that allows you to <a href="https://crossposter.masto.donte.com.br/">cross-post</a> between the two.</p>
<h2>Okay, so why does Mastodon have servers?</h2>
<p>Mastodon isn’t a platform, but a decentralised network of servers. This means no central authority owns and governs the entire communications network (the opposite of Twitter, where a single owner like Musk can change how the platform operates at any moment).</p>
<p>When you join a server, what you post is visible within that particular server. To an extent, your content can also be seen across the Mastodon network, depending on other servers’ policies being compatible with the one you joined.</p>
<p>This is in stark contrast to Twitter, where everything you tweet is available to all Twitter users, unless your account is protected for followers only.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/social-media-has-huge-problems-with-free-speech-and-moderation-could-decentralised-platforms-fix-this-157053">Social media has huge problems with free speech and moderation. Could decentralised platforms fix this?</a>
</strong>
</em>
</p>
<hr>
<p>The point of selecting a server on Mastodon is to let you communicate in an environment with policies you prefer and a community you like. Each server can have its own code of conduct and moderation policies. Individual server admins can also ban users and other servers from accessing their content and posting.</p>
<p>Furthermore, all servers form part of an interconnected network called the <a href="https://en.wikipedia.org/wiki/Fediverse#Fediverse_software_platforms">fediverse</a>. The fediverse can comprise any social media app that uses the same decentralised principles as Mastodon. That means users within the fediverse could potentially follow each other across servers. </p>
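<p>For the technically minded, the server-level moderation described above can be pictured as each server keeping its own blocklist of domains it refuses to federate with. A toy Python model (hypothetical names, not Mastodon’s actual implementation):</p>

```python
class Server:
    """Toy model of server-level ('defederation') moderation."""

    def __init__(self, name: str):
        self.name = name
        self.blocked_domains: set[str] = set()  # domains this server defederates from

    def defederate(self, domain: str) -> None:
        """Admins can ban an entire server, hiding all its content here."""
        self.blocked_domains.add(domain)

    def shows_posts_from(self, origin_domain: str) -> bool:
        """Posts are visible unless their origin server has been banned."""
        return origin_domain not in self.blocked_domains
```

<p>This is how Gab was effectively cut off: many independent servers each added its domain to their own blocklists, with no central direction required.</p>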
<h2>Is Mastodon safe? What about moderation?</h2>
<p>In principle, <a href="https://policyreview.info/concepts/decentralisation">decentralisation</a> can ensure greater freedom of speech, one of the main concerns users have about Twitter’s future.</p>
<p>Twitter provides content through opaque <a href="https://www.forbes.com/sites/enriquedans/2020/10/03/biased-algorithms-does-anybody-believe-twitter-isracist/?sh=1cc0f0df8466">AI-based algorithms</a> that select what you see on your feed. Mastodon shows posts in chronological order without curation. </p>
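<p>The difference is easy to illustrate: a chronological timeline is just a sort by timestamp, with no engagement-based ranking. A simplified Python sketch (not either platform’s real code):</p>

```python
from datetime import datetime, timezone

def chronological_feed(posts: list[dict]) -> list[dict]:
    """Newest-first ordering by creation time, with no curation
    or engagement-based re-ranking."""
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)
```

<p>An algorithmic feed, by contrast, would replace the sort key with an opaque, engagement-driven score.</p>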
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/citizens-social-media-like-mastodon-can-provide-an-antidote-to-propaganda-and-disinformation-192491">Citizens' social media, like Mastodon, can provide an antidote to propaganda and disinformation</a>
</strong>
</em>
</p>
<hr>
<p>You might be worried that if there is no central authority, it will be complete chaos, with people posting dangerous and offensive content. </p>
<p>However, thanks to community moderation, most servers hold users to a high standard, and can easily ban or filter hate speech, illegal content, racism, discrimination against marginalised groups, and more. In 2017, Vice journalist Sarah Jeong even called it “<a href="https://www.vice.com/en/article/783akg/mastodon-is-like-twitter-without-nazis-so-why-are-we-not-using-it">Twitter without Nazis</a>”.</p>
<p>Community moderation has shown its force in practice: when the far-right platform Gab <a href="https://www.theverge.com/2019/7/12/20691957/mastodon-decentralized-social-network-gab-migration-fediverse-app-blocking">moved to Mastodon in 2019</a>, many servers across the network banned it without any central direction. While it might still be using Mastodon code, Gab <a href="https://news.ycombinator.com/item?id=25714010">doesn’t appear to be part of</a> the fediverse any more.</p>
<h2>Is Mastodon the new Twitter?</h2>
<p>All in all, Mastodon is neither a replacement for Twitter nor a decentralised replica of it – the presence of individual servers makes it fundamentally different to any social media platform.</p>
<p>As an open-source, decentralised network, Mastodon appeals to young, tech-savvy users, and it will not come as a surprise if many of them find Mastodon to be a welcome upgrade from Twitter.</p>
<p>Additionally, users seeking freedom of speech and worried about censorship by a central authority could be another group to find a new home there. For now, it’s too soon to tell which user groups will become the most active, and how large Mastodon will become.</p>
<p class="fine-print"><em><span>Nataliya Ilyushina receives funding from the ARC Centre of Excellence.</span></em></p>Thousands of Twitter users are jumping ship – and Mastodon might become their new home. But it’s not a clone of the ‘blue bird site’.Nataliya Ilyushina, Research Fellow, RMIT UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1870482022-08-22T18:19:08Z2022-08-22T18:19:08Z‘Digilantism,’ ‘hackbacks’ and mutual aid are used by online activists to fight trolls<figure><img src="https://images.theconversation.com/files/479424/original/file-20220816-12125-wpw8xu.jpg?ixlib=rb-1.1.0&rect=43%2C0%2C4179%2C2896&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The Black Lives Matter movement began as a hashtag started by Black women in the United States, and grew into a global protest.</span> <span class="attribution"><span class="source">(AP Photo/Frank Augstein, File)</span></span></figcaption></figure><p>On Aug. 5, 2022, digital trans activist <a href="https://www.cbc.ca/news/canada/london/trans-twitch-star-arrested-at-gunpoint-fears-for-life-after-someone-sent-police-to-her-london-ont-home-1.6546015">Clara Sorrenti found herself arrested at gunpoint</a> at her home in London, Ont. <a href="https://globalnews.ca/news/9069338/london-police-swatting-twitch-streamer-clara-sorrenti-keffals/">Anti-trans trolls had falsely reported</a> she had killed her mother and was planning a shooting at city hall.</p>
<p>Sorrenti had been swatted.</p>
<p>Swatting involves calling 911 to falsely report a high-risk emergency at a victim’s home, triggering deployment of a SWAT team. In some swatting cases, victims have <a href="https://www.washingtonpost.com/nation/2019/03/29/prankster-sentenced-years-fake-call-that-led-police-kill-an-innocent-man/">died at the hands of police</a>.</p>
<p>Sorrenti’s experience is consistent with my findings in <a href="https://www.ubcpress.ca/transformative-media">long-term research with intersectional global media activists</a>. </p>
<p>She is a new type of intersectional digital activist. These activists work on intersectional issues, drawing connections between systems of oppression including race, gender, sexuality, and so on. And a great deal of their activism takes place online. </p>
<p>Digital campaigns such as #MeToo and #BlackLivesMatter have been successful partially because <a href="https://doi.org/10.1111/2041-9066.12021">young women</a>, <a href="https://doi.org/10.1111/amet.12112">Black people</a> <a href="https://doi.org/10.1007/s10560-018-0577-x">and LGBTQ+ people</a> are the power users of social media — they are online more often and are particularly adept at using social networks.</p>
<p>But despite successes in social justice campaigns, intersectional activists are increasingly at risk — both online and off.</p>
<h2>The emotional tax</h2>
<p>The online trolling and offline swatting of Sorrenti illustrate how <a href="https://doi.org/10.1177/1350506818765318">intersectional activists face an emotional tax</a> — <a href="https://www.amnesty.org/en/latest/research/2018/03/online-violence-against-women-chapter-1-1/">emotional stress over and above everyday norms</a> — mostly from dealing with <a href="https://digitalcommons.schulichlaw.dal.ca/cjlt/vol19/iss2/2/s">violent attacks by online trolls</a>.</p>
<p>Intersectional activists are also doxxed at higher rates, meaning <a href="https://doi.org/10.1080/17440572.2019.1591952">personal information is dumped online</a>, such as their address, phone number or workplace. Sorrenti’s swatting is a textbook example — there are ongoing emotional impacts of her doxxing, including confronting transphobic police behaviours such as using her deadname (<a href="https://www.healthline.com/health/transgender/deadnaming">the name used before transitioning</a>) and incorrect gender.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/NQQ_eFIabFI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Global News reports on the swatting of activist Clara Sorrenti, who was arrested at gunpoint.</span></figcaption>
</figure>
<h2>Bias in the technology</h2>
<p>A deeper problem is that internet <a href="https://www.codedbias.com/">users are not all treated equally by the internet’s technical codes</a>.</p>
<p>Research has repeatedly demonstrated that <a href="https://nyupress.org/9781479837243/algorithms-of-oppression/">algorithms — the computer codes that program the internet — are biased</a>. </p>
<p>Algorithms and the big data that drives them are often <a href="https://ijoc.org/index.php/ijoc/article/view/6182">racist</a>, <a href="https://carolinecriadoperez.com/book/invisible-women/">gendered</a> or <a href="https://www.gaytascience.com/transphobic-algorithms/">transphobic</a>.</p>
<h2>Made invisible</h2>
<p>One type of algorithmic bias is shadowbanning, which happens when a platform <a href="https://doi.org/10.1002/poi3.287">limits the visibility of specific users</a> without outright banning them. Activists have noted that social media content about intersectional issues is often shadowbanned. </p>
<p>For example, on May 5, 2021 — Red Dress Day in Canada — almost all posts on Instagram <a href="https://www.cbc.ca/news/indigenous/instagram-stories-vanish-mmiwg-red-dress-day-1.6017113">related to missing and murdered Indigenous women disappeared</a>. Instagram claimed it was a “technical issue,” whereas users claimed it was a shadowbanning of intersectional female, Indigenous activist content. But <a href="https://doi.org/10.1177/01634437221077174">shadowbanning is often difficult to prove</a>.</p>
<p>There is also evidence that the popular video-hosting platform TikTok has <a href="https://www.bbc.com/news/technology-54102575">shadowbanned intersectional LGBTQ+, disability, size activism and anti-racist content</a>.</p>
<p>Algorithmic bias and shadowbanning of marginalized users can make intersectional activists feel invisible, with their posts facing challenges to achieve the virality crucial to activist campaigns.</p>
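<p>Mechanically, shadowbanning is a quiet filter: the author still sees their own posts, but they are dropped from public and discovery feeds. A simplified Python sketch of the idea (illustrative only, not any platform’s actual system):</p>

```python
def public_timeline(posts: list[dict], shadowbanned: set[str]) -> list[dict]:
    """Posts by shadowbanned authors are silently excluded from
    public and discovery feeds, while remaining visible to the
    authors themselves -- which is why the practice is hard to prove."""
    return [p for p in posts if p["author"] not in shadowbanned]
```
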
<h2>Response strategies</h2>
<p>One tactic activists have used to address intersectionality online is to create a “breakaway hashtag.” The <a href="https://www.mdpi.com/2075-471X/7/2/21">#MeToo movement</a> is a powerful example of hashtag activism that drew global attention to sexual harassment and abuse. However, for <a href="https://us.macmillan.com/books/9780374536657/headscarvesandhymens">Egyptian-American writer Mona Eltahawy</a>, #MeToo did not feel like the right space for her as a Muslim woman. She created <a href="https://time.com/5170236/mona-eltahawy-mosquemetoo/">#MosqueMeToo to draw attention to sexual assault in the Muslim community</a>, focusing on the intersectional context of gender, Islamophobia and racism. </p>
<p>Breakaway hashtags like #MosqueMeToo add intersectional dimensions to the premise of a mainstream hashtag, both relying on the original hashtag’s virality and challenging its limitations.</p>
<h2>Digilante justice</h2>
<p>Young feminist women who are trolled online use the tactic of “<a href="https://doi.org/10.1177/2056305117705996">digilante justice</a>,” or “digilantism,” which involves using digital means to fight for justice, in this case against trolls. They learn how to hack social media platforms to <a href="https://doi.org/10.1177/1350506818765318">reveal the identities of trolls and confront them in real life</a>. Activists have also excluded trolls from their personal social networks through “hackback” tactics, which are hacker tactics used against hackers.</p>
<p>In another example, feminist software developer Randi Harper was intensely trolled by misogynists in an incident known as <a href="https://www.igi-global.com/chapter/the-gamergate-files/153208">GamerGate</a>. In response, Harper developed <a href="https://github.com/freebsdgirl/ggautoblocker">Good Game Auto Blocker (ggautoblocker)</a>, which blocks users who follow misogynist Twitter accounts, the digital equivalent of walking out of a room when someone spews hateful speech.</p>
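<p>The core idea behind a tool like ggautoblocker can be sketched in a few lines: flag any account that follows several accounts on a small seed blocklist. The Python below is a simplified, hypothetical version of that follower-graph heuristic, not ggautoblocker’s actual code or criteria:</p>

```python
def autoblock(follows: dict[str, set[str]], seeds: set[str],
              threshold: int = 2) -> set[str]:
    """Flag accounts that follow at least `threshold` accounts
    on the seed blocklist (a hypothetical simplification)."""
    return {user for user, followed in follows.items()
            if len(followed & seeds) >= threshold}
```

<p>Following a single blocklisted account is not enough to be flagged, which reduces false positives for people who follow such accounts merely to monitor them.</p>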
<h2>Digital solidarity</h2>
<p>Digital activists understand that social media platforms are designed for the capitalist exploitation of content and data produced by everyday users. Countering this, intersectional hacktivists (hacker activists) have <a href="https://mitpress.mit.edu/9780262043458/">designed technologies</a> for solidarity rather than exploitation. </p>
<p>For example, activists in Athens <a href="https://doi.org/10.1145/3025453.3025490">designed an app to share text message costs</a> so media activists within a group would not have to foot the whole bill. The program itself was designed with sharing in mind, illustrating that technologies do not have to be exploitative.</p>
<p>Intersectional activists <a href="https://ijoc.org/index.php/ijoc/article/view/15766/3424">aim to empower both givers and receivers</a> of support, acknowledging that all citizens play both roles, sometimes needing support and other times contributing it. This is sometimes called mutual aid.</p>
<p>Digital mutual aid can take place through <a href="https://www.interfacejournal.net/2018/12/interface-volume-10-issues-1-2-open-issue/">mentorship and skillshare workshops</a> that might teach new marginalized activists how to code computers, promote social media posts, produce radio shows or write media releases. Workshops are conducted by individuals sharing some aspect of their identities with participants to create a safer space through a shared experience of lived oppression.</p>
<p>Digital solidarity and mutual aid are important strategies of support and care that can work toward countering the negative emotional tax of being trolled, doxxed, shadowbanned or subjected to algorithmic bias.</p>
<h2>More work to be done</h2>
<p>Beyond intersectional digital activism, more work needs to be done by the tech industry, police services and broader social movements to eliminate the colonialism, racism, sexism and transphobia of online interactions and the devastating offline impacts they can have in people’s everyday lives. </p>
<p>This work is important to a well-functioning, inclusive and diverse democracy, as it aims to ensure that online participation is available equally — and safely — to all citizens.</p>
<p class="fine-print"><em><span>Sandra Jeppesen receives funding from Social Sciences and Humanities Research Council of Canada</span></em></p>Digital activists are targeted for their work on intersectional issues. But they have developed strategies to deal with online and offline hate.Sandra Jeppesen, Professor of Media, Film, and Communications, Lakehead UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1822712022-05-04T14:17:39Z2022-05-04T14:17:39ZElon Musk’s proposed takeover of Twitter raises questions about its role in the digital social infrastructure<figure><img src="https://images.theconversation.com/files/461119/original/file-20220503-12-rp8d5l.jpg?ixlib=rb-1.1.0&rect=0%2C125%2C4912%2C6992&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Internet technologies have meant that the public sphere has now become digital, but what does that mean for its ownership?</span> <span class="attribution"><span class="source">(Gian Cescon/Unsplash)</span></span></figcaption></figure><p>Over the last few weeks, there has been <a href="https://www.economist.com/business/2022/04/30/elon-musk-is-taking-twitters-public-square-private">a lot of talk of the public square</a> fuelled by Elon Musk’s recent proposed takeover of Twitter. Many have <a href="https://www.theguardian.com/commentisfree/2022/may/01/elon-twitter-is-not-the-town-square-its-just-a-private-shop-square-belongs-to-us-all">balked at the idea that a billionaire would entirely control another one of the world’s important social networks</a>, one that has been adopted by academics and politicians as a choice venue for public debates. </p>
<p>But what is the public square, and what can we do to save it?</p>
<h2>Squares and spheres</h2>
<p>The concept of the public square has a rich history in communications and technology studies. Historically, <a href="https://www.merriam-webster.com/dictionary/public%20square">the public square was a central location</a> where townspeople could gather and debate issues of the day. Each public square can be considered part of the public sphere, which is the area outside of the home where people engage in all kinds of public activities, such as debating, working, engaging in the community, and so on.</p>
<p>German philosopher Jürgen Habermas described the ideal public sphere as <a href="https://mitpress.mit.edu/books/structural-transformation-public-sphere">being composed of spaces in which a diverse set of ideas were debated freely until those present converged on a common ground</a>. Habermas provided the example of 17th-century coffeehouses in London, <a href="https://www.oxfordreference.com/view/10.1093/acref/9780195104301.001.0001/acref-9780195104301-e-137">where male intellectuals and politicians mingled to discuss the societal issues of the moment</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A monochrome illustration of four men around a table playing draughts" src="https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=432&fit=crop&dpr=1 600w, https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=432&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=432&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=542&fit=crop&dpr=1 754w, https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=542&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/461018/original/file-20220503-19-v210wy.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=542&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An 18th-century illustration of men playing draughts in a London coffeehouse.</span>
<span class="attribution"><a class="source" href="https://wellcomecollection.org/works/h83k3fu5">(S. Ireland/Wellcome Collection)</a></span>
</figcaption>
</figure>
<p>Habermas also criticized radio and television, the dominant communications technologies of the 1960s (and arguably well into the 1990s). He argued that their one-way dissemination of information eroded the public sphere, making people passive recipients of information without the opportunity to respond.</p>
<h2>Virtual public sphere</h2>
<p>With the arrival of the internet and social media, the public sphere appeared to be revived. People could share their own ideas, not only with their immediate community, but with others around the world. Compared to earlier venues of public debate, the internet appeared to be more inclusive, allowing people of any gender, nationality or social class to participate, rather than only those with social privilege. </p>
<p>However, with this came <a href="https://www.wiley.com/en-ca/The+Digital+Divide-p-9781509534456">new modes of exclusion</a> based on language, literacy, digital skills and internet access.</p>
<p>There were other issues too. Many argued that <a href="https://www.tandfonline.com/doi/full/10.1080/23808985.2021.1976070">social media was polarizing</a>, allowing for the viral spreading of misinformation, and ultimately destabilizing for democracies. This has, in fact, been the subject of <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2595096">ardent debate</a> in the digital public square for more than a decade.</p>
<p>One of the current criticisms of Musk’s attempted acquisition of Twitter is that <a href="https://techcrunch.com/2022/04/26/elon-you-have-no-idea-what-the-hell-youre-talking-about/">he doesn’t understand the public sphere</a> or Twitter’s role in it. As such, Musk might not take the right measures to protect and improve it, particularly when it comes to minority rights. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="three people in a row with mobile phones in their hands" src="https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/461022/original/file-20220503-14-cm66tx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Social media has become a space for access to information and the exchange of ideas.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Privately owned public squares</h2>
<p>Like Habermas, many commentators today are <a href="https://www.noemamag.com/the-antidote-to-digital-disconnectivity/">worried about the erosion of the public sphere</a>. This space, even in a digital setting, is meant to allow people to discuss issues, access different perspectives and converge on common values and objectives. </p>
<p>While Twitter is often used for <a href="https://planable.io/blog/dumb-tweets/">less lofty objectives</a>, this kind of debate does exist on the platform. It is also used for other important purposes, such as <a href="https://dx.doi.org/10.2139/ssrn.3954252">disseminating information about humanitarian crises</a> or <a href="https://twitter.com/missingkids/">finding missing children</a>.</p>
<p>Twitter, if it can be considered a public square, is part of the global public sphere, which is largely composed of social media platforms. Some of the largest — Facebook, Instagram and WhatsApp — are owned by Meta, the company controlled by Mark Zuckerberg.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-digital-town-square-what-does-it-mean-when-billionaires-own-the-online-spaces-where-we-gather-182047">The 'digital town square'? What does it mean when billionaires own the online spaces where we gather?</a>
</strong>
</em>
</p>
<hr>
<p>As we have seen in numerous recent examples, <a href="https://english.elpais.com/usa/2021-04-22/how-algorithmic-recommendations-can-push-internet-users-into-more-radical-views-and-opinions.html">the algorithms that run these platforms can easily be modified by social media companies</a>, with immense effects on public opinion. Having these algorithms effectively owned by a few very wealthy individuals who can manipulate opinions — and thus votes — veers us further away from democracy.</p>
<h2>Social media as a public good</h2>
<p>Many national and international bodies today are examining the idea of <a href="https://www.un.org/techenvoy/content/digital-public-goods">digital public goods</a>. In this context, it would mean that social media platforms should be available to all and regulated through international law, acknowledging their critical role in our social infrastructure.</p>
<p>Within this framework, an international body, such as the <a href="https://www.itu.int/en/Pages/default.aspx">International Telecommunication Union</a>, the UN agency that oversees radio and other communications technologies, could co-ordinate an international convention on digital public goods, including social media. </p>
<p>This could then lead to signatory countries implementing stronger and more nuanced national regulations, particularly in terms of the monitoring of hate speech and misinformation. As it stands, social media companies often resolve these issues <a href="https://www.vox.com/2019/4/16/18410931/twitter-abuse-update-health-technology-harassment">internally after the fact</a>.</p>
<p>Furthermore, efforts could be made to encourage further diversity in social media platforms. For example, <a href="https://techpolicy.press/why-social-media-needs-mandatory-interoperability/">the platforms could be interoperable</a>, as Facebook and Instagram are (both owned by Meta), in order to allow people to access their networks and share content from smaller platforms.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/if-elon-musk-succeeds-in-his-twitter-takeover-it-would-restrict-rather-than-promote-free-speech-181576">If Elon Musk succeeds in his Twitter takeover, it would restrict, rather than promote, free speech</a>
</strong>
</em>
</p>
<hr>
<p>Manipulation of public opinion on social media to obtain political outcomes is <a href="https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf">already common</a>. However, the extent to which social media companies should be held accountable for the content they host is a constant tug-of-war with regulators. Recent examples include <a href="https://www.reuters.com/article/us-myanmar-rohingya-facebook-idUSKCN1GO2PN">Facebook’s role in spreading hate speech that contributed to ethnic violence against the Rohingya in 2018</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a young man in a kufi holding a placard reading STOP KILLING ROHINGYA" src="https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/461117/original/file-20220503-24-cl722a.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A UN investigation found that Facebook was used to spread hate speech against the Rohingya.</span>
<span class="attribution"><span class="source">(Lens Hitam/Shutterstock)</span></span>
</figcaption>
</figure>
<p>Finally, it might still be relevant to review the internal governance structures of social media platforms to prevent networks above a certain size from being owned by a single person.</p>
<p>But this should come after the other important steps: encouraging diversity in platforms, setting clearer guidelines and imposing stronger sanctions for manipulative algorithms or dangerous content.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/unliked-how-facebook-is-playing-a-part-in-the-rohingya-genocide-89523">Unliked: How Facebook is playing a part in the Rohingya genocide</a>
</strong>
</em>
</p>
<hr>
<h2>Clear, global regulation</h2>
<p>The current debate around Twitter centres on its proposed transformation into a privately held company. However, addressing this concern might mean more than simply allowing members of the public to become corporate shareholders again. In fact, the public outrage can be interpreted as a convergence towards making social media platforms global public goods.</p>
<p>Ultimately, much clearer regulation, at an international level, will be necessary.</p>
<p>It’s easy to find fault with a billionaire’s ownership of a place of public deliberation. However, the governance of social media in our society was never ideal to begin with. Let’s take this opportunity to improve the digital public sphere, regardless of who owns a particular space.</p><img src="https://counter.theconversation.com/content/182271/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Eleonore Fournier-Tombs does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Elon Musk’s purchase of Twitter is seen as a threat to the digital public square. International regulation is required to protect internet users’ access to democratic public spaces.Eleonore Fournier-Tombs, Senior Researcher, Data and Technology, Institute in Macau (UNU-Macau), United Nations UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1787752022-03-27T12:55:19Z2022-03-27T12:55:19ZThe COVID-19 pandemic pushed social media to become increasingly tribal<figure><img src="https://images.theconversation.com/files/454239/original/file-20220324-25-16tq9mb.jpg?ixlib=rb-1.1.0&rect=0%2C22%2C3024%2C1987&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">As the COVID-19 pandemic pushed people online, the result has been increasing divisions on social media.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>Media theorist Marshall McLuhan suggested that <a href="https://mitpress.mit.edu/books/understanding-media">each media-related extension of man comes at the expense of another organ</a>. For example, by increasing reliance on visual media, we lose touch with oral communication. </p>
<p>McLuhan also formulated the <a href="https://utorontopress.com/9780802077158/laws-of-media/">laws of media</a>, which state that all media aim to extend the body. When they do so, some media become obsolete, some are revived, and when a new medium is pushed to its limits, it reverts to an earlier version.</p>
<p>McLuhan’s theories take on a new significance as we witness <a href="https://dx.doi.org/10.1111/spc3.12636">a reversion of social media</a>, which I refer to as “tribal media.” By this, I mean media that reflects a fragment of a society consisting of like-minded people within specific political, economic, cultural and personal parameters.</p>
<p>Social media has now been around for two decades, and has been treated <a href="https://www.pewresearch.org/internet/2015/01/15/psychological-stress-and-social-media-use-2/">with ambivalence since its inception</a>. The global COVID-19 pandemic may have pushed social media to its limits, and reverted it to an earlier version: chatrooms. </p>
<p>Until a few years ago, one of the greatest worries about the internet was how addictive it could be. However, when we studied the <a href="https://doi.org/10.2196/11485">relationship between screen addiction and stress</a>, we found a <a href="https://www.thestar.com/opinion/contributors/2019/08/31/could-our-addiction-to-screens-have-a-silver-lining.html">silver lining</a>: There was a possibility that addiction to screens helped reduce the emotional burden of other stressors, such as financial worries or relationship problems.</p>
<p>The COVID-19 pandemic forced a different consideration of whether or not social media use produced stress and anxiety. Those who were searching for the potential harms of screen addiction on brain development now had to contend with life and work activities moving online. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A phone's screen showing the twitter feed for twitter spaces" src="https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/454241/original/file-20220324-21-10d2ghy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Twitter Spaces is an example of how a social media platform has reverted to an earlier version of online social communication.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Pandemic reversal</h2>
<p>In March 2020, our research team used the occasion of the pandemic to explore <a href="https://doi.org/10.2196/20186">whether social media causes or relieves stress</a>. We asked respondents how their use of different media had changed as a result of the pandemic. One year later, we repeated the same question. What we found was a significant change in the nature of people’s interactions with social media — users avoided what they perceived as sensational and political content, but gravitated towards building community. </p>
<p>We observed this trend in another independent analysis of how older adults used social media and communications technology to cope with public health measures during the COVID-19 pandemic. We found that, for them, <a href="https://www.springerprofessional.de/en/role-of-social-media-in-coping-with-covid-19-stress-searching-fo/19324966">social media and new platforms such as Zoom were important only insofar as they connected them to their own families and communities</a>. </p>
<p>The pandemic made social media and communication platforms an inevitable extension of ourselves. But by bringing us into this forced global embrace, <a href="https://www.nytimes.com/2022/01/27/opinion/covid-tribalism-politics.html">it may have also forced us to split along tribal divisions</a> — what anthropologist Gregory Bateson referred to as <a href="https://www.jstor.org/stable/2789408"><em>schismogenesis</em></a>. These divisions occur because of, and are exacerbated by, increasing conflict in communications about contentious topics such as lockdowns and mandatory vaccinations.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a group of people surrounding a banner reading WE DO NOT CONSENT" src="https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/454242/original/file-20220324-25-1hi8i67.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Protests against lockdowns were held globally, like this one in London, U.K. in March 2021, and the social divisions were also reflected online.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Chatroom revival</h2>
<p>COVID-19 revealed that social media companies are neither neutral nor benevolent. They pick their own tribes too. And when this became clear, users reacted.</p>
<p>Research by <a href="https://www.pewresearch.org/fact-tank/2018/09/05/americans-are-changing-their-relationship-with-facebook/">the Pew Research Center</a> found that more than 40 per cent of Facebook users had begun abandoning the social network before the pandemic. </p>
<p>This followed a chain of controversies, from <a href="https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html">selling data to Cambridge Analytica</a>, which used it to build psychological profiles of American voters, to allowing <a href="https://www.reuters.com/article/usa-election-facebook-russia-idUSKBN25S5UC">Russian interference in an American election</a>. </p>
<p>When Facebook was accused of profiting from the spread of misinformation, they used the same type of data-mining methods <a href="https://www.forbes.com/sites/jemimamcevoy/2021/08/18/facebook-says-it-has-removed-20-million-pieces-of-covid-misinformation-but-sees-signs-vaccine-hesitancy-is-declining/">to monitor and censor posts on their platform</a>. Users could no longer ignore the fact that <a href="https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/">Facebook gathered and capitalized on their information for corporations that would pay for the data</a>. </p>
<p>As a result of this accelerated exodus, <a href="https://www.forbes.com/sites/greatspeculations/2022/02/08/meta-platforms-stock-dropped-by-25-last-week-what-next/?sh=3841de28182e">the company’s shares dropped by 25 per cent</a>. But Facebook had acquired <a href="https://www.bloomberg.com/news/features/2020-12-09/facebook-fb-plans-to-turn-messaging-app-whatsapp-into-a-moneymaking-business">the end-to-end encrypted group chat app WhatsApp</a> and launched <a href="https://about.fb.com/news/2020/04/introducing-messenger-rooms/">private chatrooms unregulated by censoring algorithms</a>.</p>
<p>Both of these platforms represented a revival of chatrooms.</p>
<h2>Tribal platforms</h2>
<p>Donald Trump’s use of Twitter as his personal propaganda machine, especially in relation to his public health disinformation, pushed social media <a href="https://www.tweetbinder.com/blog/trump-twitter/">to a new edge</a>. When <a href="https://blog.twitter.com/en_us/topics/company/2020/suspension">Twitter blocked Trump’s account</a>, it illustrated the power of social media in political interference. <a href="https://www.theguardian.com/media/2022/mar/11/social-media-facebook-google-russia-ukraine">Media commentators sounded the alarm</a>, concerned that a corporation’s meddling in determining the legitimacy of narratives sets a dangerous precedent and threatens the right to freedom of expression. </p>
<p>When cultural and ideological <a href="https://dx.doi.org/10.1057/s41290-020-00121-y">schismogenesis surfaced in different narratives of health and safety</a>, Twitter decisively took a position. In response, Trump created his own media platform: <a href="https://www.theverge.com/2022/2/21/22944179/truth-social-launch-ios-donald-trump-twitter-platform">Truth Social</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Oz0YRceNwic?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">CNN asks whether Trump’s social media platform is something to be taken seriously.</span></figcaption>
</figure>
<p>There might still be a silver lining in changing our habits with regard to tribalized media usage. Anthropologist Heidi Larson, director of <a href="https://www.vaccineconfidence.org/">The Vaccine Confidence Project</a>, warns that <a href="https://doi.org/10.1038/d41586-020-00920-w">centralized “censorship” of information runs a greater risk of fuelling conspiratorial forms of communication</a>. Larson suggests that <a href="https://doi.org/10.1038/d41586-018-07034-4">targeted social media is better suited to promote trust and serve public safety</a>.</p>
<p>It is not surprising that, after two decades of globalized social media, we are now returning to controlled-access chatrooms for people with proven ties and loyalties to each other. Whether this “tribalization” is an effective response to how we cope with the stress of a world in which <a href="https://www.npr.org/2022/03/19/1087265230/4-reasons-why-social-media-can-give-a-skewed-account-of-the-war-in-ukraine">social media can be weaponized in times of war</a> remains to be seen.</p><img src="https://counter.theconversation.com/content/178775/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Najmeh Khalili-Mahani is a research associate at McGill University (McGill Centre for Integrative Neuroscience) and Concordia University (engAGE Centre for Studies in Aging). For her research, she has received funding from FRQSC-AUDACE. She is the founding director of Media Health Laboratory and the Game Clinic, which are dedicated to examining the implications of new media technologies in public health.</span></em></p>People used social media to connect with others, but after the pandemic, social media is increasingly fractured. Users adopt closed media spaces where they feel safe to express personal values.Najmeh Khalili-Mahani, Researcher, Director of Media-Health/Game-Clinic laboratory, Concordia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1727432021-11-29T04:10:53Z2021-11-29T04:10:53ZThe government’s planned ‘anti-troll’ laws won’t help most victims of online trolling<figure><img src="https://images.theconversation.com/files/434348/original/file-20211129-21-1cyuuci.jpeg?ixlib=rb-1.1.0&rect=0%2C7%2C5000%2C3315&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Yesterday, Prime Minister Scott Morrison and Attorney-General Michaelia Cash <a href="https://www.attorneygeneral.gov.au/media/media-releases/combatting-online-trolls-and-strengthening-defamation-laws-28-november-2021">announced</a> proposed new legislation aimed at making online “trolls” accountable for their actions. </p>
<p>Over the past few weeks, we’ve heard Morrison decry trolls as “cowardly” and “un-Australian”, language that made it into the talking points at yesterday’s media conference. But is his new-found concern about trolling all it’s cracked up to be?</p>
<p>The proposed new legislation would give courts the power to force social media companies to pass on to people the details of their trolls, so they can pursue defamation action against them. </p>
<p>This decision is largely a reaction to the High Court’s <a href="https://theconversation.com/high-court-rules-media-are-liable-for-facebook-comments-on-their-stories-heres-what-that-means-for-your-favourite-facebook-pages-167435">upholding</a> of the ruling in the Dylan Voller case, which now holds media companies responsible for defamatory comments posted on their social media pages. But there are some things that we need to be wary of in this legislation.</p>
<h2>Defamation isn’t the same as trolling</h2>
<p>Speaking to the media yesterday, Morrison argued this legislation is a necessary means to curb online trolling. But the policy proposal largely deals with issues of defamation, which isn’t necessarily the same thing. </p>
<p>As I have <a href="https://theconversation.com/the-media-dangerously-misuses-the-word-trolling-79999">previously pointed out</a>, trolling is a grossly overused term that encompasses a range of activities. Defamation, meanwhile, is far more specific and legally defined. To prove defamation, one has to prove the content posted has damaged the victim’s reputation. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/high-court-rules-media-are-liable-for-facebook-comments-on-their-stories-heres-what-that-means-for-your-favourite-facebook-pages-167435">High Court rules media are liable for Facebook comments on their stories. Here's what that means for your favourite Facebook pages</a>
</strong>
</em>
</p>
<hr>
<p>Framing this announcement in the context of the very real harms of targeted online bullying and harassment is, I believe, disingenuous. I say this because those who suffer this kind of harassment aren’t likely to be bringing defamation suits. In short, this legislation won’t necessarily help them.</p>
<p>What’s more, a version of the newly announced powers already exists anyway. The recent <a href="https://www.esafety.gov.au/sites/default/files/2021-07/Online%20Safety%20Act%20-%20Fact%20sheet.pdf">Online Safety Act 2021</a> allows the e-Safety Commissioner to order social media companies to remove bullying or harassing content within 24 hours, or face an A$555,000 fine. Crucially, it also gives the commissioner powers to demand information about the owners of anonymous accounts who engage in online abuse.</p>
<p>Where social media companies fail to provide information about the offending poster, the newly announced laws would see them held accountable for the defamatory content. But that assumes they know this information in the first place.</p>
<p>Social media companies already collect users’ details on sign-up, including their name, email address, country of residence and, increasingly, telephone number. But for many social media platforms, there is nothing to stop users setting up an account with a fake name, using a throwaway email address or a “burner” phone, and then ditching all of that but maintaining the account once the information has been initially verified.</p>
<p>Even if the information provided is correct, it doesn’t mean the person will necessarily answer their phone or respond to an email. As one journalist asked yesterday, should social media companies be held accountable in that instance? The standard <a href="https://community.hrdaily.com.au/profiles/blogs/putting-the-reasonable-person-to-the-test">“reasonable person” assessment in law</a> would likely find they should not be, meaning any defamation action brought against the company itself would likely fail.</p>
<h2>Social media ID laws by stealth</h2>
<p>My main concern with this proposed legislation is that it will prompt social media companies to collect enough information on their users so they become readily identifiable upon request. This seems a very similar concept to the government’s suggestion earlier this year that Australians who set up social media accounts should have to provide 100 points of identification. </p>
<p>That proposal was met with a <a href="https://www.smh.com.au/politics/federal/it-s-a-long-bow-social-media-id-push-dubbed-a-privacy-risk-20210402-p57g7d.html">barrage of criticism</a>, both for reasons of simple privacy, and because some experts, including myself, believe removing anonymity <a href="https://theconversation.com/ending-online-anonymity-wont-make-social-media-less-toxic-172228">won’t fix online toxicity anyway</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ending-online-anonymity-wont-make-social-media-less-toxic-172228">Ending online anonymity won't make social media less toxic</a>
</strong>
</em>
</p>
<hr>
<p>The other real issue, ironically enough, is one of user safety. Yes, online anonymity gives trolls a mask to hide behind, but it also allows people to access support for addiction or mental health issues, for example, or for a young LGBTQI+ person in fear of real-world violence or disapproval to find a community online. Online anonymity can be a crucial shield for victims of domestic violence who want to avoid being found by their abusers.</p>
<p>Forcing social media companies to provide users’ details to a court also opens up the possibility of “abuse of process”. This is where the legal process itself is used as a form of intimidation and bullying or, worse, for an abuser to gain access to their victim. The government has assured us the policy will contain safeguards against this, but has provided no detail so far on how this will be achieved.</p>
<p>Finally, it’s worth noting that several of the highest-profile current plaintiffs in Australian defamation cases involving social media defamation are to be found among the government itself. So while it might sound cynical, we’re entitled to wonder whom this policy is really designed to help.</p><img src="https://counter.theconversation.com/content/172743/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jennifer Beckett does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The government’s plan to make social media companies hand over trolls’ details aims to make it easier for victims to sue their harassers for defamation. But this conflates two very different concepts.Jennifer Beckett, Lecturer in Media and Communications, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1663352021-08-19T04:05:44Z2021-08-19T04:05:44ZIs it actually false, or do you just disagree? Why Twitter’s user-driven experiment to tackle misinformation is complicated<figure><img src="https://images.theconversation.com/files/416891/original/file-20210819-23-rgit78.jpeg?ixlib=rb-1.1.0&rect=74%2C9%2C6155%2C4138&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> </figcaption></figure><p>Over the past year, we’ve seen how dramatically misinformation can impact the lives of people, communities and entire countries. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/public-protest-or-selfish-ratbaggery-why-free-speech-doesnt-give-you-the-right-to-endanger-other-peoples-health-165079">Public protest or selfish ratbaggery? Why free speech doesn't give you the right to endanger other people's health</a>
</strong>
</em>
</p>
<hr>
<p>In a bid to better understand how misinformation spreads online, Twitter has started an experimental trial in Australia, the United States and South Korea, allowing users to flag content they deem misleading.</p>
<p>Users in these countries can now flag tweets as misinformation through the same process by which other harmful content is reported. When reporting a post there is an option to choose “it’s misleading” — which can then be further categorised as related to “politics”, “health” or “something else”. </p>
<p><a href="https://www.theverge.com/2021/8/17/22629097/twitter-misinformation-health-covid19-reporting-feature-white-house">According to</a> Twitter, the platform won’t necessarily follow up on all flagged tweets, but will use the information to learn about misinformation trends. </p>
<p>Past research has suggested such “crowdsourced” approaches to reducing misinformation <a href="https://www.pnas.org/content/pnas/116/7/2521.full.pdf">may be promising</a> in highlighting untrustworthy sources online. That said, the usefulness of Twitter’s experiment will depend on the accuracy of users’ reports. </p>
<p>Twitter’s general policy describes a somewhat nuanced <a href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information">approach</a> to moderating dubious posts, distinguishing between “unverified information”, “disputed claims” and “misleading claims”. A post’s “propensity for harm” determines whether it is flagged with a label or a warning, or is removed entirely.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=356&fit=crop&dpr=1 600w, https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=356&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=356&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=448&fit=crop&dpr=1 754w, https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=448&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/416895/original/file-20210819-27-t696vx.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=448&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In a 2020 blog post, Twitter said it categorised false or misleading content into three broad categories.</span>
<span class="attribution"><span class="source">Screenshot</span></span>
</figcaption>
</figure>
<p>But the platform has not explicitly defined “misinformation” for users who will engage in the trial. So how will they know whether something is indeed “misinformation”? And what will stop users from flagging content they simply disagree with?</p>
<h2>Familiar information feels right</h2>
<p>As individuals, what we consider to be “true” and “reliable” can be driven by subtle cognitive biases. The more you hear certain information repeated, the more familiar it will feel. In turn, this feeling of familiarity tends to be taken as a sign of truth.</p>
<p>Even “deep thinkers” <a href="https://www.sciencedirect.com/science/article/abs/pii/S1053810019301977">aren’t immune</a> to this cognitive bias. As such, repeated exposure to certain ideas may get in the way of our ability to detect misleading content. Even if an idea is misleading, if it’s familiar enough <a href="https://behavioralpolicy.org/wp-content/uploads/2017/05/BSP_vol1is1_Schwarz.pdf">it may still pass the test</a>.</p>
<p>In direct contrast, content that is unfamiliar or difficult to process — but highly valid — may be incorrectly flagged as misinformation.</p>
<h2>The social dilemma</h2>
<p>Another challenge is a social one. Repeated exposure to information can also convey a social consensus, wherein our own attitudes and behaviours are shaped by <a href="https://onlinelibrary.wiley.com/doi/abs/10.1111/spc3.12155">what others think</a>. </p>
<p><a href="https://www.sciencedirect.com/science/article/pii/S1364661318300172?casa_token=0u_fb7ajFmIAAAAA:7-VSoOf8LaO9r5AP2xyz8dVPOG9HRU_NPTwcTiiSnfQL7S-PAqpCILJ67-TRwUtz6k2ggITpnA">Group identity</a> influences what information we think is factual. We think something is more “true” when it’s associated with our own group and comes from an in-group member (as opposed to an out-group member). </p>
<p><a href="https://journals.sagepub.com/doi/abs/10.1037/1089-2680.2.2.175">Research</a> has also shown we are inclined to look for evidence that supports our existing beliefs. This raises questions about the efficacy of Twitter’s user-led experiment. Will users who participate really be capturing false information, or simply reporting content that goes against their beliefs?</p>
<p>More strategically, there are social and political actors who deliberately try to downplay certain views of the world. Twitter’s misinformation experiment could be abused by well-resourced and motivated <a href="https://www.icrc.org/en/doc/assets/files/other/irrc_860_reicher.pdf">identity entrepreneurs</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=406&fit=crop&dpr=1 600w, https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=406&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=406&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=510&fit=crop&dpr=1 754w, https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=510&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/416893/original/file-20210819-19-2as24k.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=510&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Twitter has added an option to report ‘misleading’ content for users in the US, Australia and South Korea.</span>
<span class="attribution"><span class="source">Screenshot</span></span>
</figcaption>
</figure>
<h2>How to take a more balanced approach</h2>
<p>So how can users increase their chances of effectively detecting misinformation? One way is to take a consumer-minded approach. When we make purchases as consumers, we often compare products. We should do this with information, too. </p>
<p>“<a href="https://cor.stanford.edu/research/lateral-reading-and-the-nature-of-expertise/">Searching laterally</a>”, or comparing different sources of information, helps us <a href="https://link.springer.com/content/pdf/10.3758/s13421-020-01047-z.pdf">better discern</a> what is true or false. This is the kind of approach a fact-checker would take, and it’s often more effective than sticking with a single source of information.</p>
<p>At the supermarket we often look beyond the packaging and read a product’s ingredients to make sure we buy what’s best for us. Similarly, there are many new and <a href="https://misinforeview.hks.harvard.edu/article/global-vaccination-badnews/">interesting ways</a> to learn about disinformation tactics intended to mislead us online. </p>
<p>One example is <a href="https://www.getbadnews.com/#play">Bad News</a>, a free online game and media literacy tool which researchers found could “confer psychological resistance against common online misinformation strategies”.</p>
<p>There is also evidence that people who think of themselves as <a href="https://psycnet.apa.org/record/2013-30167-000">concerned citizens with civic duties</a> are more likely to weigh evidence in a balanced way. In an online setting, this kind of mindset may leave people better placed to identify and flag misinformation.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/vaccine-selfies-may-seem-trivial-but-they-show-people-doing-their-civic-duty-and-probably-encourage-others-too-164950">Vaccine selfies may seem trivial, but they show people doing their civic duty — and probably encourage others too</a>
</strong>
</em>
</p>
<hr>
<h2>Leaving the hard work to others</h2>
<p>We know from research that <a href="https://www.sciencedirect.com/science/article/abs/pii/S0010027719302276">thinking about accuracy</a> or the possible presence of misinformation in a space can reduce some of our cognitive biases. So actively thinking about accuracy when engaging online is a good thing. But what happens when I know someone else is onto it? </p>
<p>The behavioural sciences and game theory tell us people may be less inclined to make an effort themselves if they feel they can <a href="https://www.britannica.com/topic/collective-action-problem-1917157">free-ride</a> on the efforts of others. Even armchair activism may decline if there is a perception that misinformation is already being dealt with. </p>
<p>Worse still, this belief may lead people to trust information more easily. In Twitter’s case, the misinformation-flagging initiative may lead some users to think any content they come across is likely true. </p>
<h2>Much to learn from these data</h2>
<p>As countries engage in vaccine rollouts, misinformation poses a significant threat to public health. Beyond the pandemic, misinformation <a href="https://theconversation.com/fossil-fuel-misinformation-may-sideline-one-of-the-most-important-climate-change-reports-ever-released-165887">about climate change</a> and political issues continues to present concerns for the health of our environment and our democracies. </p>
<p>Despite the many factors that influence how individuals identify misleading information, there is still much to be learned from how large groups come to identify what <em>seems</em> misleading. </p>
<p>Such data, if made available in some capacity, have great potential to benefit the science of misinformation. And combined with moderation and objective fact-checking approaches, it might even help the platform mitigate the spread of misinformation.</p><img src="https://counter.theconversation.com/content/166335/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Kate Reynolds has received funding from the Australian Research Council related to the impact of social identity on well-being attitudes and behaviour.</span></em></p><p class="fine-print"><em><span>Eryn Newman does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>We all have biases that impact what information we choose to accept and reject. But there are some ways we can train ourselves to become more discerning.Eryn Newman, Senior Lecturer, Research School of Psychology, Australian National UniversityKate Reynolds, Professor, Research School of Psychology, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1653232021-08-02T20:12:16Z2021-08-02T20:12:16ZInstagram’s privacy updates for kids are positive. But plans for an under-13s app means profits still take precedence<figure><img src="https://images.theconversation.com/files/414095/original/file-20210802-13-1j1w9w7.jpeg?ixlib=rb-1.1.0&rect=67%2C40%2C4425%2C2950&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Facebook <a href="https://about.instagram.com/blog/announcements/giving-young-people-a-safer-more-private-experience">recently announced</a> significant changes to Instagram for users aged under 16. New accounts will be private by default, and advertisers will be limited in how they can reach young people. </p>
<p>The new changes are long overdue and welcome. But Facebook’s commitment to children’s safety is still in question as it continues to develop a separate version of Instagram for kids aged under 13. </p>
<p>The company received <a href="https://www.theguardian.com/technology/shortcuts/2021/may/11/instagram-for-kids-the-social-media-site-no-one-asked-for">significant backlash</a> after the initial announcement in May. In fact, more than 40 US Attorneys General <a href="https://www.cnbc.com/2021/05/10/attorneys-general-ask-facebook-to-abandon-instagram-for-kids-plans.html">banded together</a> to ask Facebook to stop building the under-13s version of Instagram, citing privacy and health concerns.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/is-social-media-damaging-to-children-and-teens-we-asked-five-experts-126499">Is social media damaging to children and teens? We asked five experts</a>
</strong>
</em>
</p>
<hr>
<h2>Privacy and advertising</h2>
<p>Online default settings matter. They set expectations for how we should behave online, and many of us <a href="https://doi.org/10.1016/j.chb.2019.07.001">will never shift away</a> from this by changing our default settings. </p>
<p>Adult accounts on Instagram are public by default. Facebook’s shift to making under-16 accounts private by default means these users will need to actively change their settings if they want a public profile. Existing under-16 users with public accounts will also get a prompt asking if they want to make their account private.</p>
<p>These changes normalise privacy and will encourage young users to focus their interactions more on their circles of friends and followers they approve. Such a change could go a long way in helping young people navigate online privacy.</p>
<p>Facebook has also limited the ways in which advertisers can target Instagram users under age 18 (or older in some countries). Instead of targeting specific users based on their interests gleaned via data collection, advertisers can now only broadly reach young people by focusing ads in terms of age, gender and location. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-companies-learn-what-children-secretly-want-63178">How companies learn what children secretly want</a>
</strong>
</em>
</p>
<hr>
<p>This change follows <a href="https://au.reset.tech/uploads/resettechaustralia_profiling-children-for-advertising-1.pdf">recently publicised research</a> that showed Facebook was allowing advertisers to target young users with risky interests — such as smoking, vaping, alcohol, gambling and extreme weight loss — with age-inappropriate ads.</p>
<p>This is particularly worrying, given Facebook’s <a href="https://about.fb.com/news/2021/07/age-verification/">admission</a> there is “no foolproof way to stop people from misrepresenting their age” when joining Instagram or Facebook. The apps ask for date of birth during sign-up, but have no way of verifying responses. Any child who knows basic arithmetic can work out how to bypass this gateway.</p>
<p>Of course, Facebook’s new changes do not stop Facebook itself from collecting young users’ data. And when an Instagram user becomes a legal adult, all of their data collected up to that point will then likely inform an incredibly detailed profile which will be available to facilitate Facebook’s main business model: extremely targeted advertising.</p>
<h2>Deploying Instagram’s top dad</h2>
<p>Facebook has been highly strategic in how it released news of its recent changes for young Instagram users. In contrast with Facebook’s chief executive Mark Zuckerberg, Instagram’s head <a href="https://www.instagram.com/mosseri/">Adam Mosseri</a> has turned his status as a parent into a significant element of his public persona. </p>
<p>Since Mosseri <a href="https://www.amazon.com.au/Instagram-Visual-Social-Media-Cultures/dp/1509534393">took over</a> after Instagram’s creators left Facebook in 2018, his profile has consistently emphasised he has three young sons, his curated Instagram stories include #dadlife and Lego, and he often signs off Q&A sessions on Instagram by mentioning he needs to spend time with his kids.</p>
<figure class="align-center ">
<img alt="Adam Mosseri's Instagram Profile" src="https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=359&fit=crop&dpr=1 600w, https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=359&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=359&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=452&fit=crop&dpr=1 754w, https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=452&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/413928/original/file-20210730-19-1s5i9f9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=452&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Adam Mosseri’s Instagram Profile on July 30 2021.</span>
<span class="attribution"><span class="source">Instagram</span></span>
</figcaption>
</figure>
<p>When Mosseri posted about the changes for under-16 Instagram users, he carefully framed the news as coming from a parent first, and the head of one of the world’s largest social platforms second. Similar to <a href="https://reallifemag.com/layers-of-identity/">many influencers</a>, Mosseri knows how to position himself as relatable and authentic.</p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1420014166652461060"}"></div></p>
<h2>Age verification and ‘potentially suspicious’ adults</h2>
<p>In a <a href="https://about.fb.com/news/2021/07/age-verification/">paired announcement</a> on July 27, Facebook’s vice-president of youth products Pavni Diwanji announced Facebook and Instagram would be doing more to ensure under-13s could not access the services.</p>
<p>Diwanji said Facebook was using artificial intelligence algorithms to stop “adults that have shown potentially suspicious behavior” from being able to view posts from young people’s accounts, or the accounts themselves. But Facebook has not offered an explanation as to how a user might be found to be “suspicious”. </p>
<p>Diwanji notes the company is “building similar technology to find and remove accounts belonging to people under the age of 13”. But this technology isn’t being used yet. </p>
<p>It’s reasonable to infer Facebook probably won’t actively remove under-13s from either Instagram or Facebook until the new Instagram For Kids app is launched — ensuring those young customers aren’t lost to Facebook altogether.</p>
<p>Despite public backlash, Diwanji’s post confirmed Facebook is indeed still building “a new Instagram experience for tweens”. As I’ve argued in the past, an Instagram for Kids — much like Facebook’s <a href="https://www.abc.net.au/news/2017-12-28/messenger-kids-is-facebooks-strategy-video-messeging-app-google/9285530">Messenger for Kids before it</a> — would be less about providing a gated playground for children and more about getting children familiar and comfortable with Facebook’s family of apps, in the hope they’ll stay on them for life.</p>
<p>A Facebook spokesperson told The Conversation that a feature introduced in March prevents users registered as adults from sending direct messages to users registered as teens who are not following them. </p>
<p>“This feature relies on our work to predict peoples’ ages using machine learning technology, and the age people give us when they sign up,” the spokesperson said.</p>
<p>They said “suspicious accounts will no longer see young people in ‘Accounts Suggested for You’, and if they do find their profiles by searching for them directly, they won’t be able to follow them”. </p>
<h2>Resources for parents and teens</h2>
<p>For parents and teen Instagram users, the recent changes to the platform are a useful prompt to begin or to revisit conversations about privacy and safety on social media. </p>
<p>Instagram does provide some <a href="https://about.instagram.com/community/parents">useful resources</a> for parents to help guide these conversations, including a bespoke Australian version of their <a href="https://about.instagram.com/en-us/file/217520986937315/IG-Parents-Guide-English-(Australia).pdf/">Parent’s Guide to Instagram </a> created in partnership with <a href="https://parents.au.reachout.com/landing/parentsguidetoinsta">ReachOut</a>. There are many other online resources, too, such as CommonSense Media’s <a href="https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-instagram">Parents’ Ultimate Guide to Instagram</a>.</p>
<p>Regarding Instagram for Kids, a Facebook spokesperson told The Conversation the company hoped to “create something that’s really fun and educational, with family friendly safety features”. </p>
<p>But the fact that this app is still planned means Facebook can’t accept the most straightforward way of keeping young children safe: keeping them off Facebook and Instagram altogether. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/anorexia-coach-sexual-predators-online-are-targeting-teens-wanting-to-lose-weight-platforms-are-looking-the-other-way-162938">'Anorexia coach': sexual predators online are targeting teens wanting to lose weight. Platforms are looking the other way</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/165323/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tama Leaver receives funding from the Australian Research Council (ARC) as a chief investigator in the ARC Centre of Excellence for the Digital Child.</span></em></p>The changes do not stop Facebook itself from collecting young users’ data and keeping it.Tama Leaver, Professor of Internet Studies, Curtin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1629382021-07-25T19:57:24Z2021-07-25T19:57:24Z‘Anorexia coach’: sexual predators online are targeting teens wanting to lose weight. Platforms are looking the other way<figure><img src="https://images.theconversation.com/files/412836/original/file-20210723-15-8kitss.jpg?ixlib=rb-1.1.0&rect=9%2C24%2C1068%2C1894&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>There’s no shortage of people online looking to exploit and manipulate the vulnerable among us. One such group is anorexia coaches, or “anacoaches”. </p>
<p>They are typically middle-aged, male sexual predators who go online to find impressionable young people to exploit under the guise of providing weight-loss “coaching”.</p>
<p>I have been researching how anacoaches operate. I’ve found they are facilitated by flaws within social media algorithms, as well as large numbers of young people seeking weight-loss help online.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1007&fit=crop&dpr=1 600w, https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1007&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1007&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1265&fit=crop&dpr=1 754w, https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1265&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/411732/original/file-20210717-19-1l2o5qt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1265&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">An anacoach message on Tumblr.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<p>My ongoing research, coupled with other media reports, indicates opportunities for <a href="https://www.theguardian.com/society/2019/mar/01/anorexia-coaches-kik-app-prey-eating-disorders">anacoaches</a> have grown in the past few years. My analysis showed that on Twitter alone there are about 300 unique requests for anacoaches around the world daily.</p>
<p>Anacoaches operate on numerous channels, including established social platforms such as Twitter, TikTok, Tumblr and Kik. Despite this, these platforms haven’t addressed the problem.</p>
<h2>Targeting teens</h2>
<p>An <a href="https://www2.deloitte.com/au/en/pages/economics/articles/butterfly-report-paying-price-eating-disorders.html">estimated 4% of Australians</a>, or roughly one million people, are affected by eating disorders. And almost two-thirds (63%) of these people are thought to be female. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1267&fit=crop&dpr=1 600w, https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1267&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1267&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1592&fit=crop&dpr=1 754w, https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1592&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/411731/original/file-20210717-15-1qy19qc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1592&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from TikTok.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<p><a href="https://headtopics.com/uk/how-anorexic-kids-as-young-as-thirteen-are-targeted-by-anacoach-trolls-who-force-them-to-starve-th-5579246">Teenagers</a> with eating disorders are more likely to experience poor mental health and impaired functioning in social environments — which leaves them more vulnerable to the influence of anacoaches. </p>
<p>Also, <a href="https://www.sciencedirect.com/science/article/pii/S019074092032082X">research</a> has <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01351/full">shown</a> social media use can exacerbate the extent to which teenagers and young adults chase a “thin” ideal. </p>
<p><a href="https://www.hetckm.nl/nieuws-en-publicaties/pro-ana-coaches-maken-bewust-misbruik-van-meisjes-met-eetstoornis/1">One study</a> published by a Dutch human rights <a href="https://www.dutchnews.nl/news/2019/05/researchers-raise-the-alarm-about-predatory-anorexia-coaches/">law group</a> on the predatory behaviours of anacoaches found self-reporting victims had been sexually assaulted and even raped. </p>
<p>And with anacoaching comes the potential for other forms of criminal abuse, such as paedophilia, forced prostitution and even human trafficking.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-virtual-door-to-online-child-sexual-grooming-is-wide-open-90972">The virtual door to online child sexual grooming is wide open</a>
</strong>
</em>
</p>
<hr>
<h2>Social media provides the platform</h2>
<p>With the rise of online platforms there has been an emergence of communities pursuing a thin ideal. These networks tend to share content that endorses extreme thinness. </p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1358778122372411394"}"></div></p>
<p>Group identity is formed through interactions and hashtag sharing, with a focus on terms used regularly in the context of eating disorders. Common hashtags include #proana (pro-anorexia), #bonespo (bone inspiration), #edtw (eating disorder trigger warning), #promia (pro bulimia), #bulimia, #thighgap, #uw (ultimate weight), #cw (current weight), #gw (goal weight) and #tw (trigger warning).</p>
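Keyword-based monitoring studies scan for exactly this kind of hashtag vocabulary. As a minimal illustration only (a hypothetical sketch, not the author's actual research method), flagging posts that use these community hashtags might look like:

```python
import re

# Hashtags commonly used in pro-eating-disorder communities,
# as listed in the article.
COMMUNITY_TAGS = {
    "proana", "bonespo", "edtw", "promia", "bulimia",
    "thighgap", "uw", "cw", "gw", "tw",
}

def extract_hashtags(text: str) -> set:
    """Return the lowercased hashtags found in a post."""
    return {m.group(1).lower() for m in re.finditer(r"#(\w+)", text)}

def flag_post(text: str) -> bool:
    """True if the post uses any hashtag from the community vocabulary."""
    return bool(extract_hashtags(text) & COMMUNITY_TAGS)

posts = [
    "new week, new goals #gw #proana",   # would be flagged
    "holiday photos from the beach #tbt", # would not be flagged
]
flagged = [p for p in posts if flag_post(p)]
```

A real study would need far more than tag matching, since abbreviations such as #tw and #cw also appear in unrelated contexts; this sketch only shows why a shared hashtag vocabulary makes such communities discoverable, both to researchers and to predators.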
<p>As highlighted in my previous <a href="https://mental.jmir.org/2021/7/e24340/">research</a>, communication in these communities includes exchanging weight-loss tips, diet plans, extreme exercise plans, imagery of thin bodies and emotional “support”.</p>
<p>Anacoaches lurk in chat forums focused on thin ideals. Each coach will tend to be present in numerous chatrooms, luring teenagers with stories of their past “successes” from coaching. </p>
<p>They market themselves with dubious claims. Some will assign themselves labels such as “strict coach” or “mean coach”. The screenshots below show messages posted on the app Kik.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=282&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=282&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=282&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=354&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=354&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410541/original/file-20210709-25-ilssn6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=354&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from Kik.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=201&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=201&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=201&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=252&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=252&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410534/original/file-20210709-17-ch1a9x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=252&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from Kik.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<p>The coaching predominantly involves sharing pictures and videos for nude body checks (or in undergarments), weekly weigh-ins, and enforcing strict rules on what foods to eat and avoid. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=391&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=391&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=391&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=491&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=491&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410537/original/file-20210709-27-10bz3vp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=491&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from Kik.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<p>While there’s currently no way to know how long coaching lasts on average, the harms are extensive. TikTok, which has a massive young following, will start to recommend accounts centred on eating disorders once a user has sought out such content, because of the way its recommendation algorithms work.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1267&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1267&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1267&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1592&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1592&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410972/original/file-20210713-23-o8rcac.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1592&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from TikTok.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<h2>What is being done?</h2>
<p>Despite an array of reports highlighting the issue, platforms currently have too few safeguards in place to prevent anacoaches from operating. </p>
<p>Best efforts so far <a href="https://gizmodo.com/instagram-tiktok-and-pinterest-add-support-features-to-1846331651">have involved</a> Instagram, TikTok and Pinterest filtering out selected words such as “proana” or “thinspo” and banning searches for content that promotes extreme thinness. </p>
<p>A TikTok spokesperson told The Conversation the platform does not allow content depicting, promoting or glorifying eating disorders. </p>
<p>“When a user searches for terms related to eating disorders, we don’t return results and instead we direct them to the Butterfly Foundation and provide them with helpful and appropriate advice. We’ve also introduced permanent public service announcements (PSAs) on related hashtags to help provide support for our community,” the spokesperson said. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=751&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=751&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=751&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=944&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=944&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410934/original/file-20210713-27-1f2yfha.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=944&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Screenshot from TikTok.</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The spokesperson said accounts found to be engaging in sexual harassment may be banned. Platforms will ban users if they violate user guidelines, but anacoaches will <a href="https://mental.jmir.org/2021/7/e24340/">often reappear</a> under a new account name.</p>
<p>According to Twitter, evading account bans is against the rules. Earlier this year Twitter announced it would enable a <a href="https://www.theverge.com/2021/2/25/22301388/twitter-auto-block-mute-abusive-accounts-safety-mode">safety mode</a> that will allow users to turn on the proactive screening of spammy and abusive content. It remains to be seen what role this will play in curbing targeted attacks from anacoaches.</p>
<p>A <a href="https://5rightsfoundation.com/uploads/Pathways-how-digital-design-puts-children-at-risk.pdf">research-based report</a> released this month by the 5Rights Foundation has detailed how minors online are targeted with sexual and suicide-related content. It references platforms including Twitter, TikTok, Instagram, Snapchat, Facebook, Discord, Twitch, Yubo, YouTube and Omegle. </p>
<p>The research showed children as young as 13 are directly targeted with harmful content within 24 hours of creating an account online. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1267&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1267&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1267&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1592&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1592&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410948/original/file-20210713-25-w89ch8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1592&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Screenshot from TikTok.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<p>They may receive unsolicited messages from adults offering pornography, as well as recommendations for eating disorder content, extreme diets, self-harm, suicide and sexualised or distorted body images. </p>
<p>Australia’s policies need to be overhauled to ensure platforms adhere to community guidelines and are held accountable when violations occur. </p>
<p>The government should prescribe set rules, informed by the eSafety office, regarding how vulnerable youth online should be helped.</p>
<p>A nuanced intervention approach would generate better outcomes for users with eating disorders as each user would have a <a href="https://mental.jmir.org/2021/7/e24340/">different set</a> of circumstances and a different mental health state.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1368586388820557826&quot;}"></div></p>
<p>Anacoaches on social media should be treated as criminals, and platforms that fail to act against them should face fines for failing to provide a safe environment for vulnerable users.</p>
<p>In the past the European Union <a href="https://www.bbc.com/news/technology-45495544">has fined</a> platforms for allowing terrorist content. Social media giants have also hired contract workers to screen content for examples of terrorism, paedophilia and abuse. This effort should be extended to include anacoaches. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=246&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=246&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=246&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=309&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=309&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410535/original/file-20210709-15-wu6ic5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=309&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Screenshot taken from Kik.</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The Conversation approached Tumblr for comment but did not receive replies within the deadline allocated. Popular messaging app Kik was <a href="https://techcrunch.com/2019/10/19/medialab-kik-messenger-app-portfolio/">acquired by</a> MediaLab in 2019. The Conversation approached MediaLab for comment but did not receive a response within the allocated timeframe.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1217175326616367107&quot;}"></div></p><img src="https://counter.theconversation.com/content/162938/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Suku Sukunesan receives funding from NHMRC-MRFF examining social media content involving Eating Disorders. </span></em></p>Eating disorder ‘communities’ online can be dangerous places for young and impressionable teens. And social media algorithms further spread harmful content.Suku Sukunesan, Senior Lecturer in Information Systems, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1621722021-06-20T20:18:58Z2021-06-20T20:18:58ZIs your phone really listening to your conversations? Well, turns out it doesn’t have to<figure><img src="https://images.theconversation.com/files/407172/original/file-20210618-27-os1quw.jpeg?ixlib=rb-1.1.0&rect=209%2C7%2C4782%2C2986&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Have you ever chatted with a friend about buying a certain item and been targeted with an ad for that same item the next day? If so, you may have wondered whether your smartphone was “listening” to you. </p>
<p>But is it really? Well, it’s no coincidence the item you’d been interested in was the same one you were targeted with. </p>
<p>But that doesn’t mean your device is actually listening to your conversations — it doesn’t need to. There’s a good chance you’re already giving it all the information it needs. </p>
<h2>Can phones hear?</h2>
<p>Most of us regularly <a href="https://www.emeraldgrouppublishing.com/archived/learning/management_thinking/articles/cookies.htm">disclose our</a> information to a wide range of websites and apps. We do this when we grant them certain permissions, or allow “cookies” to track our online activities.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/94-of-australians-do-not-read-all-privacy-policies-that-apply-to-them-and-thats-rational-behaviour-96353">94% of Australians do not read all privacy policies that apply to them – and that’s rational behaviour</a>
</strong>
</em>
</p>
<hr>
<p>So-called “first-party cookies” allow websites to “remember” certain details about our interaction with the site. For instance, login cookies let you save your login details so you don’t have to re-enter them each time.</p>
<p>Third-party cookies, however, are created by domains that are external to the site you’re visiting. The third party will often be a marketing company in a partnership with the first-party website or app. </p>
<p>The latter will host the marketer’s ads and grant it access to data it collects from you (which you will have given it permission to do — perhaps by clicking on some innocuous looking popup).</p>
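To make the first-party/third-party distinction concrete, here is a minimal Python sketch of the <code>Set-Cookie</code> headers each kind of cookie produces. The domains are hypothetical, and this is an illustration of the mechanism only, not any particular site's setup:

```python
from http.cookies import SimpleCookie

# First-party cookie: set by the site you are actually visiting
# (hypothetical domain), e.g. to remember your login session.
first_party = SimpleCookie()
first_party["session_id"] = "abc123"
first_party["session_id"]["domain"] = "news-site.example"

# Third-party cookie: set by a marketing partner's domain embedded
# in the page. SameSite=None plus Secure is what allows a modern
# browser to send it in cross-site requests, enabling tracking.
third_party = SimpleCookie()
third_party["tracker_id"] = "user-789"
third_party["tracker_id"]["domain"] = "ads.example"
third_party["tracker_id"]["samesite"] = "None"
third_party["tracker_id"]["secure"] = True

print(first_party.output())
print(third_party.output())
```

Because the tracker's domain is the same across every site that embeds its ads, the same <code>tracker_id</code> follows you around the web.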
<p>As such, the advertiser can build a picture of your life: your routines, wants and needs. These companies constantly seek to gauge the popularity of their products and how this varies based on factors such as a customer’s age, gender, height, weight, job and hobbies. </p>
<p>By classifying and clustering this information, advertisers improve their recommendation algorithms, using something called <a href="https://link.springer.com/article/10.1007/s40747-020-00212-w">recommender systems</a> <a href="https://arxiv.org/pdf/2009.06861.pdf">to target</a> the right customers with the right ads.</p>
<h2>Computers work behind the scenes</h2>
<p>There are several machine-learning techniques in artificial intelligence (AI) that help systems filter and analyse your data, such as data clustering, classification, association and <a href="https://bdtechtalks.com/2019/05/28/what-is-reinforcement-learning/">reinforcement learning</a> (RL). </p>
<p>An RL agent can <a href="https://bdtechtalks.com/2021/02/22/reinforcement-learning-ad-optimization/">train itself</a> based on feedback gained from user interactions, akin to how a young child will learn to repeat an action if it leads to a reward.</p>
<p>By viewing or pressing “like” on a social media post, you send a reward signal to an RL agent confirming you’re attracted to the post — or perhaps interested in the person who posted it. Either way, a message is sent to the RL agent about your personal interests and preferences.</p>
<p>If you start actively liking posts about “mindfulness” on a social platform, its system will learn to send you advertisements for companies that can offer related products and content. </p>
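The reward-driven loop described above can be sketched as a toy, bandit-style agent. This is purely illustrative (the topics, scores and update rule are invented for the example, not any platform's real system): each "like" is a reward that nudges the agent toward showing more of that topic.

```python
import random

class AdAgent:
    """Toy reinforcement-style ad picker: likes raise a topic's score."""

    def __init__(self, topics, epsilon=0.1):
        self.scores = {t: 0.0 for t in topics}
        self.epsilon = epsilon  # small chance to explore other topics

    def pick_topic(self):
        # Occasionally explore; otherwise exploit the best-scoring topic.
        if random.random() < self.epsilon:
            return random.choice(list(self.scores))
        return max(self.scores, key=self.scores.get)

    def record_feedback(self, topic, liked):
        # Simple incremental update: likes raise a score, skips lower it.
        self.scores[topic] += 1.0 if liked else -0.1

agent = AdAgent(["mindfulness", "gaming", "travel"])
for _ in range(5):
    agent.record_feedback("mindfulness", liked=True)  # user keeps liking posts
agent.record_feedback("gaming", liked=False)          # user scrolls past

print(agent.pick_topic())  # usually "mindfulness", barring exploration
```

Real systems use far richer models, but the feedback loop is the same: your interactions are the training signal.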
<p>Ad recommendations may be based on other data, too, including but not limited to:</p>
<ul>
<li><p>other ads you clicked on through the platform</p></li>
<li><p>personal details you provided the platform (such as your age, email address, gender, location and which devices you access the platform on)</p></li>
<li><p>information shared with the platform by other advertisers or marketing partners that already have you as a customer</p></li>
<li><p>specific pages or groups you have joined or “liked” on the platform.</p></li>
</ul>
<p>In fact, AI algorithms can help marketers take huge pools of data and use them to construct your entire social network, ranking people around you based on how much you “care about” (interact with) them. </p>
<p>They can then start to target you with ads based on not only your own data, but on data collected from your friends and family members using the same platforms as you. </p>
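At its simplest, ranking the people around you by how much you "care about" them can be a crude interaction count. The names and data here are hypothetical, a bare sketch of the idea rather than how any platform actually weights its social graph:

```python
from collections import Counter

# Hypothetical log of whose posts a user liked, commented on or messaged.
interactions = ["alex", "sam", "alex", "jo", "alex", "sam"]

# Rank contacts by interaction count: a crude proxy for closeness.
ranking = Counter(interactions).most_common()
print(ranking)  # [('alex', 3), ('sam', 2), ('jo', 1)]
```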
<p>For example, Facebook might be able to recommend you something your friend recently bought. It didn’t need to “listen” to a conversation between you and your friend to do this.</p>
<p><div data-react-class="InstagramEmbed" data-react-props="{&quot;url&quot;:&quot;https://www.instagram.com/p/CPhRFBjBX17/?utm_medium=copy_link&quot;,&quot;accessToken&quot;:&quot;127105130696839|b4b75090c9688d81dfd245afe6052f20&quot;}"></div></p>
<h2>Exercising your right to privacy is a choice</h2>
<p>While app providers are <em>supposed</em> to provide clear terms and conditions to users about how they collect, store and use data, nowadays it’s on users to be careful about which permissions they give to the apps and sites they use. </p>
<p>When in doubt, give permissions on an as-needed basis. It makes sense to give WhatsApp access to your camera and microphone, as it can’t provide some of its services without this. But not all apps and services will ask for only what is necessary. </p>
<p>Perhaps you don’t mind receiving targeted ads based on your data, and may find it appealing. <a href="https://hbr.org/2020/10/when-do-we-trust-ais-recommendations-more-than-peoples">Research</a> has shown people with a more “utilitarian” (or practical) worldview actually prefer recommendations from AI to those from humans. </p>
<p>That said, it’s possible AI recommendations can constrain people’s choices and <a href="https://theconversation.com/ai-is-killing-choice-and-chance-which-means-changing-what-it-means-to-be-human-151826">minimise serendipity</a> in the long term. By presenting consumers with algorithmically curated choices of what to watch, read and stream, companies may be implicitly keeping our tastes and lifestyle within a narrower frame.</p>
<h2>Don’t want to be predicted? Don’t be predictable</h2>
<p>There are some simple tips you can follow to limit the amount of data you share online. First, you should review your phone’s app permissions regularly. </p>
<p>Also, think twice before an app or website asks you for certain permissions, or to allow cookies. Wherever possible, avoid using your social media accounts to connect or log in to other sites and services. In most cases there will be an option to sign up via email, which could even be a <a href="https://helpdeskgeek.com/free-tools-review/5-best-free-disposable-email-accounts/">burner email</a>.</p>
<p>Once you do start the sign-in process, remember you only have to share as much information as is needed. And if you’re sensitive about privacy, perhaps consider installing a virtual private network (VPN) on your device. This will mask your IP address and encrypt your online activities.</p>
<h2>Try it yourself</h2>
<p>If you still think your phone is listening to you, there’s a simple experiment you can try.</p>
<p>Go to your phone’s settings and restrict access to your microphone for all your apps. Pick a product you know you haven’t searched for on any of your devices and talk about it out loud at some length with another person. </p>
<p>Make sure you repeat this process a few times. If you still don’t get any targeted ads within the next few days, this suggests your phone isn’t really “listening” to you. </p>
<p>It has other ways of finding out what’s on your mind.</p><img src="https://counter.theconversation.com/content/162172/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dana Rezazadegan is affiliated with Swinburne University of Technology. She is Superstar of STEM at Science and Technology Australia and Honorary fellow at Macquarie University.</span></em></p>Have you ever been targeted with ads that are scarily specific to you, and wondered how the app or website could have known?Dana Rezazadegan, Lecturer, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1574492021-03-30T08:49:02Z2021-03-30T08:49:02ZSelfie culture: what your choice of camera angle says about you<figure><img src="https://images.theconversation.com/files/391936/original/file-20210326-17-14100gt.jpg?ixlib=rb-1.1.0&rect=0%2C1311%2C2630%2C1653&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Camera angles and selfie composition are proxies for how you might position yourself in a room </span> <span class="attribution"><a class="source" href="https://pxhere.com/en/photo/794140">PxHere</a>, <a class="license" href="http://artlibre.org/licence/lal/en">FAL</a></span></figcaption></figure><p>Over the past decade, selfies have become a mainstay of popular culture. If <a href="https://www.theguardian.com/technology/2013/jul/14/how-selfies-became-a-global-phenomenon">the #selfie hashtag first appeared in 2004</a>, it was the release of the iPhone 4 in 2010 that saw the pictures go viral. Three years later, the Oxford English Dictionary crowned “selfie” word of the year. </p>
<p>We use selfies for <a href="https://iafor.org/journal/iafor-journal-of-cultural-studies/volume-2-issue-2/article-5/">a variety of purposes</a>, ranging from the social to the professional. <a href="https://www.statista.com/statistics/304861/us-adults-shared-selfie-generation/">According to a 2018 survey</a>, 82% of US adults under 34 had posted a selfie on social media. Until the pandemic hit pause on public gatherings, an entire industry was dedicated to generating selfie <a href="https://www.nytimes.com/2018/09/26/arts/color-factory-museum-of-ice-cream-rose-mansion-29rooms-candytopia.html">events</a> and <a href="https://www.wired.com/story/selfie-factories-instagram-museum/">museums</a>. </p>
<p>Given this tremendous reach and popularity, the last four years have seen the phenomenon begin to <a href="https://www.frontiersin.org/research-topics/4557/understanding-selfies">receive attention</a> within the cognitive sciences. As recent studies have shown, <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0238588">including a recent one I led</a>, the way we take selfies – and the specific camera angles we choose – varies depending on what we intend to do with them.</p>
<h2>The left bias</h2>
<p><a href="https://www.nature.com/articles/243271a0">Since the 1970s</a> we have known that in historical western portraiture, artists favoured depicting the left cheek of their sitters, particularly when painting women. A 2017 study showed that when it comes to taking selfies, <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01460/full">people tend to angle their smartphone in order to photograph their own left cheek too</a>. </p>
<p>Patterns have also been detected in the way selfie-takers position their cameras vertically. <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00604/full">Another 2017 study</a> of selfies posted on Tinder found that when looking to hook up, women most often choose to shoot their selfies from above, and men from below. </p>
<p>My colleagues and I looked at how this might vary on a different platform. <a href="https://journals.plos.org/plosone/article/peerReview?id=10.1371/journal.pone.0238588">We considered</a> 2,000 selfies posted on a random sample of 200 different Instagram accounts – ten selfies per person. For each selfie, we recorded the gender of the user as apparent from the photograph, and whether they took their selfie from above, from below or frontally. We found that all the users – regardless of gender – tended to place the camera above their heads. </p>
<figure class="align-right ">
<img alt="A woman in costume holds up a smartphone with a selfie she's just taken" src="https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=900&fit=crop&dpr=1 600w, https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=900&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=900&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1131&fit=crop&dpr=1 754w, https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1131&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/391932/original/file-20210326-21-13blf8s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1131&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Selfies have been defined as a form of self-disclosure.</span>
<span class="attribution"><a class="source" href="https://pxhere.com/en/photo/1551241?utm_content=shareClip&utm_medium=referral&utm_source=pxhere">pxhere.com</a>, <a class="license" href="http://artlibre.org/licence/lal/en">FAL</a></span>
</figcaption>
</figure>
<p>These differences in camera position create different kinds of selfie. The question is why: how do these choices relate to what the selfies are being used for, and to the platforms they’re posted on?</p>
<h2>Facial expressivity</h2>
<p>Most <a href="https://www.citylit.ac.uk/blog/how-take-great-selfie?gclid=Cj0KCQjw9YWDBhDyARIsADt6sGbnSRgyjDVuNGS6N_GYZkEhQtrEuYJqpBqpXUHOyeCjD7LWKnnfIHIaAlS5EALw_wcB">“how to take the best selfie” guides</a> emphasise that photographing your face at an angle and from above makes you look better. </p>
<p>This is borne out by <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00604/full">a study of Tinder selfies</a> in which the authors determined that men took selfies from below partly in an attempt to appear taller and therefore more masculine. Women taking selfies from above, meanwhile, was said to achieve the opposite, making them look shorter and more feminine. </p>
<p>Elsewhere, <a href="https://core.ac.uk/download/pdf/77612791.pdf">research</a> has looked at the early trends in selfie poses and how some were about angling and composing your face so as to look thinner and more vulnerable – which is also equated with being more attractive. </p>
<p>In trying to explain why a historical painter might have preferred the left side of their sitter’s face, researchers <a href="https://www.nature.com/articles/243271a0">explored several possibilities</a>. These ranged from whether the artist was left or right-handed, where the sitter sat in relation to the painter, or whether there was, in fact, a superiority of the left visual half-field in facial recognition: in other words, might a profile painted to the left of the canvas be more easily perceived? </p>
<p>The data, though, were inconclusive on all those theories, save perhaps the possibility, the authors of the study said, of a basic visual preference. It might be, they suggested, that we simply find the left side more attractive than the right. In selfies, both left and right-handed people showed the same left-cheek bias – so here too, it’s not about handedness. Instead, this prevalence suggests that we know, instinctively, that showing our left side is the better option. </p>
<p><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3566189/">Recent evidence</a> provides a clearer reason why this might be. The left side of the face is controlled by the right hemisphere of the brain, which in turn is responsible for <a href="https://pubmed.ncbi.nlm.nih.gov/7220762/">communicating emotions</a>. Thus, the left side is the more emotionally expressive. </p>
<p>Researchers have also found that we tend to perceive ourselves as <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5318447/#:%7E:text=Previous%20research%20indicates%20that%20left,et%20al.%2C%202016">more attractive and likable in our selfies</a>, than in photographs other people take of us. </p>
<p>The degree of expressivity we go for depends on what we intend to communicate, and the platform we’re communicating on. By showing the left cheek – or shooting from above – we look more expressive. Placing the camera frontally, meanwhile, achieves a neutral look. </p>
<h2>Selfie proxemics</h2>
<p>Selfie-takers, in their choice of pose and other pictorial features, are providing nonverbal, social and emotional signals to their viewers. These signals <a href="https://www.frontiersin.org/articles/10.3389/fcomp.2020.00012/full">can be thought of</a> as the 2D equivalent of the nonverbal signals that we use in face-to-face communication. </p>
<p>In person, individuals control their posture and facial expressions, and how far they stand from each other, to express degrees of intimacy or avoidance. Since Edward Hall’s seminal 1960s work, <a href="https://escholarship.org/uc/item/4774h1rm">The Hidden Dimension</a>, we have called this spacing behaviour or proxemics. </p>
<p>In selfies, as in photography or <a href="https://doi.org/10.1162/105474601753272844">cinematography</a>, you have only got pictorial space to play with. But this too provides a set of proxemics: the way the subject is oriented, any left-right asymmetry in the composition, questions of relative size between objects in the frame. </p>
<p>These variables, which are determined through the distance from the camera, and, crucially, the camera angle, contribute to non-verbally communicating the selfie-taker’s motivations, intentions, or emotional states. </p>
<p>This chimes with the way selfies have been defined as a form of <a href="https://iafor.org/journal/iafor-journal-of-cultural-studies/volume-2-issue-2/article-5/">self-disclosure</a>. It’s not just about someone presenting or representing themselves, pictorially, in <a href="https://blogs.getty.edu/iris/whats-the-difference-between-a-selfie-and-a-self-portrait/">the way that self-portraits do</a> (a difference which my current research is looking at), but a means of revealing personal information within a dialogue. </p>
<p>The throwaway nature of the selfie sets it apart from the more considered, artistic intention of a self-portrait, as does the way a selfie is all about context and interaction. As the writer and theorist <a href="https://museuminabottle.com/2015/01/22/whats-the-difference-between-a-selfie-and-a-self-portrait/">behind the Museum Selfies tumblr</a> puts it, “selfies are shared as part of a conversation”.</p><img src="https://counter.theconversation.com/content/157449/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alessandro Soranzo does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Selfie takers often choose to shoot the left side of their face, from above. But why exactly is that thought to make you look better?Alessandro Soranzo, Reader in Experimental Psychology, Sheffield Hallam UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1561282021-03-09T19:08:49Z2021-03-09T19:08:49ZNew evidence shows half of Australians have ditched social media at some point, but millennials lag behind<figure><img src="https://images.theconversation.com/files/388444/original/file-20210309-21-13mxx29.jpg?ixlib=rb-1.1.0&rect=25%2C10%2C2409%2C1664&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A recent nationally representative survey has shown Australians are willing and able to pull the plug on social media. </p>
<p>But it turns out the generation you were born in, as well as your level of education, will likely have a bearing on whether you do. This is important, as recent events have set the precedent for tech giants to pull or change content at any time. </p>
<p>Short-lived as it was, Facebook’s removal of Australian news raised interesting questions about our dependence on social media and whether we can do without it. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/google-is-leading-a-vast-covert-human-experiment-you-may-be-one-of-the-guinea-pigs-154178">Google is leading a vast, covert human experiment. You may be one of the guinea pigs</a>
</strong>
</em>
</p>
<hr>
<h2>Growing frustration with platforms</h2>
<p>Facebook’s actions (coupled with Google’s earlier threat to pull its Search function from Australia) prompted widespread criticism. </p>
<p>Twitter users got #deletefacebook <a href="https://www.cnbc.com/2021/02/19/australians-respond-to-facebooks-news-ban.html">trending</a>, while news columns called on Australians to <a href="https://meanjin.com.au/blog/what-is-news-and-who-decides/">consider</a> <a href="https://www.theguardian.com/commentisfree/2021/feb/19/facebook-is-gambling-australia-cant-live-without-it-imagine-if-we-prove-them-wrong">distancing</a> themselves from the platform. But it’s difficult to know exactly how many did.</p>
<p>The Australian Survey of Social Attitudes (AUSSA) is one of <a href="https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/">few</a> studies uniquely placed to provide a balanced view on Australians’ social media use.</p>
<p>The randomised, nationally representative sample of the Australian population captures those who have never used social media, those who have curbed their use and those who have never stopped or reduced their use.</p>
<p>Results from the <a href="https://www.acspri.org.au/aussa/2019">2019–20 survey</a> show many Australians have either cut back on social media, or quit it altogether. Half the respondents had reduced their use at some point. </p>
<h2>Reasons for disconnecting</h2>
<p>People disconnect from social media for various reasons. These include <a href="https://www.theguardian.com/news/2018/mar/26/the-cambridge-analytica-files-the-story-so-far">concerns over privacy</a>, an “always on” <a href="https://journals.sagepub.com/doi/full/10.1177/1461444817711449">digital culture</a>, pressure from <a href="https://dl.acm.org/doi/10.1145/2470654.2466446">being on display to the public</a> and pressure from comparing <a href="https://dl.acm.org/doi/abs/10.1016/j.chb.2016.07.049">oneself to others</a>.</p>
<p>Others hold practical concerns such as <a href="https://dl.acm.org/doi/10.1145/2470654.2466446">wasting time</a>, being too busy to use social media, losing interest or being <a href="https://www.pewresearch.org/internet/2013/02/05/coming-and-going-on-facebook/">bored</a>. The majority (52%) of AUSSA respondents cited “boredom” and “time wasting” as the main reasons for limiting social media use. </p>
<p>Considering this, Facebook’s threat to become news-free may have constituted self-sabotage; it would have made the platform a blander, less informative and more disposable space.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/if-facebook-really-pulls-news-from-its-australian-sites-well-have-a-much-less-compelling-product-145380">If Facebook really pulls news from its Australian sites, we'll have a much less compelling product</a>
</strong>
</em>
</p>
<hr>
<p>Australians registered other concerns too, but in lower numbers. For instance, 18% cited frustration with online personas (such as excessive social comparisons and inauthenticity) as their main reason for disconnecting, while 15% cited privacy concerns.</p>
<p>Meanwhile, 14% of respondents had never used social media and 36% continued to use it consistently.</p>
<h2>Breakdown by education</h2>
<p>Past research has raised concerns over “<a href="https://isiarticles.com/bundles/Article/pre/pdf/115030.pdf">internet addiction</a>” — becoming so embedded in social media that it is difficult to exit. </p>
<p>And the AUSSA survey reveals some of us seem more likely (and possibly more able) than others to disconnect from digital life. </p>
<p>Education was an important predictor of social media use and disconnection. Of those who hadn’t completed high school, 45% had reduced their social media use. </p>
<p>This rose to 51% among those with a high-school or post-school certificate — and to 56% among degree holders.</p>
<p>The link between higher education and social media use speaks to a certain “privilege of disconnection”, whereby the choice to disengage is easier for those with certain resources. </p>
<p>For example, when tertiary-educated people give up social media, they may be better placed to replace the networks and information lost with other sources of connection and capital. </p>
<h2>Generational gaps</h2>
<p>There were also notable differences in social media use between generations, although usage generally increased as generations became younger. </p>
<p>Of the Silent Generation (currently 76-93 years old), 40% had never used social media. This dropped to 0% among Gen Z (9-24 years old). </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=311&fit=crop&dpr=1 600w, https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=311&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=311&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=391&fit=crop&dpr=1 754w, https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=391&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/388148/original/file-20210307-23-7m30kv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=391&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">This graph shows the proportion of respondents from each generation who’d never used social media platforms.</span>
<span class="attribution"><span class="source">Roger Patulny</span></span>
</figcaption>
</figure>
<p>At 62%, Gen X (41-56 years old) led the way in social media reduction and disconnection. They were significantly more likely than baby boomers (57-75 years old) to have used social media and then disconnected. </p>
<p>But the rates of reduction and disconnection among millennials (25-40 years old) decreased, before increasing again for Gen Z. Millennials were also much more likely than Gen X to have never reduced their social media use at any point.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=351&fit=crop&dpr=1 600w, https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=351&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=351&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=441&fit=crop&dpr=1 754w, https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=441&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/388149/original/file-20210307-19-1e1893x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=441&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The proportion of each generation which either reduced or ceased social media usage.</span>
<span class="attribution"><span class="source">Roger Patulny</span></span>
</figcaption>
</figure>
<p>The relatively lower disconnection rate and higher usage rate among millennials is perhaps concerning. </p>
<p>This group may simply not have found a good reason to disconnect. However, since millennials were raised with social media strongly integrated into their teenage and adult lives, it may be harder for them to kick the habit when needed. </p>
<p>The slight increase in disconnection among Gen Z is telling here, as it suggests the generation that followed millennials may have developed a little more critical awareness of the downsides of making social media omnipresent in one’s life.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Young people studying together." src="https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/388449/original/file-20210309-21-knmtvh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It’s often assumed school-aged kids are the most obsessed with social media. But while they might use it often, this happens alongside a growing awareness of the potential harms of excessive use.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Managing a challenging relationship</h2>
<p>The survey findings suggest social media use is indeed ubiquitous among young people. </p>
<p>But they also suggest claims of a widespread rise in “internet addiction” are excessive, since the majority of respondents from Gen X onward had either reduced or halted their social media use. </p>
<p>This is good news. Tech platforms at times have shown an ethically questionable willingness to sacrifice our privacy and agency for their own gain, with both <a href="https://www.pnas.org/content/111/24/8788.full">Facebook</a> and Google guilty of covertly experimenting on users in the past.</p>
<p>These survey findings suggest we have some agency of our own. Tech giants can’t rely on user loyalty or inertia, and certainly not on addiction. </p>
<p>Users may happily switch platforms — or switch off altogether — if they continue to be treated like bargaining chips in business deals. Big tech, take note.</p><img src="https://counter.theconversation.com/content/156128/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Roger Patulny does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Gen X is leading the way in kicking the social media habit. And concerns about an overall ‘internet addiction’ seem overblown.Roger Patulny, Associate Professor of Sociology, University of WollongongLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1529242021-01-08T22:46:21Z2021-01-08T22:46:21ZTwitter permanently suspends Trump after U.S. Capitol siege, citing risk of further violence<figure><img src="https://images.theconversation.com/files/377847/original/file-20210108-15-1wwugk0.jpg?ixlib=rb-1.1.0&rect=36%2C27%2C6002%2C3983&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Twitter and Facebook suspended Donald Trump's accounts after his posts commenting on the Capitol riot.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>In the wake of the assault on the U.S. Capitol by Donald Trump’s supporters, Twitter has permanently suspended the president’s account “<a href="https://blog.twitter.com/en_us/topics/company/2020/suspension.html">due to the risk of further incitement of violence</a>.”</p>
<figure class="align-center ">
<img alt="Screenshot of @realDonaldTrump's suspended Twitter account" src="https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=304&fit=crop&dpr=1 600w, https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=304&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=304&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=382&fit=crop&dpr=1 754w, https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=382&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/377859/original/file-20210108-23-3wr78m.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=382&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">@realDonaldTrump’s Twitter account is now permanently suspended.</span>
<span class="attribution"><a class="source" href="https://twitter.com/realDonaldTrump">Twitter</a></span>
</figcaption>
</figure>
<p>This follows a growing number of <a href="https://twitter.com/MichelleObama/status/1347284244763127810">influential voices</a> calling on social media platforms to stop enabling Trump’s inflammatory postings.</p>
<p>Prior to Twitter’s permanent ban, Twitter and Facebook had locked <a href="https://www.nytimes.com/2021/01/06/technology/capitol-twitter-facebook-trump.html">Trump’s account on their respective platforms</a>. Both companies took this unprecedented step after the president used the platforms to spread misinformation about <a href="https://www.washingtonpost.com/video/politics/fact-checking-trumps-jan-6-speech-to-stop-the-steal-protesters/2021/01/06/7037b1ec-3a64-4f3a-83ee-d21764b3f40f_video.html">election fraud</a> and openly condoned violence on Capitol Hill.</p>
<p>At one point, Trump posted a video <a href="https://www.theguardian.com/us-news/live/2021/jan/07/joe-biden-donald-trump-mike-pence-capitol-congress-us-election-coronavirus-live-updates">lauding the mob</a> as “<a href="https://www.bloomberg.com/news/articles/2021-01-06/trump-says-stay-peaceful-after-urging-supporters-to-protest">great patriots</a>.”</p>
<h2>Trump’s suspensions</h2>
<p>Twitter suspended the president’s account for <a href="https://twitter.com/TwitterSafety/status/1346970432017031178">12 hours</a> and required Trump to remove three tweets “for repeated and severe violations of [Twitter’s] Civic Integrity policy.” The company further warned the president that any future violations of their policies “will result in <a href="https://twitter.com/TwitterSafety/status/1346970433049022471">permanent suspension</a> of the @realDonaldTrump account.” </p>
<p>Facebook quickly followed suit and suspended Trump’s account. Initially the company announced a <a href="https://about.fb.com/news/2021/01/responding-to-the-violence-in-washington-dc/">24-hour</a> suspension, but in a new post the day after the Capitol attack, <a href="https://about.fb.com/news/2021/01/responding-to-the-violence-in-washington-dc/">Facebook said it would be placing a block</a> on Trump’s “Facebook and Instagram accounts indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”</p>
<p>After Twitter unblocked his account, Trump wrote two tweets on Jan. 8 that the company said could incite further violence and violated its Glorification of Violence policy. Twitter permanently suspended @realDonaldTrump from using the service.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/L9UIWjIfIFA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">SkyNews covers Trump’s suspension by Twitter and Facebook.</span></figcaption>
</figure>
<p>These decisions represent a major escalation by Twitter and Facebook. It’s also a reversal of their previous controversial policy to not block politicians such as Trump, but instead simply reduce engagement and <a href="https://variety.com/2020/digital/news/election-day-2020-facebook-twitter-misinformation-declare-victory-1234821661/">to apply misinformation warning labels</a>. </p>
<p>Digital democracy experts and social media researchers have <a href="https://www.bnnbloomberg.ca/social-media-giants-shown-up-once-again-in-effort-to-combat-election-misinformation-1.1516362">long warned</a> that on social media, speed is the enemy and that labels alone are not enough. A post with a misleading or false claim by a politician can spread in mere seconds — labelling a post as such after the fact cannot undo the damage already done.</p>
<h2>Handmaidens to the chaos</h2>
<p>While much of the blame for this week’s Capitol Hill attack lies with Trump, his enablers in the Republican Party and right-wing media, social media platforms have played handmaidens to the chaos. </p>
<p>Social media platforms have long touted the benefits of “<a href="https://techcrunch.com/2017/06/22/bring-the-world-closer-together/">connecting people</a>” and bringing them together in new communities, while downplaying societal costs. What goes unmentioned in all the glossy advertisements and news releases is that these platforms also connect and enable extremists and conspiracy theorists.</p>
<p>In their blind pursuit of eyeballs and profits, social media platforms have built technologies that amplify hate, misinformation and conspiracy theories. The recommendation and personalization algorithms they have created have facilitated the rise of extremism and conspiracy theories, which in turn have undermined people’s trust in each other, and in governments and institutions.</p>
<p>In the early days of the pandemic, such algorithms helped fan the flames of rampant <a href="https://www.mcgill.ca/oss/article/covid-19-pseudoscience/anti-vaccine-movement-2020">COVID-19 denialism and extremism</a>, and impeded the work of public health officials. </p>
<p>In a study we conducted last year at the <a href="https://socialmedialab.ca/">Ryerson University Social Media Lab</a>, we were able to show how a single tweet, with a simple hashtag #FilmYourHospital, can spawn a viral COVID-19 conspiracy theory that quickly spreads around the world from the U.S. to Brazil to Japan.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/conspiracy-theorists-are-falsely-claiming-that-the-coronavirus-pandemic-is-an-elaborate-hoax-135985">Conspiracy theorists are falsely claiming that the coronavirus pandemic is an elaborate hoax</a>
</strong>
</em>
</p>
<hr>
<h2>Algorithmic promotion</h2>
<p>Facebook uses algorithms to amplify content. According to an internal 2018 presentation, Facebook’s <a href="https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499">own researchers</a> found that “64 per cent of all extremist group joins are due to our recommendation tools.” Despite this finding, <a href="https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499">Facebook executives sidelined the research</a>. </p>
<p>This decision is problematic in the aftermath of the recent U.S. election, when Facebook’s algorithmic recommendations helped to make the “Stop the Steal” Facebook groups — a hotbed for U.S. election misinformation — “<a href="https://www.forbes.com/sites/jackbrewster/2020/11/06/facebook-banned-stop-the-steal-then-other-groups-popped-up-in-its-place/">one of the fastest-growing Facebook groups in history</a>.” In the days after the U.S. election, Facebook’s algorithm drove 100 new people to join the first “Stop The Steal” group every 10 seconds, <a href="https://www.nytimes.com/2020/11/05/technology/stop-the-steal-facebook-group.html">helping the group to amass more than 320,000 members in a little over 20 hours</a>.</p>
<h2>Lost social media megaphone</h2>
<p>Trump’s penchant for lying and inciting his supporters on social media is finally being taken to task by social media platforms. But it took an insurrection for Trump to lose his social media megaphone.</p>
<figure class="align-center ">
<img alt="Donald Trump types on his mobile phone while sitting next to a glass of water" src="https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&rect=12%2C6%2C4071%2C2712&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/377836/original/file-20210108-13-esk8eq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">President Donald Trump looks at his phone during a roundtable with governors in the White House, on June 18, 2020.</span>
<span class="attribution"><span class="source">(AP Photo/Alex Brandon)</span></span>
</figcaption>
</figure>
<p>The riot at the Capitol was the culmination of a decade of unfettered growth, combined with lackadaisical self-regulation by digital platforms. Social media platforms are now at a point where their inattention endangers our health and democratic systems. </p>
<p>For too long, the governments and regulators tasked with oversight of these companies have abdicated their responsibility and allowed them to grow unchecked. Allowing platforms to continue arbitrarily labelling a post here and banning an account there is a reminder of the complete lack of transparency that governs decision-making at social media companies. </p>
<p>Once the dust from the Capitol attack settles, social media companies will likely have to contend with an empowered Congress, united in its desire to act and, at long last, create a meaningful check on the power of social media platforms.</p><img src="https://counter.theconversation.com/content/152924/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Philip Mai does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Twitter has permanently suspended the account of the U.S. president, saying Donald Trump’s tweets glorified the violence of the siege on the Capitol.Philip Mai, Co-director and Senior Researcher, Ryerson Social Media Lab, Toronto Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1475092020-10-21T15:45:32Z2020-10-21T15:45:32ZWe must make moral choices about how we relate to social media apps<figure><img src="https://images.theconversation.com/files/364456/original/file-20201020-14-nybccz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">The Social Dilemma/Netflix</span></span></figcaption></figure><p>Recently a South African <a href="https://www.kfm.co.za/Show/kfm-breakfast">radio show</a> asked, “If you had to choose between your mobile phone and your pet, which would you choose?” Think about that for a moment. Many callers responded that they would choose their phone. I was shocked… But to be honest, I give more attention to my phone than to my beloved dogs!</p>
<p>Throughout history there have been discoveries that have changed society in unimaginable ways. Written language made it possible to communicate over space and time. The printing press, say historians, helped shape societies <a href="https://www.jstor.org/stable/24357082">through</a> the mass dissemination of ideas. New modes of transport <a href="https://hrcak.srce.hr/index.php?id_clanak_jezik=237992&show=clanak">radically transformed</a> social norms by bringing people into contact with new cultures.</p>
<p>Yet these pale in comparison to how the internet is shaping, and misshaping, our individual and social <a href="https://www.counterpointknowledge.org/social-media-as-religion-unexamined-desire-and-mis-information/">identities</a>. I remember the first time I heard a teenager speaking with an American accent and discovered she’d never been out of South Africa but picked up her accent from watching YouTube. We shape our technologies, but they also shape us. </p>
<p>The potentially negative impacts of social media have again been highlighted by <a href="https://www.imdb.com/title/tt11464826/"><em>The Social Dilemma</em></a> on Netflix. The documentary, which Facebook has <a href="https://www.indiewire.com/2020/10/facebook-response-the-social-dilemma-1234590361/">slammed</a> as sensational and unfair, shows how dominant and largely unregulated social media companies manipulate users by harvesting personal data, while using <a href="https://theconversation.com/do-social-media-algorithms-erode-our-ability-to-make-decisions-freely-the-jury-is-out-140729">algorithms</a> to push information and ads that can lead to social media addiction – and dangerous anti-social behaviour. Among others, the show makes an example of the conspiracy theory <a href="https://theconversation.com/how-qanon-uses-satanic-rhetoric-to-set-up-a-narrative-of-good-vs-evil-146281">QAnon</a>, which is <a href="https://www.dailymaverick.co.za/article/2020-09-26-qanon-originated-in-south-africa-now-that-the-global-cult-is-back-here-we-should-all-be-afraid/">increasingly</a> <a href="https://www.thedailybeast.com/qanon-targets-africa-with-new-conspiracy-that-democrats-are-stealing-local-children">targeting</a> Africans.</p>
<p>Despite its flaws, the doccie got me wondering what our relationship to social media should be. As an ethics professor, I’ve come to realise that we must make moral choices about how we relate to our technologies. This requires an honest evaluation of our needs and weaknesses, and a clear understanding of the intentions of these platforms. </p>
<h2>Tug-of-war with technology</h2>
<p><a href="https://www.ynharari.com">Yuval Noah Harari</a>, author of <a href="https://www.theguardian.com/books/2014/sep/11/sapiens-brief-history-humankind-yuval-noah-harari-review"><em>Sapiens</em></a>, contends it’s our ability to inhabit “fiction” that differentiates humans. <a href="https://www.harpercollins.com/products/sapiens-yuval-noah-harari?variant=32207215656994">He claims</a> you “could never convince a monkey to give you a banana by promising him limitless bananas after death in monkey heaven”. Humans have a capacity to believe in things we cannot see – which changes things that do exist. Ideas like prejudice and hatred, for example, are powerful enough to cause wars that displace thousands. </p>
<p>The wall between Israel and Palestine was conceived in people’s minds before being transformed into bricks and barbed wire. Philosopher Olivier Razac’s book <a href="https://thenewpress.com/books/barbed-wire"><em>Barbed Wire: A Political History</em></a> traces how this razor-sharp technology has been deployed from farms that displaced indigenous peoples to the trenches of World War I and the prisons of contemporary democracies. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&rect=344%2C2%2C1572%2C778&q=45&auto=format&w=1000&fit=clip"><img alt="A young woman in a bathroom is engaged with her mobile phone, reflected in a mirror." src="https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&rect=344%2C2%2C1572%2C778&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=245&fit=crop&dpr=1 600w, https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=245&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=245&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=308&fit=crop&dpr=1 754w, https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=308&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/364455/original/file-20201020-22-1v3ttyf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=308&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Sophia Hammons as Isla in <em>The Social Dilemma</em>.</span>
<span class="attribution"><span class="source">The Social Dilemma/Netflix</span></span>
</figcaption>
</figure>
<p>Technology is in a constant psychological, political and economic tug-of-war with humanity. Yet, some of today’s technologies are much more subtle than barbed wire. They are deeply <a href="https://books.google.co.za/books?hl=en&lr=&id=9wq9DwAAQBAJ&oi=fnd&pg=PA85&dq=info:gxEdWsbuE_0J:scholar.google.com&ots=5b6P23i9n9&sig=oonwZAiBsas7XNjTpP7e8pXq2XM&redir_esc=y#v=onepage&q&f=false">integrated into</a> our lives – they know us better than we know ourselves.</p>
<p>I have thousands of ‘friends’ on social media – far too many to relate to meaningfully. Yet, at times I can be more present to people that I have never met than I am to my family. This is not by chance – social media platforms are <a href="https://www.counterpointknowledge.org/social-media-as-religion-unexamined-desire-and-mis-information/">designed</a> to seek and hold our attention. They are businesses, intent on making money. Harvard University professor <a href="https://www.theguardian.com/books/2019/oct/04/shoshana-zuboff-surveillance-capitalism-assault-human-automomy-digital-privacy">Shoshana Zuboff</a>, who features in the documentary, explains in <a href="https://profilebooks.com/the-age-of-surveillance-capitalism.html"><em>The Age of Surveillance Capitalism</em></a> that social media “trades exclusively in human futures”.</p>
<h2>We are the product</h2>
<p>Zuboff says that social media platforms exploit our emotions and pre-cognitive needs – belonging, recognition, acceptance and pleasure – that are ‘hard-wired’ into us to secure our survival. </p>
<p>Recognition relates to two of the primary <a href="https://books.google.co.za/books/about/The_Primal_Feast.html?id=TJF_xQAuLOYC&redir_esc=y">functions of the brain</a>: avoiding danger and finding ways to meet our basic survival needs (such as food, or a mate to perpetuate our gene pool). These corporations, she says, are hiring the smartest engineers, social psychologists, behavioural economists and artists to hold our attention, while interspersing adverts between our videos, photos and status updates. They make money by selling advertisers predictions of what we will do next. </p>
<p>Or, as former Google and Facebook employee Justin Rosenstein says in <em>The Social Dilemma</em>:</p>
<blockquote>
<p>Our attention is the product being sold to advertisers. </p>
</blockquote>
<p>If our adult brains are so susceptible to this kind of manipulation, what effects are they having on the developing minds of children?</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/uaaC57tcci0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Trailer for <em>The Social Dilemma</em>.</span></figcaption>
</figure>
<p>The documentary also reminds the viewer that social media has a more subtle and powerful influence on our lives – shaping our social and political realities. </p>
<h2>Fake news and hate speech</h2>
<p>The documentary uses an example from 2017 in which Facebook use is linked to <a href="https://www.reuters.com/article/us-facebook-india-content-idUSKBN1X929F">violence</a> that led to the displacement of close to 700,000 Rohingya people in Myanmar. Something that doesn’t really exist (a social media platform) violently changed something that does exist (the safety of people). Facebook was a primary means of communication in Myanmar. New phones came with Facebook pre-installed. What users were unaware of was a ‘third person’ – Facebook’s algorithms – feeding information that included hate speech and fake news into their conversations. In Africa, similar reports have emerged from <a href="https://www.buzzfeednews.com/article/jasonpatinkin/how-to-get-people-to-murder-each-other-through-fake-news-and#.cfxZRym4z">South Sudan</a> and <a href="https://theconversation.com/a-vicious-online-propaganda-war-that-includes-fake-news-is-being-waged-in-zimbabwe-99402">Zimbabwe</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/netflixs-the-social-dilemma-highlights-the-problem-with-social-media-but-whats-the-solution-147351">Netflix's The Social Dilemma highlights the problem with social media, but what's the solution?</a>
</strong>
</em>
</p>
<hr>
<p>Another example used is the <a href="https://www.theguardian.com/technology/2019/mar/17/the-cambridge-analytica-scandal-changed-the-world-but-it-didnt-change-facebook">Cambridge Analytica</a> <a href="https://theconversation.com/why-facebook-is-the-reason-fake-news-is-here-to-stay-94308">scandal</a>, which also played out in <a href="https://qz.com/africa/1089911/bell-pottinger-and-cambridge-analyticas-work-in-south-africa-kenya-is-raising-questions/">Africa</a>, most notably in <a href="https://theconversation.com/how-the-nigerian-and-kenyan-media-handled-cambridge-analytica-128473">Nigeria and Kenya</a>. Facebook user information was mined and sold to nefarious political actors. This information (like what people feared and what upset them) was used to spread misinformation and manipulate their voting decisions on important elections.</p>
<h2>What to do about it?</h2>
<p>So, what do we do? We can’t very well give up on social media completely, and I don’t think it is necessary. These technologies are already deeply intertwined with our daily lives. We cannot deny they have some value. </p>
<p>However, just like humans had to adapt to the responsible use of the printing press or long distance travel, we will need to be more intentional about how we relate to these new technologies. We can begin by cultivating healthier social media <a href="https://books.google.co.za/books?hl=en&lr=&id=9wq9DwAAQBAJ&oi=fnd&pg=PA85&dq=info:gxEdWsbuE_0J:scholar.google.com&ots=5b6P23i9n9&sig=oonwZAiBsas7XNjTpP7e8pXq2XM&redir_esc=y#v=onepage&q&f=false">habits</a>.</p>
<p>We should also develop a greater awareness of the aims of these companies and how they achieve them, while understanding how our information is being used. This will allow us to make some simple commitments that align our social media usage with our better values.</p>
<p class="fine-print"><em><span>Dion Forster receives funding from the South African National Research Foundation. </span></em></p>As more comes to light about the money-making tactics of social media platforms we need to reevaluate our relationship with them.Dion Forster, Head of Department, Systematic Theology and Ecclesiology, Stellenbosch UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1479482020-10-15T12:06:34Z2020-10-15T12:06:34Z‘Film Your Hospital’ – the anatomy of a COVID-19 conspiracy theory<figure><img src="https://images.theconversation.com/files/363448/original/file-20201014-17-c4310k.jpg?ixlib=rb-1.1.0&rect=916%2C584%2C3104%2C2091&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/fake-pandemic-conspiracy-theory-text-on-1740541940">Shutterstock/MartaDM</a></span></figcaption></figure><p>It’s widely believed that social media conspiracy theories are driven by malicious and anonymous “bots” set up by shadowy third parties. But <a href="https://www.jmir.org/2020/10/e22374#ref9">my new research</a> – which examined an extremely successful COVID-19 conspiracy theory – has shown that ordinary citizen accounts can be just as culpable when it comes to spreading dangerous lies and misinformation.</p>
<p>The pandemic has fuelled <a href="https://allianceforscience.cornell.edu/blog/2020/04/covid-top-10-current-conspiracy-theories/">at least ten conspiracy theories</a> this year. Some linked the spread of the disease to <a href="https://www.jmir.org/2020/10/e22374#ref4">the 5G network</a>, leading to phone masts being vandalised. Others argued that <a href="https://theconversation.com/coronavirus-is-not-a-bioweapon-but-bioterrorism-is-a-real-future-threat-135984">COVID-19 was a biological weapon</a>. Research has shown that conspiracy theories could contribute to people <a href="https://www.cambridge.org/core/journals/psychological-medicine/article/coronavirus-conspiracy-beliefs-mistrust-and-compliance-with-government-guidelines-in-england/9D6401B1E58F146C738971C197407461/core-reader">ignoring social distancing rules</a>. </p>
<p>The <em>#FilmYourHospital</em> movement was one such theory. It encouraged people to record videos of themselves in seemingly empty, or less-than-crowded, hospitals to <a href="https://globalnews.ca/news/6777373/bc-man-hot-water-covid-19-test-site-hospital/">prove the pandemic is a hoax</a>. Many videos <a href="https://observers.france24.com/en/20200422-debunked-fake-film-hospital-lies-covid19">showing empty corridors and wards</a> were shared.</p>
<p><a href="https://www.jmir.org/2020/10/e22374#ref9">Our research</a> sought to identify the drivers of the conspiracy and examine whether the accounts that propelled it in April 2020 were bots or real people. </p>
<h2>Scale of the conspiracy</h2>
<p>The <a href="https://theconversation.com/four-experts-investigate-how-the-5g-coronavirus-conspiracy-theory-began-139137">5G conspiracy</a> attracted 6,556 Twitter users over the course of a single week. The <em>#FilmYourHospital</em> conspiracy was much larger than 5G, with a total of 22,785 tweets sent over a seven-day period by 11,333 users. It also had strong international backing. </p>
<figure class="align-center ">
<img alt="A social network graph which highlights how the conspiracy theory discussion was broken up into groups." src="https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=437&fit=crop&dpr=1 600w, https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=437&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=437&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=549&fit=crop&dpr=1 754w, https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=549&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/363172/original/file-20201013-19-tkfgaf.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=549&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Graph shows how the conspiracy theory discussion was broken up into different groups.</span>
<span class="attribution"><span class="source">Wasim Ahmed</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The visualisation above shows each Twitter user as a small circle and the overall discussion is clustered into a number of different groups. These groups are formed based on how users were mentioning and re-tweeting each other.</p>
<p>The visualisation highlights how the three largest groups were responsible for spreading the conspiracy the furthest. For instance, the discussion in groups one and two was centred around a single tweet that was highly re-tweeted. The tweet suggested the public were being misled and that hospitals were not busy or overrun – as had been reported by the mainstream media. The tweet then requested other users to film their hospitals using the hashtag so that it could become a trending topic. The graphic shows the reach and size of these groups.</p>
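For readers curious about the method, a grouping like the one above can be reproduced with standard network-analysis tools. The sketch below is purely hypothetical – the account names and interactions are invented, and the study’s actual pipeline may differ – but it shows the general idea: build a graph of who retweeted or mentioned whom, then cluster it by modularity.

```python
# Hypothetical sketch: clustering a retweet/mention network into groups.
# Account names and edges are invented for illustration.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Each pair records one interaction: (user who retweeted/mentioned, user targeted).
interactions = [
    ("user_a", "user_b"),  # user_a retweeted user_b
    ("user_c", "user_b"),
    ("user_d", "user_e"),
]

G = nx.Graph()
G.add_edges_from(interactions)

# Modularity-based clustering: densely interconnected accounts form groups.
groups = greedy_modularity_communities(G)
for i, group in enumerate(sorted(groups, key=len, reverse=True), start=1):
    print(f"Group {i}: {sorted(group)}")
```

Densely interconnected accounts end up in the same group, which is how the large clusters in a visualisation like the one above emerge.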
<h2>Where are the bots?</h2>
<p>We used <a href="https://botometer.osome.iu.edu/faq">Botometer</a>, a tool that draws on a machine learning algorithm, to detect bots. The tool calculates a score for each account: low scores indicate likely human behaviour, while high scores indicate a likely bot. Botometer works by extracting various features from an account such as its profile, friends, social network, patterns in temporal activity, language and sentiment. Our study ran a 10% systematic representative sample of users through Botometer. </p>
<p>Our results indicated that the rate of automated accounts was likely to be low. We used the raw scores from Botometer to attach a probability label indicating whether an account was likely to be a bot, ranging from very low, through low and low-medium, to high probability.</p>
<p>At best, only 9.2% of the sample that we looked at resembled highly suspicious account behaviour or bots. That means over 90% of accounts we examined were probably genuine. </p>
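The labelling step described above can be illustrated with a short, hypothetical sketch. The thresholds below are invented for illustration – they are not Botometer’s, nor necessarily those used in the study:

```python
# Hypothetical sketch: converting raw bot-likelihood scores
# (0.0 = human-like, 1.0 = bot-like) into probability bands.
# Thresholds are illustrative only.
def bot_probability_label(score: float) -> str:
    if score < 0.2:
        return "very low"
    elif score < 0.4:
        return "low"
    elif score < 0.6:
        return "low-medium"
    else:
        return "high"

sample_scores = [0.05, 0.31, 0.55, 0.87]
labels = [bot_probability_label(s) for s in sample_scores]

# Share of sampled accounts flagged as highly suspicious or bot-like.
high_share = labels.count("high") / len(labels)
```

Only the accounts in the top band would be counted as “highly suspicious”, which is how a figure like the 9.2% above is reached.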
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=285&fit=crop&dpr=1 600w, https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=285&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=285&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=358&fit=crop&dpr=1 754w, https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=358&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/363227/original/file-20201013-17-t4n7q.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=358&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Figure shows how many of the accounts were suspicious or bot-like.</span>
<span class="attribution"><span class="source">Wasim Ahmed</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Interestingly, we also found that deleted accounts and automated accounts contained keywords such as “Trump” and “Make America Great Again” in their user-bios. Around the same time President Donald Trump had been <a href="https://www.bbc.co.uk/news/world-us-canada-52274969">in disagreement</a> with scientific advisers on when to lift lockdown rules. </p>
<h2>Where did it come from?</h2>
<p>When we examined the most influential users connected to the hashtag, we found that the conspiracy theory was driven by influential conservative politicians as well as far-right political activists. <a href="https://theconversation.com/coronavirus-and-conspiracies-how-the-far-right-is-exploiting-the-pandemic-145968">Scholars</a> have noted how the far right has been exploiting the pandemic. For example, some have set up channels on Telegram, a cloud-based instant messaging service, to discuss COVID-19 and have amplified disinformation. </p>
<p>But once the conspiracy theory began to generate attention it was sustained by ordinary citizens. The campaign also appeared to be supported and driven by pro-Trump Twitter accounts and our research found that some accounts that behaved like “bots” and deleted accounts tended to be pro-Trump. It is important to note that not all accounts that behave like bots are bots, as there might be users who are highly active who could receive a high score. And, conversely, not all bots are harmful as some have been set up for legitimate purposes. </p>
<p>Twitter users frequently shared YouTube videos in support of the theory and YouTube was an influential source.</p>
<h2>Can they be stopped?</h2>
<p>Social media organisations can monitor for suspicious accounts and content, and quickly remove anything that violates their terms of service. Twitter experimented with attaching <a href="https://techcrunch.com/2020/06/09/twitter-5g-coronavirus-labels/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAACUaeHrbj2O8wmBsfa_M97Pd-WeMv1KJyp9T0tQEGYqjrl589KeJYiiQ7GpZjZZiwvgALP-lg8JbNpdWpKM4XQi6DiAT8lkmWcS5HRW-RbGlyH9RkmprkaNtcZ43tDIom8EjdoOIW0g1Wy2CnWYmmq2ZL9iXdzFeFRGwjteG_bgQ">warning labels on tweets</a>. This was initially unsuccessful because Twitter <a href="https://www.businessinsider.com/twitter-5g-coronavirus-label-blames-algorithm-encourages-conspiracy-theories-2020-6?r=US&IR=T">accidentally mislabelled some tweets</a>, which might have inadvertently pushed conspiracies further. But if they manage to put together a better labelling technique, this could be an effective method. </p>
<p>Conspiracies can also be countered by providing trustworthy information, delivered from public health authorities as well as popular culture “influencers”. For instance, Oldham City Council in the UK, enlisted the help of actor James Buckley – famous for his role as Jay in the E4 sitcom The Inbetweeners – <a href="https://www.manchestereveningnews.co.uk/news/showbiz-news/oldham-council-uses-inbetweeners-star-18759983">to spread public health messages</a>.</p>
<p>And other research highlights that explaining flawed arguments and describing scientific consensus may help <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0175799">reduce the effect of misinformation</a>. Sadly, no matter what procedures and steps are put in place, there will always be people who believe in conspiracies. The onus must be on the platforms to make sure these theories are not so easily spread.</p>
<p class="fine-print"><em><span>Wasim Ahmed does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>It’s not just bots which spread misinformation on social media.Wasim Ahmed, Lecturer in Digital Business, Newcastle UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1473512020-10-06T05:28:06Z2020-10-06T05:28:06ZNetflix’s The Social Dilemma highlights the problem with social media, but what’s the solution?<figure><img src="https://images.theconversation.com/files/361801/original/file-20201006-18-1m19l67.png?ixlib=rb-1.1.0&rect=64%2C7%2C2453%2C1105&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Netflix/Screenshot</span></span></figcaption></figure><p>Facebook has <a href="https://www.businessinsider.com.au/facebook-says-netflix-documentary-the-social-dilemma-sensationalist-2020-10?r=US&IR=T">responded</a> to Netflix documentary The Social Dilemma, saying it “buries the substance in sensationalism”.</p>
<p>The show is currently in Netflix Australia’s top ten list and has been popular around the globe. Some <a href="https://www.independent.co.uk/arts-entertainment/films/features/social-dilemma-netflix-film-media-facebook-twitter-algorithm-addiction-conspiracy-b454736.html">media pundits</a> suggest it’s “the most important documentary of our times”. </p>
<p>The Social Dilemma focuses on how big social media companies manipulate users by using algorithms that encourage addiction to their platforms. It also shows, fairly accurately, how platforms harvest personal data to target users with ads – and have so far gone largely unregulated. </p>
<p>But what are we meant to do about it? While the Netflix feature educates viewers about the problems social networks present to both our privacy and agency, it falls short of providing a tangible solution.</p>
<h2>A misleading response</h2>
<p>In a statement responding to the documentary, Facebook <a href="https://about.fb.com/wp-content/uploads/2020/10/What-The-Social-Dilemma-Gets-Wrong.pdf">denied</a> most of the claims made by former Facebook and other big tech company employees interviewed in The Social Dilemma. </p>
<p>It took issue with the allegation users’ data are harvested to sell ads and that this data (or the behavioural predictions drawn from it) represents the “product” sold to advertisers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/if-its-free-online-you-are-the-product-95182">If it’s free online, you are the product</a>
</strong>
</em>
</p>
<hr>
<p>“Facebook is an ads-supported platform, which means that selling ads allows us to offer everyone else the ability to connect for free,” Facebook says.</p>
<p>However, this is a bit like saying chicken food is free for battery hens. Harvesting users’ data and selling it to advertisers, even if the data is not “<a href="https://about.fb.com/wp-content/uploads/2020/10/What-The-Social-Dilemma-Gets-Wrong.pdf">personally identifiable</a>”, is undeniably Facebook’s business model.</p>
<h2>The Social Dilemma doesn’t go far enough</h2>
<p>That said, The Social Dilemma sometimes resorts to simplistic metaphors to illustrate the harms of social media. </p>
<p>For example, a fictional character is given an “executive team” of people operating behind the scenes to maximise their interaction with a social media platform. This is supposed to be a metaphor for algorithms, but is a little creepy in its implications.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A character from The Social Dilemma looks at his phone." src="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=303&fit=crop&dpr=1 600w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=303&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=303&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=381&fit=crop&dpr=1 754w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=381&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/361798/original/file-20201006-24-1i45bg9.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=381&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Social Dilemma uses dramatisations (which aren’t necessarily accurate) to explore how social media algorithms are designed to be addictive.</span>
<span class="attribution"><span class="source">IMDB</span></span>
</figcaption>
</figure>
<p><a href="https://www.cnbc.com/2020/09/18/netflixs-the-social-dilemma-results-in-people-deleting-facebook-instagram.html">News reports</a> allege large numbers of people have <a href="https://www.theage.com.au/national/victoria/it-makes-you-want-to-throw-your-phone-in-the-bin-the-film-turning-teens-off-social-media-20200926-p55zhi.html">disconnected</a> or are taking “breaks” from social media after watching The Social Dilemma. </p>
<p>But although one of the interviewees, <a href="https://www.smithsonianmag.com/innovation/what-turned-jaron-lanier-against-the-web-165260940/">Jaron Lanier</a>, has a book called <em>Ten Arguments for Deleting Your Social Media Accounts Right Now</em>, the documentary does not explicitly call for this. No immediately useful answers are given.</p>
<p>Filmmaker Jeff Orlowski seems to frame <a href="https://theconversation.com/ethical-design-is-the-answer-to-some-of-social-medias-problems-89531">“ethical” platform design</a> as the antidote. While this is an important consideration, it’s not a complete answer. And this framing is one of several issues in The Social Dilemma’s approach.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=425&fit=crop&dpr=1 600w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=425&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=425&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=534&fit=crop&dpr=1 754w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=534&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/361800/original/file-20201006-20-14pl8c7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=534&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ethical design considers the moral consequences of the design choices in a platform. It is design made with the intent to ‘do good’.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The program also relies uncritically on interviews with former tech executives, who apparently never realised the consequences of manipulating users for monetary gain. It propagates the Silicon Valley fantasy they were just innocent geniuses wanting to improve the world (despite ample <a href="https://medium.com/@rossformaine/i-was-googles-head-of-international-relations-here-s-why-i-left-49313d23065">evidence</a> to the <a href="https://www.wired.com/story/cambridge-analytica-50m-facebook-users-data/">contrary</a>). </p>
<p>As tech policy expert Maria Farell suggests, these retired “<a href="https://conversationalist.org/2020/03/05/the-prodigal-techbro/">prodigal tech bros</a>”, who are now safely insulated from consequences, are presented as the moral authority. Meanwhile, the digital rights and privacy activists who have worked for decades to hold them to account are largely omitted from view. </p>
<h2>Behavioural change</h2>
<p>Given the documentary doesn’t really tell us how to fight the tide, what can you, as the viewer, do? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-month-at-sea-with-no-technology-taught-me-how-to-steal-my-life-back-from-my-phone-127501">A month at sea with no technology taught me how to steal my life back from my phone</a>
</strong>
</em>
</p>
<hr>
<p>Firstly, you can take The Social Dilemma as a cue to become more aware of how much of your data is given up on a daily basis – and you can change your behaviours accordingly. One way is to change your social media privacy settings to restrict (as much as possible) the data networks can gather from you. </p>
<p>This will require going into the “settings” on every social platform you have, to restrict both the audience you share content with and the number of third parties the platform shares your behavioural data with. </p>
<p>In Facebook, you can actually <a href="https://theconversation.com/how-to-stop-haemorrhaging-data-on-facebook-94511">switch off “platform apps” entirely</a>. This restricts access by partner or third-party applications. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-stop-haemorrhaging-data-on-facebook-94511">How to stop haemorrhaging data on Facebook</a>
</strong>
</em>
</p>
<hr>
<p>Unfortunately, even if you do restrict your privacy settings on platforms (particularly Facebook), they can still collect and use your “platform” data. This includes content you read, “like”, click and hover over.</p>
<p>So, you may want to opt for limiting the time you spend on these platforms. This is not always practical, given how <a href="https://www.vox.com/culture/2018/3/22/17146776/delete-facebook-how-to-quit-difficult">important they are in our lives</a>. But if you want to do so, there are dedicated tools for this in some mobile operating systems. </p>
<p>Apple’s iOS, for example, has implemented “screen time” tools aimed at minimising time spent on apps such as Facebook. Some have argued, though, this can <a href="https://www.theatlantic.com/technology/archive/2019/09/why-apple-screen-time-mostly-makes-things-worse/597397/">make things worse</a> by making the user feel bad, while still easily side-stepping the limitation.</p>
<p>As a user, the best you can do is tighten your privacy settings, limit the time you spend on platforms and carefully consider whether you need each one. </p>
<h2>Legislative reform</h2>
<p>In the long run, stemming the flow of personal data to digital platforms will also need legislative change. While legislation can’t fix everything, it can encourage systemic change. </p>
<p>In Australia, we need stronger data privacy protections, preferably in the form of blanket legislative protection such as the General Data Protection Regulation <a href="https://eur-lex.europa.eu/content/news/general-data-protection-regulation-GDPR-applies-from-25-May-2018.html">implemented in Europe</a> in 2018. </p>
<p>The GDPR was designed to bring social media platforms to heel and is geared towards providing individuals more control over their personal data. Australians don’t yet have similar comprehensive protections, but regulators have been making inroads. </p>
<p>Last year, the Australian Competition and Consumer Commission finalised its <a href="https://www.accc.gov.au/speech/the-acccs-digital-platforms-inquiry-and-the-need-for-competition-consumer-protection-and-regulatory-responses">Digital Platforms Inquiry</a> investigating a range of issues relating to tech platforms, including data collection and privacy.</p>
<p>It made a number of recommendations that will hopefully result in legislative change. These focus on improving and bolstering the definitions of “consent” for consumers, including explicit understanding of when and how their data is being tracked online. </p>
<p>If what we’re facing is indeed a “social dilemma”, it’s going to take more than the remorseful words of a few Silicon Valley tech-bros to solve it.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The documentary educates viewers about the problems social networks present to both our privacy and agency online. But it doesn’t really tell us how to fight the tide.Belinda Barnet, Senior Lecturer in Media and Communications, Swinburne University of TechnologyDiana Bossio, Lecturer, Media and Communications, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1457562020-09-08T05:43:48Z2020-09-08T05:43:48ZTikTok suicide video: it’s time platforms collaborated to limit disturbing content<figure><img src="https://images.theconversation.com/files/356876/original/file-20200908-16-pazcjt.jpg?ixlib=rb-1.1.0&rect=0%2C29%2C4898%2C3201&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A disturbing video purporting to show a suicide is reportedly doing the rounds on the popular short video app TikTok, reigniting debate about what social media platforms are doing to limit circulation of troubling material.</p>
<p>According to media <a href="https://www.news.com.au/technology/online/social/parents-warned-about-shocking-suicide-video-on-tiktok-that-may-be-hidden-in-other-content/news-story/c135157c5a009fdcaaa7f9d54b146f7a">reports</a>, the video first showed up on Facebook in late August but has been re-uploaded and shared across Instagram and TikTok — reportedly sometimes <a href="https://heavy.com/news/2020/09/tiktok-viral-suicide-video/">cut with seemingly harmless content</a> such as cat videos.</p>
<p>TikTok users have <a href="https://www.tiktok.com/@aesthetically_80s/video/6869650300878261510">warned</a> others to swipe away quickly if they see a video pop up showing a man with long hair and a beard. </p>
<p>A statement by TikTok <a href="https://www.news.com.au/technology/online/social/parents-warned-about-shocking-suicide-video-on-tiktok-that-may-be-hidden-in-other-content/news-story/c135157c5a009fdcaaa7f9d54b146f7a">quoted</a> by News.com.au said:</p>
<blockquote>
<p>Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.</p>
<p>We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.</p>
</blockquote>
<p>Schools and child safety advocates have warned parents to be alert for the possibility their child may see — or may have already seen — the video if they use TikTok or Instagram.</p>
<div data-react-class="Tweet" data-react-props='{"tweetId":"1302982768981413888"}'></div>
<p>The sad reality is users will continue to post disturbing content, and it is impossible for platforms to moderate it all before it is posted. And once a video is live, it doesn’t take long for the content to migrate across to other platforms. </p>
<p>Pointing the finger at individual platforms such as TikTok won’t solve the problem. What’s needed is a coordinated approach where the big social media giants work together. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/dont-just-blame-youtubes-algorithms-for-radicalisation-humans-also-play-a-part-125494">Don't just blame YouTube’s algorithms for ‘radicalisation’. Humans also play a part</a>
</strong>
</em>
</p>
<hr>
<h2>Evading moderation</h2>
<p>Post-moderation, in which content is reviewed only after it has been published, means even the worst content can go live. Either the platforms identify it with machine learning systems, or users report it to be processed by human moderators. Either way, it can be live for five minutes, an hour or longer.</p>
<p>Once a video is up, it can be downloaded by bad actors, altered to reduce the chance of detection by machine learning content moderation systems, and shared across multiple platforms, such as Reddit, Instagram and Facebook.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A teen looks at their phone." src="https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/356878/original/file-20200908-24-1uv4l6t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">TikTok is particularly popular among young people.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>These bad actors can cut the video slightly differently, edit it within harmless material, put filters on it or distort the audio to make it difficult for the content moderation programs to automatically identify disturbing videos. Machine learning with visual content is advancing but it’s not perfect.</p>
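To make the cat-and-mouse concrete, here is a toy sketch (an illustration, not any platform’s actual system) of why exact matching fails against these edits while perceptual matching can survive them. The 16-“pixel” frame and the one-pixel tweak are invented for the example.

```python
import hashlib

def exact_hash(pixels):
    """Cryptographic hash: any change to the file yields a completely new digest."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean.
    Frames that look alike map to identical or near-identical bit strings."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# A tiny 16-"pixel" greyscale frame and a near-duplicate with one pixel tweaked
frame = [200, 10, 220, 15, 30, 240, 25, 210, 12, 225, 18, 235, 28, 205, 8, 230]
tweaked = frame.copy()
tweaked[0] = 198  # imperceptible edit, e.g. a filter or re-encode

print(exact_hash(frame) == exact_hash(tweaked))      # False: exact match defeated
print(average_hash(frame) == average_hash(tweaked))  # True: perceptual match survives
```

Real systems apply perceptual hashes across many frames (plus audio), and determined uploaders respond by cropping, filtering or re-editing until the perceptual distance is large enough to slip through.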
<p>This is broadly what happened with video of the Christchurch massacre, where content taken from the gunman’s Facebook livestream of his attack was downloaded and then shared across various platforms. </p>
<p>By the time Facebook took down the original video, people already had copies of it and were uploading to Facebook, Reddit, YouTube and more. It very quickly became a cross-platform problem. These bad actors can also add hashtags (some very innocent-sounding) to target a particular community.</p>
<p>One of the key draws of TikTok as a social media platform is its “spreadability”; how easily it facilitates creating and sharing new videos based on the one a user was just watching. </p>
<p>With just a few taps users can create a “duet” video showing themselves reacting to the disturbing content. Bad actors, too, can easily re-upload videos that have been removed. Now that this purported suicide video is out in the wild, it will be difficult for TikTok to control its spread.</p>
<h2>What about copyright takedowns?</h2>
<p>Some have noted social media platforms appear very adept at quickly removing copyrighted material from their services (and thereby avoiding huge fines), but seem slower to act when it comes to disturbing content.</p>
<p>However, copyrighted videos are, in many ways, easier for machine learning moderation systems to detect. Existing systems used to limit the spread of copyrighted material have been built specifically for copyright enforcement.</p>
<p>For example, TikTok uses a system that <a href="https://techcrunch.com/2020/08/12/acrcloud-profile/">automatically identifies a song’s fingerprint</a> to detect copyrighted material (specifically, music licensed by major record labels). </p>
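The matching logic behind such fingerprinting can be sketched roughly as follows. This is a simplified, hypothetical take on landmark-style audio fingerprinting (hash pairs of spectral peaks, then vote on a consistent time offset); it is not ACRCloud’s or TikTok’s actual implementation, and the peak lists are made up.

```python
from collections import Counter

def fingerprint(peaks, fan_out=3):
    """Toy landmark fingerprint. Each peak is (time, frequency); a hash key
    encodes two nearby peak frequencies plus the time gap between them, and
    maps to the anchor time (later duplicates overwrite earlier ones)."""
    hashes = {}
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1 : i + 1 + fan_out]:
            hashes[(f1, f2, t2 - t1)] = t1
    return hashes

def match(query, reference):
    """Count query hashes whose time offsets line up consistently with the
    reference -- robust to where in the track the clip starts."""
    offsets = Counter(
        reference[h] - t for h, t in query.items() if h in reference
    )
    return max(offsets.values(), default=0)

# Hypothetical peak lists: the query is a clip of the track, shifted in time
track = [(0, 40), (1, 90), (2, 55), (3, 70), (4, 85), (5, 60), (6, 95)]
clip = [(t + 10, f) for t, f in track[2:6]]  # same audio, starting later

print(match(fingerprint(clip), fingerprint(track)))  # several aligned hashes
```

Because the match is a vote over time offsets, it succeeds even when the clip starts mid-song — which is what makes fingerprinting so effective for copyright enforcement and so hard to replicate for arbitrary, freshly edited disturbing footage.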
<p>Even so, TikTok has faced a range of issues <a href="https://www.musicbusinessworldwide.com/nmpa-calls-for-scrutiny-of-tiktok-says-platform-has-consistently-violated-us-copyright-law-and-the-rights-of-songwriters-and-music-publishers/">relating to copyright enforcement</a>. Detecting hate speech or graphic videos on the platform is much more difficult. </p>
<h2>Room for improvement</h2>
<p>Certainly, there’s room for improvement. It’s a platform-wide, society-wide problem. We can’t just say TikTok is doing a bad job; it’s something all the platforms need to tackle together.</p>
<p>But asking market competitors to come up with a coordinated approach is not easy: platforms don’t normally share resources or work together globally to handle content moderation. Maybe they should.</p>
<p>TikTok employs massive teams of human moderators in addition to its algorithmically driven automated content moderation. These human moderators work across many regions and languages to monitor content that may violate the terms of use. </p>
<p>Recent events show TikTok is aware of growing demand for improved content moderation practices. In March 2020, responding to national security concerns, TikTok’s parent company ByteDance committed to <a href="https://www.wsj.com/articles/tiktok-to-stop-using-china-based-moderators-to-monitor-overseas-content-11584300597">stop using moderation teams based in China</a> to moderate international content. It also established a “transparency centre” in March 2020 to allow outside observers and experts <a href="https://techcrunch.com/2020/03/11/tiktok-to-open-a-transparency-center-where-outside-experts-can-examine-its-moderation-practices/">to scrutinise the platform’s moderation practices</a>. </p>
<p>These platforms have enormous power, and with that comes responsibility. We know content moderation is hard and nobody is saying it needs to be fixed overnight. More and more users know how to game the system, and there’s no single solution that will make the problem go away. It’s an evolving problem and the solution will need to constantly evolve too.</p>
<h2>Improving digital citizenship skills</h2>
<p>There’s a role for citizens, too. Every time these disturbing videos do the rounds, many more people go online to find the video, talk about it with their friends and contribute to its circulation. </p>
<p>Complicating matters is the fact that reporting videos on TikTok is not as straightforward as it is on other platforms, such as Facebook or Instagram. A recent <a href="https://journals.sagepub.com/doi/10.1177/2050157920952120">study</a> I (Bondy Kaye) was involved in compared features on TikTok with those on its Chinese counterpart, Douyin. We found the report function was located in the “share” menu accessed from the main viewing screen on both platforms — not a place many would think to look. </p>
<p>So if you’re a TikTok user and you encounter this video, don’t share it around, even in an effort to condemn it. You can report the video by tapping the share icon and selecting the appropriate reporting option. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/becoming-more-like-whatsapp-wont-solve-facebooks-woes-heres-why-113368">Becoming more like WhatsApp won't solve Facebook’s woes – here's why</a>
</strong>
</em>
</p>
<hr>
<p><em>Anyone seeking support and information about suicide can contact Lifeline on 13 11 14 or Beyond Blue on 1300 22 4636.</em></p>
<p class="fine-print"><em><span>Ariadna Matamoros-Fernández has received an award from Facebook, which includes research funding.</span></em></p><p class="fine-print"><em><span>D. Bondy Valdovinos Kaye does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>A video purporting to show a suicide is reportedly circulating on TikTok, reigniting debate about content moderation on social media. Collaborating with competitors may be the key.</em></p>
<p class="fine-print"><em>Ariadna Matamoros-Fernández, Lecturer in Digital Media at the School of Communication, Queensland University of Technology; D. Bondy Valdovinos Kaye, PhD Candidate / Editorial Assistant, Queensland University of Technology. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>TikTok can be good for your kids if you follow a few tips to stay safe</h1>
<p class="fine-print"><em>Published 2020-08-16.</em></p>
<figure><img src="https://images.theconversation.com/files/352151/original/file-20200811-14-1p4pud3.jpg?ixlib=rb-1.1.0&rect=391%2C6%2C3225%2C2194&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Tashatuvango/Shutterstock</span></span></figcaption></figure><p>The video-sharing app <a href="https://www.tiktok.com/en/">TikTok</a> is a <a href="https://www.abc.net.au/news/science/2020-07-08/tiktok-national-safety-china-social-media-ban/12434308">hot political potato</a> amid concerns over who has access to users’ personal data.</p>
<p>The United States has moved to <a href="https://www.theguardian.com/technology/2020/aug/06/us-senate-tiktok-ban">ban the app</a>. <a href="https://inews.co.uk/news/technology/tiktok-ban-india-banning-app-china-border-how-safe-mean-explained-460619">Other countries</a>, including <a href="https://www.abc.net.au/news/2020-08-02/tiktok-under-investigation-in-australia-over-privacy-concerns/12513466">Australia</a>, have expressed concern.</p>
<p>But does this mean your children who use this app are at risk? If you’re a parent, let me explain the issues and give you a few tips to make sure your kids stay safe. </p>
<h2>A record-breaker</h2>
<p>Never has an app for young people been so popular. By April this year the TikTok app had been downloaded more than <a href="https://sensortower.com/blog/tiktok-downloads-2-billion">2 billion times worldwide</a>.</p>
<p>The app recently broke all records for the <a href="https://www.businessinsider.com.au/tiktok-app-2-billion-downloads-record-setting-q1-sensor-tower-2020-4">most downloaded app</a> in a quarterly period, with 315 million downloads globally in the first three months of 2020. </p>
<p><iframe id="tc-infographic-510" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/510/e0bb42676754c25d2ca3734a7308e06f46d5dae8/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Its popularity with young Aussies has sky-rocketed. Around <a href="https://www.roymorgan.com/findings/8289-launch-of-tiktok-in-australia-december-2019-202002240606">1.6 million Australians</a> use the app, including about one in five people born since 2006. That’s an estimated 537,000 young Australians.</p>
<p>Like all social media apps, TikTok siphons <a href="https://www.tiktok.com/legal/privacy-policy?lang=en">data about its users</a> such as email address, contacts, IP address and geolocation information.</p>
<p><a href="https://www.abc.net.au/news/2019-03-01/tiktok-app-child-privacy-violation-most-downloaded-apps-youtube/10862462">TikTok paid a US$5.8 million fine</a> (A$8 million) to settle US government claims it illegally collected personal information from children.</p>
<p>Because TikTok is owned by a Chinese company, <a href="https://www.bytedance.com/en/">ByteDance</a>, US President Donald Trump and others <a href="https://www.theguardian.com/technology/2020/jul/16/there-are-calls-to-ban-tiktok-in-australia-but-you-should-worry-about-facebook-too">are also worried</a> about the app handing over this data to the Chinese state. <a href="https://www.abc.net.au/news/2020-07-21/tiktok-security-concerns-executive-issues-china-denial/12475308">TikTok denies</a> it does this.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/china-could-be-using-tiktok-to-spy-on-australians-but-banning-it-isnt-a-simple-fix-142157">China could be using TikTok to spy on Australians, but banning it isn’t a simple fix</a>
</strong>
</em>
</p>
<hr>
<p>Just days ago the Trump administration <a href="https://www.whitehouse.gov/presidential-actions/executive-order-addressing-threat-posed-tiktok/">signed an executive order</a> to seek a <a href="https://edition.cnn.com/2020/08/06/politics/trump-executive-order-tiktok/index.html">ban on TikTok</a> operating or interacting with US companies.</p>
<h2>Youngsters still TikToking</h2>
<p>There is no hint of this stopping our TikToking children. For them it’s business as usual, creating and uploading videos of themselves lip-syncing, singing, dancing or just talking.</p>
<p>The most recent trend on TikTok – <a href="https://www.buzzfeed.com/lizmrichardson/taylor-swift-love-story-tiktok-disco-lines-remix">Taylor Swift Love Story</a> dance – has resulted in more than 1.5 million video uploads in around two weeks alone. </p>
<p><iframe id="tc-infographic-513" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/513/5c0233c9f8ff17b13a5b7aed052792ce136c07be/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>But the latest political issues with TikTok raise questions about whether children should be on this platform right now. More broadly, as we see copycat sites such as <a href="https://about.instagram.com/blog/announcements/introducing-instagram-reels-announcement/">Instagram Reels</a> launched, should children be using any social media platforms that focus on them sharing videos of themselves at all?</p>
<h2>The pros and cons</h2>
<p>The TikTok app has filled a genuine social need for this young age group. Social media sites can offer a <a href="https://journals.sagepub.com/doi/full/10.1177/2055207619826678" title="Applying an affordances approach and a developmental lens to approach adolescent social media use">sense of belonging to a group</a>, such as a group focused on a particular interest, experience, social group or religion.</p>
<p>TikTok celebrates diversity and inclusivity. It can provide a place where young people can join together to support each other in their needs. </p>
<p>During the COVID-19 pandemic, TikTok has had huge numbers of videos with coronavirus-related hashtags such as <a href="https://www.tiktok.com/tag/quarantine?lang=en">#quarantine</a> (65 billion views), <a href="https://www.tiktok.com/tag/happyathome?lang=en">#happyathome</a> (19.5 billion views) and <a href="https://www.tiktok.com/tag/safehands?lang=en">#safehands</a> (5.4 billion views).</p>
<p>Some of these videos are funny, some include song and dance. The World Health Organisation even <a href="https://www.tiktok.com/@who?lang=en">posted its own youth-oriented videos</a> on TikTok to provide young people with reliable public health advice about COVID-19. </p>
<p><iframe id="tc-infographic-511" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/511/f253fb1ebf0a13f14b4b87f5b481f8a9ac33c05d/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>The key benefit is that the platform became a place where young people from all corners of the planet could join together to understand the pandemic and take the stressful edge off it, for themselves and others their age. Where else could they do that? The <a href="https://www-cambridge-org.ezproxy.uws.edu.au/core/services/aop-cambridge-core/content/view/13C35DB424523B4210530288561CE615/S0007125000279002a.pdf/role_of_social_media_in_reducing_stigma_and_discrimination.pdf">mental health benefits</a> this offers can be important.</p>
<h2>Let’s get creative</h2>
<p>Another benefit lies in the creativity at the heart of TikTok. Passive use of technology, such as scrolling and checking social media with no purpose, can lead to <a href="https://www.psychologytoday.com/au/blog/in-excess/201805/addicted-social-media">addictive types of screen behaviours</a> in young people.</p>
<p><iframe id="tc-infographic-512" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/512/dc23b53f671390ab36348a613746c9cac4c63fed/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>By contrast, planning and creating content, such as making their own videos, is a <a href="https://dl.acm.org/doi/10.1145/3191754" title="What Makes Smartphone Use Meaningful or Meaningless?">meaningful use of technology</a> and curbs addictive technology behaviours. In other words, if young people are going to use technology, using it creatively, purposefully and with meaning is the type of use we want to encourage. </p>
<p>Users of TikTok must be <a href="https://support.tiktok.com/en/privacy-safety/for-parents-default">at least 13 years old</a>, although it does have a <a href="https://newsroom.tiktok.com/en-au/tiktok-for-younger-users">limited app</a> for under 13s. </p>
<h2>Know the risks</h2>
<p>As on all social media platforms, children are engaging in a space in which others can contact them. They may also be engaging with adult concepts they are not yet mature enough for, such as love gone wrong or suggestively twerking to songs.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-secret-of-tiktoks-success-humans-are-wired-to-love-imitating-dance-moves-133057">The secret of TikTok's success? Humans are wired to love imitating dance moves</a>
</strong>
</em>
</p>
<hr>
<p>The platform moves very quickly, with a huge number of videos, likes and comments uploaded every day. Taking it all in can lead to cognitive overload. This can be distracting for children and decrease their focus on other aspects of their lives, including schoolwork. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Three young girls video themselves on a smartphone." src="https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=476&fit=crop&dpr=1 600w, https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=476&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=476&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=598&fit=crop&dpr=1 754w, https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=598&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/352161/original/file-20200811-16-1jr1ieg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=598&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How to stay safe and still have fun with TikTok.</span>
<span class="attribution"><span class="source">Luiza Kamalova/Shutterstock</span></span>
</figcaption>
</figure>
<p>So here are a few tips for keeping your child safe, as well as getting the most out of the creative/educational aspects of TikTok.</p>
<ol>
<li><p>as with any social network, use privacy settings to limit how much information your child is sharing</p></li>
<li><p>if your child is creating a video, make sure it is reviewed before it’s uploaded to ensure it doesn’t include content that can be misconstrued or have negative implications</p></li>
<li><p>if a child younger than 13 wants to use the app, there’s <a href="https://newsroom.tiktok.com/en-au/tiktok-for-younger-users">a section for this younger age group</a> that includes extra safety and privacy features</p></li>
<li><p>if you’re okay with your child creating videos for TikTok, then doing it together or helping them plan and film the video can be a great parent-child bonding activity</p></li>
<li><p>be aware of the collection of data by TikTok, encourage your child to be aware of it, and help them know what they are giving away and the implications for them.</p></li>
</ol>
<p>Happy (safe) TikToking!</p>
<p class="fine-print"><em><span>Joanne Orlando does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>TikTok is caught in a political battle between the US and China but children are still using the video-sharing app. Here are some tips on how to make sure they are safe.</em></p>
<p class="fine-print"><em>Joanne Orlando, Researcher: Children and Technology, Western Sydney University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>QAnon believers will likely outlast and outsmart Twitter’s bans</h1>
<p class="fine-print"><em>Published 2020-07-24.</em></p>
<figure><img src="https://images.theconversation.com/files/349044/original/file-20200723-18-1sbof1b.jpg?ixlib=rb-1.1.0&rect=45%2C7%2C4992%2C3345&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Matt Rourke/AP</span></span></figcaption></figure><p>Twitter has announced it’s taking sweeping action to limit the reach of content associated with <a href="https://www.bbc.com/news/53498434">QAnon</a>. Believers of this fringe far-right conspiracy theory claim there is a “deep state” plot against US President Donald Trump led by Satan-worshipping elites from within government, business and media.</p>
<p>Twitter has <a href="https://www.nbcnews.com/tech/tech-news/twitter-bans-7-000-qanon-accounts-limits-150-000-others-n1234541">banned more than 7,000 accounts</a> tweeting about QAnon, <a href="https://twitter.com/TwitterSafety/status/1285726277719199746">citing</a> violations of its multi-account policy, coordinated abuse targeting individual victims, and attempts to evade previous account suspensions.</p>
<div data-react-class="Tweet" data-react-props='{"tweetId":"1285726277719199746"}'></div>
<p>The platform also said it would stop circulating QAnon-related content, including material appearing in trending topics, recommendation lists and the search feature. It will also reportedly block web links associated with QAnon activity.</p>
<p>These actions, which could impact as many as <a href="https://www.nbcnews.com/tech/tech-news/twitter-bans-7-000-qanon-accounts-limits-150-000-others-n1234541">150,000 accounts globally</a>, are part of Twitter’s wider crackdown on <a href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html">misinformation</a> and “behaviour that has the potential to lead to offline harm”. </p>
<p>However, according to CNN reporter Oliver Darcy, many of the actions are not being extended to “candidates and elected officials”. Regardless, history suggests the threat of online conspiracists is a difficult one to tackle.</p>
<div data-react-class="Tweet" data-react-props='{"tweetId":"1285761052072988673"}'></div>
<h2>How it all began</h2>
<p>QAnon began in October 2017 when an anonymous user <a href="https://www.nbcnews.com/tech/tech-news/how-three-conspiracy-theorists-took-q-sparked-qanon-n900531">or group of users</a> going by “Q” began posting on the online message board 4chan. Q claimed to have access to classified information about the Trump administration and its opponents.</p>
<p>More than two years and <a href="https://www.nytimes.com/2020/02/09/us/politics/qanon-trump-conspiracy-theory.html">3,500 posts</a> later, “Q” has generated a sprawling but unfounded conspiracy theory claiming the existence of a global network of political elites and celebrities who want to take down Trump. These people also supposedly run a child sex trafficking ring, among other crimes.</p>
<p>QAnon believers predict <a href="https://www.salon.com/2019/08/18/qanon-is-the-conspiracy-theory-that-wont-die-heres-what-they-believe-and-why-theyre-wrong/">the secret war</a> between the Trump administration and the “deep state” network will eventually lead to “The Storm” – a day of reckoning where Trump’s opponents will be arrested or executed.</p>
<p>Recently, QAnon believers have also pushed a range of baseless coronavirus conspiracies. These include claims the virus is a hoax, or a Chinese bioweapon designed to hurt Trump’s re-election chances.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/qanon-conspiracy-theories-about-the-coronavirus-pandemic-are-a-public-health-threat-135515">QAnon conspiracy theories about the coronavirus pandemic are a public health threat</a>
</strong>
</em>
</p>
<hr>
<h2>Online actors, real-world consequences</h2>
<p>Twitter’s designation of QAnon activity as potentially harmful is <a href="https://www.nbcnews.com/tech/tech-news/twitter-bans-7-000-qanon-accounts-limits-150-000-others-n1234541">partly driven</a> by reports of the movement’s ties to <a href="https://www.nytimes.com/2020/02/09/us/politics/qanon-trump-conspiracy-theory.html">dangerous real-world activities</a>.</p>
<p>QAnon believers have also been linked to <a href="https://eu.azcentral.com/story/news/politics/arizona/2018/08/07/qanon-ties-two-arizona-arrests-conspiracy-theory-trump/920336002/">armed standoffs</a>, <a href="https://www.nbcnews.com/news/crime-courts/colorado-mom-inspired-qanon-conspiracy-sought-kidnap-her-own-child-n1111711">attempted kidnappings</a>, harassment and <a href="https://www.nbcnews.com/news/us-news/man-suspected-gunning-down-reputed-mob-boss-mistook-him-deep-n1032331">at least one killing</a> since the conspiracy picked up steam in 2017.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=432&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=432&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=432&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=543&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=543&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349007/original/file-20200722-24-cstye9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=543&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Anthony Comello said his belief in QAnon led him to kill a Gambino mob boss.</span>
<span class="attribution"><span class="source">Seth Wening/AP</span></span>
</figcaption>
</figure>
<p>Last year, the FBI issued a report on “<a href="https://news.yahoo.com/fbi-documents-conspiracy-theories-terrorism-160000507.html">conspiracy-driven domestic extremists</a>” and identified QAnon as a potential domestic terrorist threat.</p>
<p>Although extremism driven by conspiracy theories isn’t new, the report states the internet and social media are helping such theories reach wider audiences.</p>
<p>It also says online conversations help determine the targets of harassment and violence for the small subset of individuals whose beliefs translate into real-world action. </p>
<p>One such example came from the Pizzagate conspiracy (seen by some as a precursor to QAnon), which <a href="https://www.nytimes.com/2016/12/05/business/media/comet-ping-pong-pizza-shooting-fake-news-consequences.html">motivated an American man to open fire inside</a> a pizza shop that was supposedly a front for a child sex trafficking ring.</p>
<h2>QAnon likely to stay</h2>
<p>While it’s hard to say exactly how many QAnon believers there are, the movement has thousands of followers on social media. </p>
<p>A recent <a href="https://www.theguardian.com/technology/2020/jun/25/qanon-facebook-conspiracy-theories-algorithm">investigation</a> of QAnon-related pages and groups on Facebook found there are about three million followers and members in total. But there is likely significant overlap among these accounts. </p>
<p>According to a New York Times <a href="https://www.nytimes.com/2020/07/21/technology/twitter-bans-qanon-accounts.html">report</a> citing anonymous sources, Facebook is planning to enforce similar measures to limit the reach of QAnon content on its platform. One of the largest Facebook <a href="https://www.facebook.com/groups/163948417717196/">groups</a> dedicated to QAnon currently has more than 200,000 members. </p>
<p>Given QAnon’s reach, it will be difficult for Twitter to stamp it out altogether.</p>
<p>Social media bans are hard to maintain. Content can be shared from new accounts, and new code words and hashtags can be adopted that <a href="https://www.bernardmarr.com/default.asp?contentID=1373">artificial intelligence</a> algorithms can’t detect. </p>
<p>For example, <a href="https://www.rollingstone.com/culture/culture-news/qanon-twitter-ban-parler-conspiracy-theories-1032523/">many QAnon believers</a> have tried to operate unnoticed on Twitter by using the number 17 to reference “Q” (the 17th letter of the alphabet), or by writing “CueAnon” instead of “QAnon”. </p>
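A small sketch shows why a literal blocklist misses such code words. The blocklist term and the sample posts below are invented for illustration; they are not Twitter’s actual rules or data.

```python
import re

BLOCKLIST = {"qanon"}  # hypothetical blocked term

def naive_filter(text):
    """Flag a post only if a blocked term appears verbatim as a word."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return any(w in BLOCKLIST for w in words)

posts = [
    "QAnon was right all along",    # caught: literal term
    "CueAnon was right all along",  # missed: respelled code word
    "17 knows everything",          # missed: numeric stand-in for "Q"
]
print([naive_filter(p) for p in posts])  # [True, False, False]
```

Catching “CueAnon” or a bare “17” takes context-aware models or human judgment, which is part of why bans of this kind leak.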
<p>Human moderators may be needed to identify such circumvention attempts. And it’s hard to say how much human resource Twitter is willing or able to devote to moderating this content. </p>
<p>Banned users can also enlist virtual private networks (VPNs) to change their IP addresses and bypass restrictions. </p>
<p>Furthermore, conspiracy theories such as QAnon are difficult to counter as they are “<a href="https://www.vox.com/policy-and-politics/2019/3/29/18286890/qanon-mueller-report-barr-trump-conspiracy-theories">self-sealing</a>”: any action against believers is interpreted as “evidence” of the theory’s validity. </p>
<p>This is because conspiracists often think agents of the conspiracy have unusual and extensive powers. Some QAnon believers are taking Twitter’s bans to be confirmation of a “deep state” plot against Trump.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1285815686737154050&quot;}"></div></p>
<p>That said, it’s possible Twitter’s measures will reduce QAnon’s visibility. A <a href="https://www.theguardian.com/technology/2020/jun/25/qanon-facebook-conspiracy-theories-algorithm">similar past crackdown</a> by Reddit was effective in stemming QAnon activity. Before its <a href="https://www.theguardian.com/technology/2020/jun/25/qanon-facebook-conspiracy-theories-algorithm">ban in 2018</a>, the largest QAnon subreddit had more than 70,000 members.</p>
<p>However, many of these users simply moved to other sites such as YouTube and Facebook – a common trend following bans.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/reddit-removes-millions-of-pro-trump-posts-but-advertisers-not-values-rule-the-day-141703">Reddit removes millions of pro-Trump posts. But advertisers, not values, rule the day</a>
</strong>
</em>
</p>
<hr>
<p>With QAnon followers expanding and folding new events into their narrative, the fringe movement has <a href="https://www.nbcnews.com/tech/tech-news/how-three-conspiracy-theorists-took-q-sparked-qanon-n900531">taken on a life of its own</a>.</p>
<p>Numerous <a href="https://www.nbcnews.com/politics/2020-election/qanon-caucus-fringe-conspiracy-theory-advocates-aim-congress-n1231225">US Republican candidates for congress</a> have promoted it. Trump himself has repeatedly <a href="https://www.vox.com/policy-and-politics/2020/1/2/21046707/trump-qanon-pizzagate-retweets">retweeted QAnon accounts</a>. </p>
<p>If Twitter is serious about its newest tussle with misinformation, it will likely have to pull out all the stops.</p><img src="https://counter.theconversation.com/content/143192/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Audrey Courty does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>QAnon conspiracists think Trump’s ‘secret war’ against an elite celebrity ‘deep state’ network will eventually lead to a day of reckoning where his opponents will fall.Audrey Courty, PhD candidate, School of Humanities, Languages and Social Science, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1428192020-07-16T07:02:55Z2020-07-16T07:02:55ZThe Twitter hack targeted the rich and famous. But we all lose if trusted accounts can be hijacked<p>The list of US figures whose Twitter accounts were <a href="https://www.bbc.com/news/technology-53425822">hijacked by scammers on Wednesday US time</a> reads like a Who’s Who of the tech and celebrity worlds: Tesla boss Elon Musk, Amazon chief Jeff Bezos, Microsoft founder Bill Gates, former president Barack Obama, current Democratic nominee Joe Biden, celebrities Kanye West and Kim Kardashian, billionaires Warren Buffett and Mike Bloomberg, the corporate accounts of Apple and Uber, and more besides.</p>
<p>The point of the hack? To lure followers into sending US$1,000 in Bitcoin, with the classic scammer’s false promise of sending back twice as much.</p>
<p>After a <a href="https://twitter.com/TwitterSupport/status/1283591844962750464">preliminary investigation</a>, Twitter said it believed the incident was “a coordinated social engineering attack by people who successfully targeted some of our employees with access to internal systems and tools”.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1283591846464233474&quot;}"></div></p>
<p>The details are still far from clear, but it seems someone with administrative rights granted the hackers access, perhaps inadvertently, despite the presence of two-factor authentication on the accounts – widely considered the gold standard of online security. It appears insiders may have been involved, although the story is <a href="https://www.vice.com/en_us/article/jgxd3d/twitter-insider-access-panel-account-hacks-biden-uber-bezos">still unfolding</a>.</p>
<p>The use of the niche currency Bitcoin limited the number of potential victims, but also makes the hackers’ loot difficult to trace. Ironically, Bitcoin is a currency designed for a post-trust world, and the pseudonymity of its transactions makes the hackers harder to track down.</p>
<h2>Whom do we trust?</h2>
<p>This is not the first time we have seen the complex and profound impact social media can have. In 2013, <a href="https://theconversation.com/why-the-ap-hack-is-likely-to-happen-again-13735">hackers gained access to @AP</a>, the official Twitter account of the respected Associated Press news agency, and tweeted: </p>
<blockquote>
<p>Breaking: Two Explosions in the White House and Barack Obama is Injured. </p>
</blockquote>
<p>The stock market <a href="https://www.cnbc.com/id/100646197">dived by US$136.5 billion almost immediately</a> but bounced back within six minutes. The episode showed how interconnected automated systems move too quickly for humans to intervene: trading algorithms read the headline and the market collapsed, albeit fleetingly. </p>
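<p>To see why no human could react in time, consider a deliberately simplified sketch of a headline-triggered trading rule. This is a hypothetical toy, not any real trading system; the trigger words and function names are invented for illustration.</p>

```python
# Hypothetical set of words that would trigger an automated sell signal.
PANIC_WORDS = {"explosion", "attack", "injured"}

def react_to_headline(headline: str) -> str:
    """Return a trading signal in microseconds, long before a human reads the news."""
    tokens = set(headline.lower().replace(",", "").split())
    return "SELL" if tokens & PANIC_WORDS else "HOLD"

print(react_to_headline(
    "Two Explosions in the White House and Barack Obama is Injured"
))  # -> SELL
```

<p>Real news-driven trading systems are vastly more sophisticated, but the core vulnerability is the same: a signal derived from a single trusted account can move markets before anyone verifies it.</p>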
<p>By shorting stocks, whoever hacked AP’s Twitter account stood to make enormous profits from the temporary market tank. Whether the 2013 hackers actually profited, <a href="https://www.theguardian.com/business/2013/apr/23/ap-tweet-hack-wall-street-freefall">if at all</a>, remains unknown. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-the-ap-hack-is-likely-to-happen-again-13735">Why the AP hack is likely to happen again</a>
</strong>
</em>
</p>
<hr>
<p>This week’s Twitter hack clearly had a financial motive: the Bitcoin scammers netted <a href="https://mashable.com/article/twitter-memes-unverified-verified-hack/">more than US$50,000</a>.</p>
<p>More sinister still, however, are the implications for democracy if a similar hack were carried out with political motives.</p>
<p>What if a reliable source, such as a national newspaper’s official account, tweets that a presidential candidate has committed a crime, or is seriously ill, on the eve of an election? What if false information about international armed attacks is shared from a supposedly reliable source such as a government defence department? The impacts of such events would be profound, and go far beyond financial loss. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1283594548233613312&quot;}"></div></p>
<p>This is the inherent danger of our growing reliance on social media platforms as authoritative sources of information. As media institutions decline in size, funding and impact, the public increasingly relies on social media platforms for news. </p>
<p>The Bitcoin scam is a reminder that any social media platform can be hacked, tampered with, or used to spread false information. Even gold-standard technical systems can be outwitted, perhaps by exploiting human vulnerabilities. A disgruntled employee, a careless password selection, or even a device used in a public space can pose grave risks. </p>
<h2>Who’s in charge?</h2>
<p>The question of who polices the vast power accrued by social media platforms is a crucial one. Twitter’s reaction to the hack – temporarily shutting down all accounts verified with the “blue tick” that connotes public interest – raised the ire of high-profile users (and prompted <a href="https://mashable.com/article/twitter-memes-unverified-verified-hack/">mirth</a> among those not bestowed with Twitter’s mark of legitimacy). But the underlying question is: who decides what is censored or shut down, and under what circumstances? And should companies do this themselves, or do they need a regulatory framework to <a href="https://global.oup.com/academic/product/in-search-of-jeffersons-moose-9780195342895?cc=au&lang=en&">ensure fairness and transparency</a>?</p>
<p>Broader questions have already been raised about when Twitter, Facebook or other social media platforms should or should not censor content. Facebook was <a href="https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html">heavily criticised</a> for not removing oppressive posts about Rohingya Muslims in Myanmar, and what the United Nations referred to as a genocide ensued. Twitter much later <a href="https://www.theguardian.com/world/2019/may/16/myanmar-army-chiefs-twitter-account-suspended-over-anti-rohingya-hate-speech">suspended some accounts</a> that had been inciting violence, a move that itself drew some criticism.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/instead-of-showing-leadership-twitter-pays-lip-service-to-the-dangers-of-deep-fakes-127027">Instead of showing leadership, Twitter pays lip service to the dangers of deep fakes</a>
</strong>
</em>
</p>
<hr>
<p>What is the responsibility of such platforms, and who should govern them, as we become more heavily reliant on social media for our news? As the platforms’ power and influence continue to grow, we need rigorous frameworks to hold them accountable. </p>
<p>Last month, the Australian government pledged an A$1.3 billion funding increase and an extra 500 staff for the Australian Signals Directorate, to boost its ability to defend Australia from attacks. Australia’s forthcoming 2020 Cyber Security Strategy will hopefully also include strategies to proactively improve cyber security and digital literacy. </p>
<p>In an ideal world, social media giants would regulate themselves. But here in the real world, the stakes are too high to let the platforms police themselves.</p><img src="https://counter.theconversation.com/content/142819/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Kobi Leins is currently conducting research on the existing laws relating to cyber in Australia as part of a ‘Governing Cyber Law in Australia’ project with the Computing and Information Systems of the University of Melbourne, in partnership with the Centre for AI and Digital Ethics.</span></em></p>Twitter’s ‘blue tick’ club of influential users was locked out after financial scammers hacked celebrities’ accounts. But with ever more trust placed in social media, we stand to lose more than money.Kobi Leins, Senior Research Fellow in Digital Ethics, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.