<h1>Sendit, Yolo, NGL: anonymous social apps are taking over once more, but they aren’t without risks</h1><figure><img src="https://images.theconversation.com/files/473783/original/file-20220713-20-pkfvpq.jpeg?ixlib=rb-1.1.0&rect=170%2C161%2C5820%2C3826&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Have you ever told a stranger a secret about yourself online? Did you feel a certain kind of freedom doing so, specifically because the context was removed from your everyday life? Personal disclosure and anonymity have long been a potent mix laced through our online interactions. </p>
<p>We’ve recently seen this through the resurgence of anonymous question apps targeting young people, including Sendit and NGL (which stands for “not gonna lie”). The latter has been installed 15 million times globally, according to recent <a href="https://techcrunch.com/2022/07/11/anonymous-social-ngl-tops-15m-installs-2-4m-in-revenue-as-users-complain-about-being-scammed/">reports</a>.</p>
<p>These apps can be linked to users’ Instagram and Snapchat accounts, allowing them to post questions and receive anonymous answers from followers.</p>
<p>Although they’re trending at the moment, it’s not the first time we’ve seen them. Early examples include ASKfm, launched in 2010, and Spring.me, launched in 2009 (as “Formspring”).</p>
<p>These platforms have a troublesome history. As a sociologist of technology, I’ve studied human-technology encounters in contentious environments. Here’s my take on why anonymous question apps have once again taken the internet by storm, and what their impact might be.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A series of screens advertising various features of the 'NGL' app." src="https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=256&fit=crop&dpr=1 600w, https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=256&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=256&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=322&fit=crop&dpr=1 754w, https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=322&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/473782/original/file-20220713-14-7p7h1u.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=322&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The app NGL is targeted at ‘teens’ on the Google app store.</span>
<span class="attribution"><a class="source" href="https://play.google.com/store/apps/details?id=com.nglreactnative&hl=en_US&gl=US">Screenshot/Google Play Store</a></span>
</figcaption>
</figure>
<h2>Why are they so popular?</h2>
<p>We know teens are drawn to social platforms. These networks connect them with their peers, support their journeys towards forming identity, and provide them space for experimentation, creativity and bonding.</p>
<p>We also know they manage online disclosures of their identity and personal life through a technique sociologists call “audience segregation”, or “code switching”. This means they’re likely to <a href="https://oxford.universitypressscholarship.com/view/10.1093/oso/9780199381265.001.0001/oso-9780199381265-chapter-3">present themselves differently</a> online to their parents than they are to their peers. </p>
<p>Digital cultures have long used <a href="https://www.tandfonline.com/doi/abs/10.1080/1369118X.2015.1093531">online anonymity</a> to separate real-world identities from online personas, both for privacy and in response to online surveillance. And research has shown online anonymity <a href="https://spssi.onlinelibrary.wiley.com/doi/full/10.1111/1540-4560.00247">enhances self-disclosure and honesty</a>.</p>
<p>For young people, having online spaces to express themselves away from the adult gaze is important. Anonymous question apps provide this space. They promise to offer the very things young people seek: opportunities for self-expression and authentic encounters.</p>
<h2>Risky by design</h2>
<p>We now have a generation of kids growing up with the internet. On one hand, young people are hailed as pioneers of the digital age – and on the other, we fear for them as its innocent victims. </p>
<p>A recent <a href="https://techcrunch.com/2022/06/29/anonymous-social-apps-shift-their-attention-to-instagram-in-the-wake-of-snapchats-ban/">TechCrunch</a> article chronicled the rapid uptake of anonymous question apps by young users, and raised concerns about transparency and safety. </p>
<p>NGL <a href="https://www.businessinsider.com/ngl-anonymous-instagram-q-and-a-app-surging-in-popularity-2022-7">exploded in popularity</a> this year, but hasn’t solved the <a href="https://www.nbcnews.com/tech/internet/ngl-anonymous-message-app-instagram-tests-link-bullying-rcna36152">issue of</a> hate speech and bullying. Anonymous chat app <a href="https://arstechnica.com/information-technology/2017/04/yik-yak-is-dead-long-live-yik-yak/">YikYak</a> was shut down in 2017 after becoming littered with hateful speech – but has <a href="https://techcrunch.com/2021/08/16/yik-yak-is-back/">since returned</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A screenshot of a Tweet from @Mistaaaman" src="https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=296&fit=crop&dpr=1 600w, https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=296&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=296&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=372&fit=crop&dpr=1 754w, https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=372&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/473781/original/file-20220713-26-tsnljj.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=372&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Anonymous question apps are just one example of anonymous online spaces.</span>
<span class="attribution"><a class="source" href="https://twitter.com/Mistaaaman/status/1126585149561421824">Screenshot/Twitter</a></span>
</figcaption>
</figure>
<p>These apps are designed to hook users in. They leverage certain platform principles to provide a highly engaging experience, such as interactivity and gamification (wherein a form of “play” is introduced into non-gaming platforms).</p>
<p>Also, given their experimental nature, they’re a good example of how social media platforms have historically been developed with a “move fast and break things” attitude. This approach, first articulated by Meta CEO Mark Zuckerberg, has arguably reached its <a href="https://hbr.org/2019/01/the-era-of-move-fast-and-break-things-is-over">use-by date</a>.</p>
<p>Breaking things in real life is not without consequence. Similarly, breaking away from important safeguards online is not without social consequence. Rapidly developed social apps can have harmful <a href="https://www.mdpi.com/1660-4601/15/11/2471">consequences</a> for young people, including cyberbullying, cyber dating abuse, image-based abuse and even online grooming. </p>
<p>In May 2021, <a href="https://techcrunch.com/2021/08/03/anonymous-snapchat-app-sendit-surges-with-3-5m-installs-after-snap-bans-yolo-and-lmk/">Snapchat suspended</a> integrated anonymous messaging apps Yolo and LMK, after <a href="https://www.scribd.com/document/507515040/Snap-Lawsuit">being</a> <a href="https://www.courthousenews.com/wp-content/uploads/2022/01/rodriguez-meta-snap-complaint.pdf">sued</a> by the distraught parents of teens who committed suicide after being bullied through the apps. </p>
<p>Yolo’s developers <a href="https://arstechnica.com/tech-policy/2021/05/snap-cuts-off-yolo-lmk-anonymous-messaging-apps-after-lawsuit-over-teens-death/">overestimated</a> the capacity of their automated content moderation to identify harmful messages. </p>
<p>In the wake of these suspensions, Sendit soared through <a href="https://techcrunch.com/2022/03/17/following-suicides-and-lawsuits-snapchat-restricts-apps-building-on-its-platform-with-new-policies/">the app store charts</a> as Snapchat users sought a replacement. </p>
<p>Snapchat then <a href="https://www.snap.com/en-US/safety-and-impact/post/announcing-new-policies-for-snaps-developer-platform">banned</a> anonymous messaging from third-party apps in March this year, in a bid to limit bullying and harassment. Yet it <a href="https://www.youtube.com/watch?v=7jW-IRuXj4g">appears</a> Sendit can still be linked to Snapchat as a third-party app, suggesting the ban is being enforced inconsistently.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1546246767695519762&quot;}"></div>
<h2>Are kids being manipulated by chatbots?</h2>
<p>It also seems these apps may feature automated <a href="https://www.sciencedirect.com/science/article/pii/S2666827020300062">chatbots</a> posing as anonymous responders to prompt interactions – or at least that’s what staff at TechCrunch found. </p>
<p>Although chatbots can be harmless (or even helpful), problems arise if users can’t tell whether they’re interacting with a bot or a person. At the very least it’s likely the apps are not effectively screening bots out of conversations. </p>
<p>Users can’t do much either. If responses are <a href="https://screenrant.com/ngl-link-qna-instagram-anonymous-explained/">anonymous</a> (and don’t even have a profile or post history linked to them), there’s no way to know if they’re communicating with a real person or not.</p>
<p>It’s difficult to confirm whether bots are widespread on anonymous question apps, but we’ve seen them cause huge problems on other platforms – opening avenues for deception and exploitation.</p>
<p>For example, in the case of <a href="https://journals.uic.edu/ojs/index.php/fm/article/view/6426/5525">Ashley Madison</a>, a dating and hook-up platform that was hacked in 2015, bots were used to chat with human users to keep them engaged. These bots used fake profiles created by Ashley Madison employees. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/anorexia-coach-sexual-predators-online-are-targeting-teens-wanting-to-lose-weight-platforms-are-looking-the-other-way-162938">'Anorexia coach': sexual predators online are targeting teens wanting to lose weight. Platforms are looking the other way</a>
</strong>
</em>
</p>
<hr>
<h2>What can we do?</h2>
<p>Despite all of the above, <a href="https://dl.acm.org/doi/abs/10.1145/3134711">some research</a> has found many of the risks teens experience online pose only brief negative effects, if any. This suggests we may be overemphasising the risks young people face online.</p>
<p>At the same time, implementing parental controls to mitigate online risk is often in tension with young people’s <a href="https://journals.sagepub.com/doi/abs/10.1177/1461444816686318">digital rights</a>. </p>
<p>So the way forward isn’t simple. And just banning anonymous question apps isn’t the solution.</p>
<p>Rather than avoid anonymous online spaces, we’ll need to trudge through them together – all the while demanding as much accountability and transparency from tech companies as we can.</p>
<p>For parents, there are some <a href="https://www.esafety.gov.au/parents/resources">useful resources</a> on how to help children and teens navigate tricky online environments in a sensible way.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ending-online-anonymity-wont-make-social-media-less-toxic-172228">Ending online anonymity won't make social media less toxic</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/186647/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alexia Maddox does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Anyone who has trawled through an internet forum will have seen how anonymity can change people. What happens when young people are thrown into the mix?Alexia Maddox, Research Fellow, Blockchain Innovation Hub, RMIT, RMIT UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/824582017-08-15T13:59:37Z2017-08-15T13:59:37ZPopularity of latest ‘honesty app’ Sarahah shows how much we desire validation, whatever the cost<figure><img src="https://images.theconversation.com/files/182072/original/file-20170815-18355-1k6fyxj.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The Sarahah app urges users to send 'constructive' messages, but cyberbullying is rife.</span> <span class="attribution"><span class="source">Sarahaha</span></span></figcaption></figure><p>A new app called <a href="https://www.sarahah.com/Home/About">Sarahah</a> (which is Arabic for “honesty”) launched its English-language version this summer, promising an anonymous way of offering supportive criticism for teams in the workplace. It has since attracted <a href="http://www.bbc.co.uk/news/av/world-middle-east-40846321/sarahah-the-honesty-app-that-s-got-everyone-talking">300m users</a> and reached the top of Apple’s App Store download charts in more than 30 countries, but already users say they are <a href="http://uk.businessinsider.com/sarahah-app-store-bullying-harassment-2017-7">receiving harassing and obscene messages</a>.</p>
<p>Sarahah’s designers state the app allows users to “get honest feedback from your co-workers and friends” to “help people self-develop by receiving constructive anonymous feedback”. Users sign up for an account and receive a link they can share on other social media sites, inviting anyone with access to their profile to send messages anonymously – users sending messages don’t need accounts. In the Arab world where speech is more culturally policed, it was soon <a href="http://www.bbc.co.uk/news/blogs-trending-39067533">used for declarations</a> of love, homosexuality and much more that would otherwise be forbidden. The 29-year-old Saudi founder, Zain al-Abidin Tawfiq, obviously understood the potential for abuse and included blocking and filtering features to prevent misuse. But with only <a href="http://www.bbc.co.uk/news/av/world-middle-east-40846321/sarahah-the-honesty-app-that-s-got-everyone-talking">three staff members</a> the company cannot moderate millions of messages a day.</p>
<p>The English version has been widely adopted by the <a href="https://www.statista.com/statistics/326452/snapchat-age-group-usa/">Snapchat generation of under-25s</a>, reaching the top of the download charts only when Snapchat released updates that allowed its users <a href="http://mashable.com/2017/07/23/the-story-of-sarahah-app/#98NSlmAI0uq4">to link to their Sarahah accounts</a>. And while some users find that Sarahah and honesty apps like it deliver <a href="https://www.theverge.com/2017/8/13/16127170/sarahah-app-anonymous-messages-feedback">self-esteem boosting encouragement</a>, cyberbullying is also rife as people take advantage of the one-way anonymity to safely tell their friends and classmates all the things they wouldn’t dare to say to their faces.</p>
<p>In a review of the app on the Google app store, user Jordan Adams wrote:</p>
<blockquote>
<p>It was really cool at first because it was jokes with friends and stuff. Then someone sent my address and I got really freaked out. Then people were sending me a bunch of perverted stuff. I would delete my account but it won’t let me.</p>
</blockquote>
<p>Also on Google Play, parents Paul and Olivia Parsons wrote: </p>
<blockquote>
<p>Our daughter used it for a day and at first nice comments but slowly more mean comments started coming in … the last one before she deleted it told her to kill herself.</p>
</blockquote>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/182093/original/file-20170815-17703-fktlqx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">You can ask for comments, but you may not like them.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/young-group-people-looking-phone-they-572400388">djile/Shutterstock</a></span>
</figcaption>
</figure>
<h2>Not the first, nor the last</h2>
<p>For researchers like myself, there is a strong sense of Groundhog Day about Sarahah. The first in the long line of semi-anonymous comment apps was Formspring, launched in 2009 and referred to in <a href="http://www.newsday.com/long-island/suffolk/family-friends-shocked-at-cyberposts-after-teen-s-death-1.1827393">teenage suicide cases in the US</a> and in <a href="http://www.telegraph.co.uk/technology/social-media/8653867/Teenager-in-rail-suicide-was-sent-abusive-message-on-social-networking-site.html">Britain</a>. The owner redesigned the site and took part in bullying prevention strategies, but the <a href="https://techcrunch.com/2012/06/27/ask-fm-claims-its-overtaken-qa-giant-formspring-whats-going-on-here/">original concept was cloned</a> by a Latvian team as Ask.fm, and was also <a href="http://clok.uclan.ac.uk/8378/">linked to several teenage suicides</a>. </p>
<p>Other controversial anonymity apps have included Yik Yak – which <a href="https://www.change.org/p/tyler-droll-and-brooks-buffington-shut-down-the-app-yik-yak">closed this year</a> – and After School and Secret. They all offer the same thing: a tantalising opportunity for the user to find out what people “really” think of them, combined with the temptation for the sender to be brutally cruel to someone who has “asked for it”.</p>
<p>In <a href="http://clok.uclan.ac.uk/8378/1/8378_binns.pdf">my research of Ask.fm and Formspring</a>, teenage girls were split between those who strongly blamed the bullies for “sending hate” and those who blamed the receiver for signing up to the service in the first place. Some girls said people who complained about bullying on anonymous sites were attention-seeking, shouldn’t be online if they were so sensitive, and shouldn’t “act surprised” that the comments weren’t all positive.</p>
<p>This same victim-blaming is already apparent in <a href="https://play.google.com/store/apps/details?id=com.sarahah.android&hl=en_GB">Sarahah’s reviews</a>, some of which appear to have been repeatedly cut and pasted while awarding the app five stars. An example: </p>
<blockquote>
<p>For all you people complaining that this promotes bullying is totally wrong. It is completely the user’s fault for putting themselves online for anyone to say anything anonymously about them. It’s simple, if you don’t want to get bullied, just don’t use the app. Don’t fish for comments and complain.</p>
</blockquote>
<h2>Peer review</h2>
<p>This victim-blaming disregards the enormous drive young people have for validation from their peers, which is unfortunately strongest among the more sensitive souls: those who don’t fit in, or who may have already experienced bullying. Rachel Simmons, in <a href="https://www.amazon.co.uk/d/cka/Odd-Girl-Out-Revised-Updated-Rachel-Simmons/0547520190">Odd Girl Out</a>, her work on teenage girls, described this desire to ascertain social worth as a “<a href="https://books.google.co.uk/books?id=HY0PC1g2aW8C&pg=PA133&lpg=PA133&dq=toxic,+self-reinforcing+cycle+rachel+simmons&source=bl&ots=2ocjA9eIwH&sig=rjypxB3zoqEIhG5Uwr_cHU3tytw&hl=en&sa=X&ved=0ahUKEwio1MDi-9jVAhVHKcAKHVRpCWIQ6AEINDAC#v=onepage&q=toxic%2C%20self-reinforcing%20cycle%20rachel%20simmons&f=false">toxic, self-reinforcing cycle</a>”. One-way anonymous apps such as Sarahah lure in users with the promise of peer validation much like the promise of water in a desert. But comments can be particularly hurtful, because they come from people who know the users well: they know who you fancy, what you wore to the party, what you said – and they can use it against you.</p>
<p>How to tackle this problem? The cyclical appearance of these apps and their huge popularity shows they are meeting a deep need and won’t be easily eradicated, however regularly they cause problems – or even suicides. But there are some basic safeguards to take: most obviously to hire large numbers of human moderators, to create and monitor a prominent “report abuse” button, and to partner with experts in bullying prevention, something <a href="http://www.childnet.com/blog/askfm-support-childnet">Ask.fm has now done</a>.</p>
<p>However, these are the actions of established companies, not barely-funded start-ups. Perhaps the responsibility really lies with the app stores that host them: Google and Apple. These well-staffed, profitable companies could insist that semi-anonymous messaging services meet basic standards before they appear on the store, rather than simply sticking a “parental guidance” rating on them that most parents will never see. There are plenty of examples of how these apps go wrong. It’s about time they started learning from past mistakes.</p>
<p class="fine-print"><em><span>Amy Binns does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Apps inviting anonymous comments play upon our desire to know our social standing, but are an open goal for bullies.Amy Binns, Senior Lecturer, Journalism and Digital Communication, University of Central LancashireLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/288732014-07-07T16:10:26Z2014-07-07T16:10:26ZSocial media users won’t fight cyberbullying until they imagine what it’s like to be bullied<figure><img src="https://images.theconversation.com/files/53180/original/56fhcz52-1404735189.jpg?ixlib=rb-1.1.0&rect=4%2C1014%2C2848%2C1952&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The physical damage is not always apparent.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/spookman01/6790694167">spookman01</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Estimates show that millions of people have been victims of cyberbullying. Sadly, this includes the famous cases where emotional distress caused <a href="http://www.newyorker.com/reporting/2012/02/06/120206fa_fact_parker?currentPage=all">Tyler Clementi</a>, <a href="http://www.torquayheraldexpress.co.uk/Inquest-hears-bullied-Brixham-student-Izzy-Dix/story-20315950-detail/story.html">Izzy Dix</a>, <a href="http://www.independent.com.mt/articles/2014-03-30/news/troubled-teenager-tormented-on-askfm-4433281026/">Lisa Marie Zahra</a>, and dozens of other victims to take their own lives. These bullies are present on almost all Internet forums, but some websites are particularly notorious. 
For instance, a recent Time magazine article <a href="http://time.com/2926428/ask-fm-the-antisocial-network/?pcd=hp-magmod">described</a> Ask.fm as a website that was a factor in at least 16 deaths. </p>
<p>Many safety measures, such as report buttons, have been tried to combat cyberbullying. Paradoxically, social media users often do not want their chats to be policed.</p>
<p>An example is <a href="http://digg.com/">Digg</a>, a social news site on which people can vote web content up or down, called “digging” and “burying”, respectively. From early 2009 to late 2010, a large group of <a href="http://www.cnet.com/uk/news/report-conservative-groups-gaming-digg/">users banded together</a> to control what appeared on the front page of Digg. These people searched Digg’s pages to find liberal and anti-conservative users. They then used the bury button to force those users’ stories off the front page. </p>
<p>In response, Digg got rid of the bury button. This enraged and upset the general Digg audience, who liked having both the digg and bury buttons to express their opinions. In no time, Digg <a href="http://www.forbes.com/sites/insertcoin/2012/07/13/facebook-didnt-kill-digg-reddit-did/">visits fell greatly</a>.</p>
<p>So social media users value the freedom to express their opinion online without meddling. That is why they get angry with anyone who hinders this. There are more freedom-lovers than bullied victims. For social media companies and politicians alike, the objection of the many drowns out the lament of the few. But they have a duty to protect people from harm. This creates a catch-22 situation. </p>
<p>There may be a way out. Cyberbullying can be curbed but it depends on how and when chats are policed.</p>
<h2>How you intervene matters</h2>
<p>I have studied how to combat cyberbullying and <a href="http://link.springer.com/article/10.1007%2Fs10551-013-1806-z">I found</a> that users are fine with their chats being policed if the decision is backed up with a story of the user being bullied.</p>
<p>One group of social media users was asked to imagine themselves as the main character in a cyberbullying story. A second group only read the story without imagining. A third group read dull facts about cyberbullying. After they were done, all groups were asked if they would allow their conversations to be policed. Only the first group accepted internet policing.</p>
<p>Try it for yourself:</p>
<blockquote>
<p>Imagine a troll posted hundreds of messages in the past month, depicting you as a talentless, sex-crazed swindler. Then the bully created a profile under your name and left obscene messages on your own wall. Now not only do you get daily death threats, but so do your family and friends. You feel humiliated, helpless and abused, and your professional and social lives suffer.</p>
</blockquote>
<p>Wouldn’t you want someone to intervene?</p>
<h2>Where things get murky</h2>
<p>So the best way to have social media users accept policing is through a cyberbullying story in which they imagine themselves as the victim. Ask.fm, Facebook, and Twitter all have buttons to report another user’s abusive activity. These reports hint at which chats need policing. But it is difficult to draw the line between simple teasing and cyberbullying. </p>
<p>Consider these two examples. When in 2012 Besseres Hannover, a German right-wing extremist group, was charged with inciting racial hatred, Twitter blocked its account. Yet when in the same year Hamza Kashgari, a Saudi writer, was deemed a blasphemer by his country’s authorities for a poem and Twitter was filled with hate speech against him, the company allowed the cyber-harassment to continue. Kashgari suffered emotional distress. </p>
<p>We need criteria to judge whether a chat has escalated and intervention is necessary. Article 29, section 2, of the Universal Declaration of Human Rights (1948) provides this:</p>
<blockquote>
<p>In the exercise of one’s rights and freedoms, everyone shall be subject only to such limitations as are determined by law solely for the purpose of securing due recognition and respect for the rights and freedoms of others and of meeting the just requirements of morality, public order and the general welfare in a democratic society.</p>
</blockquote>
<p>When cyberbullying disturbs social networks in this way, social media companies and politicians should not wash their hands in innocence and ignore what is happening. </p>
<p>Instead, paraphrasing Machiavelli, I believe it becomes their responsibility to use the means set out above to protect society from harm and put an end to the online aggression.</p>
<p class="fine-print"><em><span>Tom van Laer does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Estimates show that millions of people have been victims of cyberbullying. Sadly, this includes the famous cases where emotional distress caused Tyler Clementi, Izzy Dix, Lisa Marie Zahra, and dozens of…Tom van Laer, Assistant Professor of Consumer Research, ESCP Business SchoolLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/169812013-08-13T05:20:06Z2013-08-13T05:20:06ZWith the right tech, online bullies can be outsmarted<figure><img src="https://images.theconversation.com/files/29085/original/fcf6ybxh-1376313589.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cyber-bullies can have access to potential victims round-the-clock.</span> <span class="attribution"><span class="source">wentongg</span></span></figcaption></figure><p>Recent revelations about the <a href="http://www.bbc.co.uk/news/uk-23654329">frequency</a> with which children experience cyber-bullying have caused alarm among parents, advertisers that feature on social media sites and even the <a href="http://www.techdirt.com/articles/20130808/17522624116/uk-prime-minister-calls-askfm-vile-site-blames-it-behavior-some-vile-users.shtml">Prime Minister</a>.</p>
<p>Social media services such as Facebook, Twitter and ask.fm enable people from across the world and various walks of life to come together and share materials and experiences. However, they also present the classical dual-use dilemma, whereby technology that is used for good can also be exploited for harm.</p>
<p>Cyber-bullying is one such consequence. Perpetrators have direct and easy access to potential victims 24 hours a day, particularly since many users can now access these sites on mobile phones. The reach of such media is also practically global, so victimisation doesn’t end with the removal of physical proximity (as it does in traditional offline bullying).</p>
<p>Arguments are often made that victims should simply disengage from the social media used by perpetrators of bullying. However, the reality is not that simple. Social media sites are now an integral part of young people’s daily lives and are becoming ingrained in the social fabric of society. Disengaging from such social media can often mean disengaging from one’s friends and family.</p>
<p>We have to accept that young people are going to continue to use social networks, so it would be wise to think about how we can make those networks safe for them, using technological know-how. Technology is not an answer in its own right, but it can reinforce the excellent education work carried out by charities such as Beat Bullying. Used wisely, it can be a linchpin in detecting and apprehending cyber-bullies.</p>
<p>For a start, social networks no longer need to manually read the massive volume of online communications that take place between users to identify bullies. Aggressive and abusive language can be automatically flagged. But the concept of identity can be fluid in the online world and this has enabled bullies to flourish. It is easy to assume different faces online in a way that is impossible in real life, so cyber-bullies can hide their true selves and even switch identities to continue to victimise someone if they have been pulled up for bad behaviour under another persona.</p>
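<p>To give a feel for what automatic flagging involves, here is a deliberately minimal sketch. Production moderation systems rely on trained classifiers rather than keyword lists; the patterns and function names below are hypothetical illustrations only.</p>

```python
import re

# Hypothetical patterns for illustration; real systems learn these
# signals from labelled data rather than hand-written lists.
ABUSIVE_PATTERNS = [r"\bidiot\b", r"\bloser\b", r"\bnobody likes you\b"]

def flag_message(text: str) -> bool:
    """Return True if the message matches any abusive pattern."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in ABUSIVE_PATTERNS)

print(flag_message("You're such a loser"))   # True
print(flag_message("See you at practice"))   # False
```

<p>Even this crude approach shows why human moderators no longer need to read every message: the bulk of traffic can be screened automatically, with only flagged items escalated for review.</p>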
<p>One example of how technology can be used to fight cyber-bullies addresses this problem in particular. At the <a href="http://www.comp.lancs.ac.uk/isis/">Isis</a> project, we work on resolving the identities of individuals and groups online to make it hard for perpetrators to hide their identities or use multiple personae. By analysing the language used in online communications we can detect key characteristics that distinguish the online interactions of one person from those of another. Social network hosts can then automatically compare communications originating from multiple identities to detect if the same person or group is hiding behind more than one identity and take the necessary action against them if they step out of line.</p>
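<p>One simple way to compare the language of two accounts, in the spirit of the identity-resolution work described above (though not the Isis project's actual method, which is not published in this article), is to build character n-gram frequency profiles and measure their similarity. Everything below is an illustrative toy, with made-up message text.</p>

```python
from collections import Counter
from math import sqrt

def ngram_profile(text: str, n: int = 3) -> Counter:
    """Character n-gram frequency profile of a text."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two frequency profiles (0 to 1)."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical messages from three accounts; a high score between two
# accounts hints (it never proves) that the same person wrote both.
account1 = "u r so dumb lol why do u even post"
account2 = "u r so boring lol why do u even try"
account3 = "I respectfully disagree with your assessment."

print(cosine_similarity(ngram_profile(account1), ngram_profile(account2)))
print(cosine_similarity(ngram_profile(account1), ngram_profile(account3)))
```

<p>The first pair shares many writing habits and scores far higher than the second, which is the kind of signal a platform could use to decide when two personas merit closer scrutiny before taking action.</p>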
<p>Another technological solution to this growing problem is to actively engage young people in designing the social networks they use. The <a href="http://eprints.lancs.ac.uk/54400/">UDesignIt</a> project, for example, calls on young people to collaborate on designing their social media environments. The sharp distinction between bully and victim is softened by this collaborative effort and the social media space becomes a safer place to be. </p>
<p>Parents and schools have a role to play in both highlighting the risks posed by online interactions and encouraging standards of good behaviour online (just as they do offline). They also know how best to support victims of cyber-bullying when it happens. But it is as unfair to expect victims to miss out on the benefits of social media as it is to tell a mugging victim to stop walking the streets at night. Thinking smart on this front can help them to have the best of both worlds.</p>
<p class="fine-print"><em><span>Awais Rashid is the director of, and a shareholder in, Isis Forensics Ltd. He receives funding from the EPSRC.</span></em></p>Awais Rashid, Director of Security Lancaster Research Centre, Lancaster University. Licensed as Creative Commons – attribution, no derivatives.