Facebook ethics – The Conversation
Cambodia PM Hun Sen will shut down opposition on election day – even if he can no longer threaten voters on Facebook<figure><img src="https://images.theconversation.com/files/535079/original/file-20230630-14093-3ojj3y.jpg?ixlib=rb-1.1.0&rect=8%2C25%2C5746%2C3879&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cambodian PM Hun Sen takes a selfie – but where will he post it now?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/cambodias-prime-minister-hun-sen-takes-selfies-with-a-news-photo/1258807502?adppopup=true">Tang Chhin Sothy/AFP via Getty Images</a></span></figcaption></figure><p>Cambodia’s Prime Minister Hun Sen will no longer be able to use his <a href="https://www.nytimes.com/2023/06/30/world/asia/cambodia-hun-sen-meta-facebook.html">Facebook page</a> to air threats of violence against opposition supporters – but that doesn’t mean he can’t still suppress their vote as the country <a href="https://www.nytimes.com/2023/06/30/world/asia/cambodia-hun-sen-meta-facebook.html">prepares for a general election</a>.</p>
<p>On June 30, 2023, the Facebook page of Hun Sen – who has ruled the country as leader of the Cambodian People’s Party for almost four decades – <a href="https://www.bbc.com/news/world-asia-66062752">appeared to have been deleted</a>. It wasn’t immediately clear whether <a href="https://www.bbc.com/news/world-asia-66062752">Hun Sen had removed the page</a> or Meta had taken it down. But it follows a <a href="https://www.oversightboard.com/news/656303619335474-oversight-board-overturns-meta-s-decision-in-cambodian-prime-minister-case/">recommendation by the oversight board</a> of Facebook’s parent company to “immediately suspend Hun Sen’s Facebook page and Instagram account for six months” over a video in which he calls on political opponents who allege vote-rigging to choose between the “legal system” and “a bat.” In the video posted on Facebook on Jan. 9, Hun Sen also threatens to “gather CPP people to protest and beat (opposition) up.”</p>
<p>The decision comes as a slap in the face for Hun Sen, who <a href="https://abcnews.go.com/Technology/wireStory/cambodias-prime-minister-hun-sen-huge-facebook-fan-100535327">had regularly posted on Facebook</a> to his 14 million followers. But as an <a href="https://thunderbird.asu.edu/about/people/staff-faculty/sophal-ear">expert on Cambodian politics</a>, I know it will do little to affect the result of the general election scheduled for July 23, 2023. Cambodia has had Hun Sen as prime minister <a href="https://apnews.com/article/cambodia-hun-sen-hun-manet-prime-minister-0095b3362ca2d5af4f14dd77c76ef351">for 38 years</a>. And recent events have only tightened Hun Sen’s grip on power.</p>
<h2>Many parties, no opposition</h2>
<p>Voters heading to the polls will again be presented with a lack of real choice – as has been the case in the six national parliamentary ballots held since <a href="https://www.bbc.com/news/world-asia-44966916">nominally democratic elections were restored</a> in 1993.</p>
<p>It isn’t that voters will lack parties to choose among on July 23. In fact, numerous parties will appear on the ballot alongside the ruling Cambodian People’s Party. In the <a href="https://www.theguardian.com/world/2018/jul/29/cambodia-hun-sen-re-elected-in-landslide-victory-after-brutal-crackdown">2018 national election</a> there were 19 parties other than the CPP.</p>
<p>The problem for democracy watchers is that the list of parties allowed to run does not include the main opposition party, the <a href="https://thediplomat.com/tag/cambodia-national-rescue-party-cnrp/">Cambodia National Rescue Party</a>. The CNRP was conveniently <a href="https://www.loc.gov/item/global-legal-monitor/2017-12-06/cambodia-supreme-court-dissolves-main-opposition-party/">dissolved on Nov. 16, 2017</a>, by order of the Cambodian Supreme Court – which has as its head a permanent committee member of Hun Sen’s CPP.</p>
<p>Further, the Candle Light Party – the last vestige of <a href="https://apnews.com/article/cambodia-opposition-party-election-hun-sen-63659ff8f2de992d84d2be748afbab8b">real, credible opposition in Cambodia</a> – was not permitted to register for the forthcoming election for bureaucratic reasons. The missing paperwork that prevented registration is <a href="https://apnews.com/article/cambodia-election-candlelight-party-deny-registration-7436b0572eefb9b5be3fa724d3cb2fcb">believed by CLP supporters</a> to have been taken during a police raid on opposition headquarters years ago.</p>
<p>These measures build on decades in which Hun Sen and his ruling CPP have <a href="https://www.brusselstimes.com/141921/how-hun-sen-killed-democracy-in-cambodia">removed real choice</a> from Cambodian ballots. And for Hun Sen and the CPP it has been effective: In the last election, held in 2018, the CPP <a href="https://www.aljazeera.com/news/2018/7/30/cambodians-spoil-ballots-to-protest-poll-critics-labelled-a-sham">garnered 77% of the vote</a> and took all 123 seats in the National Assembly.</p>
<h2>Khmer Rouge commander to autocratic leader</h2>
<p>Hun Sen rose to power after being installed as deputy prime minister and foreign minister by the Vietnamese forces that <a href="https://www.history.com/this-day-in-history/pol-pot-overthrown">liberated Cambodia in 1979</a> from the Khmer Rouge – a murderous regime in which <a href="https://www.hrw.org/report/2015/01/12/30-years-hun-sen/violence-repression-and-corruption-cambodia">Hun Sen served as a commander</a> – and then occupied the country for a decade.</p>
<p>With his country still under Vietnamese occupation, Hun Sen became prime minister in 1985 after his predecessor, Chan Sy, died in office. Since then, he has used the power of incumbency – along with a <a href="https://www.washingtonpost.com/archive/politics/1997/09/05/un-office-says-hun-sen-forces-executed-40/20d602e8-9078-41eb-8c34-2e385e86bcc7/">large dose of brute force</a> – to remain in office. </p>
<p>Even when the CPP <a href="https://www.washingtonpost.com/archive/politics/1993/06/11/phnom-penh-rejects-results-of-election/c43a7f1e-abcf-4ebd-b3b2-fe757f96f930/">lost the popular vote in 1993</a>, Hun Sen was able to elbow his way into a prime ministership-sharing position as “second prime minister” with equal power to the “first prime minister,” Prince <a href="https://www.reuters.com/world/asia-pacific/former-cambodian-prime-minister-prince-norodom-ranariddh-has-died-information-2021-11-28/">Norodom Ranariddh</a>, in a deal engineered by Ranariddh’s father, King Norodom Sihanouk.</p>
<p>After falling out with his co-premier, Hun Sen <a href="https://www.hrw.org/news/2007/07/27/cambodia-july-1997-shock-and-aftermath">orchestrated a coup in 1997</a> and replaced Norodom Ranariddh. In <a href="https://doi.org/10.1080/00049910050007032">an election the following year</a>, Hun Sen resumed the role of sole prime minister and embarked on a campaign of repression – arranging for political enemies to be <a href="https://www.hrw.org/report/2015/01/12/30-years-hun-sen/violence-repression-and-corruption-cambodia">arrested, jailed and sometimes exiled</a>.</p>
<p>He let his guard down in 2012 by allowing opposition leaders Kem Sokha and Sam Rainsy to <a href="https://www.loc.gov/item/lcwaN0008472/">form the opposition Cambodia National Rescue Party</a>. The CNRP came within a whisker of defeating the CPP in the 2013 election – some might even argue that it did, but for who <a href="https://www.reuters.com/article/us-cambodia-election-count/cambodia-election-crisis-deepens-as-opposition-rejects-results-idUSBRE97B02I20130812">controlled the counting of the votes</a>.</p>
<p>Since then, attempts to mount opposition to the CPP have been further blunted by the fact that Cambodia’s economy and society have undergone remarkable change – allowing Hun Sen to <a href="https://www.prnewswire.com/news-releases/prime-minister-hun-sen-shares-message-of-economic-growth--covid-response-success-with-north-american-diaspora-301546659.html">claim credit</a> as <a href="https://www.khmertimeskh.com/501245617/cambodias-economy-resilient-despite-external-factors-says-pm-hun-sen/">a sound manager of the economy</a>. Until the COVID-19 pandemic, Cambodia’s annual gross domestic product growth averaged nearly 8% <a href="https://www.worldbank.org/en/country/cambodia/overview">from 1998 through 2019</a>. Meanwhile, gross national income based on an average individual’s purchasing power <a href="https://data.worldbank.org/indicator/NY.GNP.PCAP.PP.CD?locations=KH">has also grown sixfold</a> since 1995, from US$760 to $5,080.</p>
<p>It has come at a cost, though. Economic and infrastructure growth has come <a href="https://www.reuters.com/article/cambodia-protests/cambodian-farmers-rise-up-over-land-grabbing-idINSGE62I07I20100319">on the back of a land grab</a> that has disadvantaged rural farmers. I heard one farmer describe economic development as meaning “they build a road and steal my land.”</p>
<figure class="align-center ">
<img alt="Two men in hard hats shake hands" src="https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/535086/original/file-20230630-37566-mwecug.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Cambodia Prime Minister Hun Sen shakes hands with China’s ambassador to Cambodia, Wang Wentian.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/cambodias-prime-minister-hun-sen-shakes-hands-with-chinas-news-photo/1258495631?adppopup=true">Tang Chhin Sothy/AFP via Getty Images</a></span>
</figcaption>
</figure>
<p>And frequently that road has been Chinese-built with loans that the <a href="https://www.voanews.com/a/cambodia-seeks-more-loans-from-beijing-amid-fears-of-debt-trap-/6943062.html">Cambodian people and their progeny will have to repay</a>. </p>
<h2>From autocracy to nepotocracy?</h2>
<p>Yet, Hun Sen is unwilling to open his record to the scrutiny of voters or a free press.</p>
<p>In advance of the July 23 vote, the government has cracked down on independent media. One of the last truly independent outlets, the Voice of Democracy, was <a href="https://www.bbc.com/news/world-asia-64621595">shuttered by Hun Sen</a>. Its crime? To publish a story reporting that the <a href="https://www.voanews.com/a/hun-sen-s-eldest-son-emerges-as-likely-successor-in-cambodia/7118136.html">prime minister’s son and heir apparent</a> signed, on behalf of his father, an official government donation to Turkey after the earthquake. Only the prime minister is allowed to sign off on foreign aid packages, and Hun Sen said the report had damaged the government’s reputation.</p>
<p>The source of the story was a senior government official. Yet Voice of Democracy was blamed and told to apologize. It did – and was still shuttered.</p>
<p>While Hun Sen has been successful in controlling the media and suppressing opposition in Cambodia, he is unable to prevent international scrutiny and sanction.</p>
<p>Cambodia’s anti-democratic rule and human rights abuses have been <a href="https://www.europarl.europa.eu/news/en/press-room/20230310IPR77236/human-rights-breaches-in-iran-tunisia-and-cambodia">condemned by the European Union</a>, <a href="https://www.reuters.com/article/uk-cambodia-politics-idAFKBN1DE2LY">the White House</a> and <a href="https://www.ohchr.org/en/press-releases/2023/03/cambodia-un-experts-condemn-verdict-against-opposition-leader-kem-sokha">the United Nations</a>.</p>
<p>Even prior to the most recent crackdown on opposition parties and independent press, the U.S. had <a href="https://home.treasury.gov/news/press-releases/jy0475">placed some Cambodian generals on the Global Magnitsky Human Rights Accountability list</a>, used to sanction “perpetrators of serious human rights abuse and corruption around the world.” The EU, for its part, <a href="https://ec.europa.eu/commission/presscorner/detail/en/IP_20_1469">cut by 20% the number of Cambodian goods eligible for zero duty imports</a> over human rights concerns – a move that will cost Cambodia an estimated 1 billion euros ($1.1 billion) in annual revenue.</p>
<p>But such moves have done little to nudge Cambodia toward democratic practices – and neither will Facebook’s decision to deprive Hun Sen of his social media account.</p>
<p class="fine-print"><em><span>Sophal Ear does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Sophal Ear, Associate Professor in the Thunderbird School of Global Management, Arizona State University. Licensed as Creative Commons – attribution, no derivatives.
Facebook is merging Messenger and Instagram chat features. It’s for Zuckerberg’s benefit, not yours<p>Facebook Messenger and Instagram’s direct messaging services will be integrated into one system, Facebook has <a href="https://about.instagram.com/blog/announcements/say-hi-to-messenger-introducing-new-messaging-features-for-instagram">announced</a>.</p>
<p>The merge will allow shared messaging across both platforms, as well as video calls and the use of a range of tools drawn from both platforms. It’s currently being rolled out across countries on an opt-in basis, but hasn’t yet reached Australia.</p>
<p>Facebook CEO Mark Zuckerberg <a href="https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/">announced</a> plans in March last year to integrate Messenger, Instagram Direct and WhatsApp into a unified messaging experience. </p>
<p>At the crux of this was the goal of implementing end-to-end encryption across the whole messaging “ecosystem”.</p>
<p>Ostensibly, this was part of Facebook’s renewed focus on privacy, in the wake of several highly publicised scandals. Most notable was its poor data protection that allowed political consulting firm <a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">Cambridge Analytica</a> to steal data from 87 million Facebook accounts and use it to target users with political ads ahead of the 2016 US presidential election.</p>
<p>In a <a href="https://about.fb.com/news/2020/09/new-messaging-features-for-instagram/">statement</a> released yesterday on the new merge, Instagram CEO Adam Mosseri and Messenger vice president Stan Chudnovsky wrote:</p>
<blockquote>
<p>… one out of three people sometimes find it difficult to remember where to find a certain conversation thread. With this update, it will be even easier to stay connected without thinking about which app to use to reach your friends and family.</p>
</blockquote>
<p>While that may seem harmless, it’s likely Facebook is actually attempting to make its apps inseparable, ahead of a <a href="https://www.bloomberg.com/news/articles/2020-09-15/ftc-said-to-prepare-possible-antitrust-lawsuit-against-facebook">potential anti-trust lawsuit</a> in the US that may seek to force the company to sell Instagram and WhatsApp.</p>
<h2>Together, with Facebook, 24/7</h2>
<p>The Messenger/Instagram Direct merge will <a href="https://mashable.com/article/facebook-messenger-instagram/">extend to</a> features rolled out during the pandemic, such as the “<a href="https://about.fb.com/news/2020/09/introducing-watch-together-on-messenger/">Watch Together</a>” tool for Messenger. As the name suggests, this lets users watch videos together in real time. Now, both Messenger and Instagram users will be able to use it, regardless of which app they’re on.</p>
<p>With the integration, new privacy challenges emerge. Facebook has <a href="https://about.fb.com/news/2020/09/privacy-matters-cross-app-communication/">already acknowledged</a> this. And these challenges will present despite Facebook’s overarching privacy policy applying to every app in its app “family”. </p>
<p>For example, in the new merged messaging ecosystem, a user you previously blocked on Messenger won’t automatically be blocked on Instagram. Thus, the blocked person will be able to <a href="https://about.fb.com/news/2020/09/privacy-matters-cross-app-communication/">once again contact you</a>. This could open doors to a plethora of unexpected online abuse.</p>
<h2>Why this is good for Mark Zuckerberg</h2>
<p>This first step – and Facebook’s <a href="https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/">full roadmap</a> for the encrypted integration of WhatsApp, Instagram Direct and Messenger – has three clear outcomes.</p>
<p>Firstly, end-to-end encryption means Facebook will have <a href="https://www.justice.gov/opa/press-release/file/1207081/download">complete deniability</a> for anything that travels across its messaging tools. </p>
<p>It won’t be able to “see” the messages. While this might be good from a user privacy perspective, it also means anything from bullying, to <a href="https://milwaukeenns.org/2014/05/21/special-report-diploma-mill-scams-continue-to-plague-milwaukees-adult-students/">scams</a>, to illegal drug sales, to <a href="https://www.justice.gov/usao-ednc/pr/jacksonville-man-sentenced-child-pornography-case">paedophilia</a> can’t be policed if it happens via these tools. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facebooks-push-for-end-to-end-encryption-is-good-news-for-user-privacy-as-well-as-terrorists-and-paedophiles-128782">Facebook's push for end-to-end encryption is good news for user privacy, as well as terrorists and paedophiles</a>
</strong>
</em>
</p>
<hr>
<p>This would stop Facebook being blamed for hurtful or illegal uses of its services. As far as moderating the platform goes, Facebook would effectively become “invisible” (not to mention moderation is <a href="https://journals.sagepub.com/doi/10.1177/2056305120948186">expensive and complicated</a>). </p>
<p>This is all great news for Mark Zuckerberg, especially as Facebook stares down the barrel of <a href="https://www.theverge.com/2020/7/29/21335706/antitrust-hearing-highlights-facebook-google-amazon-apple-congress-testimony">potential anti-trust litigation</a>.</p>
<p>Secondly, once the apps are merged, functionally they will no longer be separate platforms. They will still <em>exist</em> as separate apps with some separate features, but the vast amount of personal data underpinning them will live in one giant, shared database. </p>
<p>Deeper data integration will let Facebook know users more intimately. Moreover, it will be able to leverage this new insight to target users with more advertising and expand further.</p>
<p>Finally, and perhaps most concerning, is that by integrating its apps Facebook could legitimately respond to <a href="https://www.wsj.com/articles/ftc-preparing-possible-antitrust-suit-against-facebook-11600211840">anti-trust lawsuits</a> by saying it can’t separate Instagram or WhatsApp from the main Facebook platform – because they’re the same thing now. </p>
<p>And if they can’t be separated, there’s no way Facebook could sell Instagram or WhatsApp, even if it wanted to. </p>
<h2>100 billion messages a day</h2>
<p>The messaging traffic across Facebook’s platforms <a href="https://about.fb.com/news/2020/09/new-messaging-features-for-instagram/">is vast</a>, with more than 100 billion messages sent daily. And this has <a href="https://www.warc.com/newsandopinion/news/pandemic-lifts-social-media-use-but-for-how-long/43552">only</a> <a href="https://www.nytimes.com/interactive/2020/04/07/technology/coronavirus-internet-use.html">increased</a> during the COVID-19 pandemic.</p>
<p>With the sheer size of its user database, Facebook continues to either purchase, or squash, its competition. Concerns about the company being a monopoly aren’t without merit. </p>
<p><a href="https://www.theverge.com/2018/9/4/17816572/tim-wu-facebook-regulation-interview-curse-of-bigness-antitrust">Researchers</a> and <a href="https://www.theverge.com/2019/5/9/18538106/facebook-co-founder-chris-hughes-breakup-regulation-ftc-us-government">founding Facebook employees</a> have called to have the company split up – and for Instagram and Whatsapp to become separate again.</p>
<p>Just a few months ago, Facebook released its Instagram-housed tool <a href="https://about.instagram.com/blog/announcements/introducing-instagram-reels-announcement">Reels</a> which bears a striking resemblance to TikTok, another social app sweeping the globe. </p>
<p>It seems this is just another example of Facebook trying to use the sheer size of its network to stifle growing competition, aided (perhaps unwittingly) by Donald Trump’s anti-China sentiment.</p>
<p>If competition is important to encouraging innovation and diversity, then the newest development from Facebook discourages both these things. It further entrenches Facebook and its services into the lives of consumers, making it harder to pull away. And this certainly isn’t far from monopolistic behaviour.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/trumps-tiktok-deal-explained-who-is-oracle-why-walmart-and-what-does-it-mean-for-our-data-146566">Trump's TikTok deal explained: who is Oracle? Why Walmart? And what does it mean for our data?</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Tama Leaver receives funding from Australian Research Council (ARC); he is currently a Chief Investigator in the ARC Centre of Excellence for the Digital Child.</span></em></p>
Tama Leaver, Associate Professor in Internet Studies, Curtin University. Licensed as Creative Commons – attribution, no derivatives.
Deepfakes: Informed digital citizens are the best defence against online manipulation<figure><img src="https://images.theconversation.com/files/308506/original/file-20200105-11896-1hhkp8u.jpg?ixlib=rb-1.1.0&rect=0%2C298%2C3986%2C1999&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Facebook announced Jan. 6 it will remove videos edited to mislead in ways that ‘aren’t apparent to an average person,’ and are the product of artificial intelligence or machine learning. Here, Facebook CEO Mark Zuckerberg testifies at a hearing at the U.S. House Financial Services Committee on Capitol Hill in Washington, Oct. 23, 2019.
</span> <span class="attribution"><span class="source">AP Photo/Andrew Harnik</span></span></figcaption></figure><p>More than a decade ago, Internet analyst and new media scholar Clay Shirky said: “<a href="https://hbr.org/2008/06/next-generation-online-cons">The only real way to end spam is to shut down e-mail communication</a>.” Will shutting down the Internet be the only way to end <a href="https://timreview.ca/article/1282">deepfake</a> propaganda in 2020? </p>
<p>Today, anyone can create <a href="https://www.thefakenewsgenerator.com/">their own fake news</a> and also <a href="https://play.google.com/store/apps/details?id=com.breakyourownnews.breakyourownnews">break it</a>. Online propaganda is more misleading and manipulative than ever. </p>
<p><a href="https://www.creativebloq.com/features/deepfake-examples">Deepfakes</a>, a <a href="https://dx.doi.org/10.2139/ssrn.3213954">specific form of disinformation</a> that uses machine-learning algorithms to create audio and video of real people saying and doing things they never said or did, are <a href="https://www.theguardian.com/news/shortcuts/2019/aug/13/danger-deepfakes-viral-video-bill-hader-tom-cruise">moving quickly toward being indistinguishable from reality</a>. </p>
<p>Detecting disinformation powered by unethical uses of digital media, big data and artificial intelligence, and their spread through social media, is of the utmost urgency. </p>
<p>Countries must <a href="https://www.canada.ca/en/canadian-heritage/services/online-disinformation.html">educate and equip their citizens</a>. Educators also face real challenges in helping youth develop eagle eyes for deepfakes. If young people lack confidence in finding and evaluating reliable public information, their motivation for participating in or relying on our democratic structures will be increasingly at risk. </p>
<h2>Undermining democracy</h2>
<p><a href="https://arxiv.org/abs/1905.08233">It is now possible</a> to generate a video of a person speaking and making ordinary expressions from just a few or even a single image of this person’s face. Face swap apps such as <a href="https://apps.apple.com/fr/app/faceapp-%C3%A9diteur-ia-de-selfie/id1180884341">FaceApp</a> and lip-sync apps such as <a href="https://dubsmash.com/">Dubsmash</a> are examples of accessible user-friendly basic deepfake tools that people can use without any programming or coding background.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/zaos-deepfake-face-swapping-app-shows-uploading-your-photos-is-riskier-than-ever-122334">Zao's deepfake face-swapping app shows uploading your photos is riskier than ever</a>
</strong>
</em>
</p>
<hr>
<p>While the use of this technology may enrapture or stun viewers with its expert depictions in the entertainment and gaming industries, the sinister face of deepfakes is a serious threat to both people’s security and democracy.</p>
<p>The potential for deepfakes to be used as a weapon is increasing alarmingly, and many harms can be anticipated based on <a href="https://doi.org/10.1177/1365712718807226">people’s ability to create explicit content without others’ consent</a>.</p>
<p>It’s expected that people will use deepfakes to cyberbully, destroy reputations, blackmail, spread hate speech, incite violence, <a href="https://carnegieendowment.org/2019/09/05/campaigns-must-prepare-for-deepfakes-this-is-what-their-plan-should-look-like-pub-79792">disrupt democratic processes</a>, spread disinformation to targeted audiences and to commit <a href="https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402">cybercrime and frauds</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/pg5WtBjox-Y?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Danielle Citron, professor at Boston University School of Law, discusses how deepfakes undermine truth and threaten democracy.</span></figcaption>
</figure>
<h2>Deepfake detection</h2>
<p>Key players have ventured into finding a response to deepfake threats. </p>
<p>Facebook announced Jan. 6 it “<a href="https://about.fb.com/news/2020/01/enforcing-against-manipulated-media/">will strengthen its policy toward misleading manipulated videos that have been identified as deepfakes</a>.” The company says it will remove manipulated media that’s been “edited or synthesized — beyond adjustments for clarity or quality — in ways that aren’t apparent to an average person” and if the media is “the product of artificial intelligence or machine learning that merges, replaces or superimposes content onto a video, making it appear to be authentic.”</p>
<p>The news follows Facebook’s “<a href="https://ai.facebook.com/blog/deepfake-detection-challenge-launches-with-new-data-set-and-kaggle-site/">deepfake challenge</a>,” which aims to <a href="https://www.vice.com/en_ca/article/8xwqp3/facebook-deepfake-detection-challenge-dataset">design new tools</a> that detect manipulated media content. The challenge is supported by Microsoft, a consortium on artificial intelligence and a US$10-million fund. </p>
<p>In late October, Facebook CEO Mark Zuckerberg testified at a U.S. House of Representatives Financial Services Committee hearing in Washington about the company’s cryptocurrency plans, <a href="https://www.latimes.com/business/story/2019-10-23/facebook-zuckerberg-house-financial-services-committee">where Zuckerberg faced questions about what the company is doing to prevent deepfakes</a>. </p>
<p>The <a href="https://www.darpa.mil/program/semantic-forensics">Defense Advanced Research Projects Agency (DARPA)</a> of the U.S. Department of Defense is working on using specific types of algorithms to assess the integrity of digital visual media. </p>
<p>Some researchers discuss the <a href="https://doi.org/10.1109/AVSS.2018.8639163">use of convolutional neural networks</a> — a set of algorithms that loosely replicates the human brain, designed to analyse visual imagery and recognize patterns — to detect the inconsistencies across the multiple frames in deepfakes. Others propose <a href="https://doi.org/10.1109/WACVW.2019.00020">algorithms to detect completely generated faces</a>. </p>
<p>Hani Farid, an expert in digital forensics and <a href="https://www.wired.com/story/wired25-stories-people-racing-to-save-us/">one of the leading authorities on detecting fake photos</a>, and his student Shruti Agarwal at University of California, Berkeley are developing <a href="https://news.berkeley.edu/2019/06/18/researchers-use-facial-quirks-to-unmask-deepfakes/">software that uses the subtle characteristics of how a person speaks to distinguish that person from a faked version</a>.</p>
<p>Farid is also collaborating very closely with <a href="https://ict.usc.edu/profile/hao-li/">deepfake pioneer Hao Li</a> to confront the problem of “<a href="https://www.technologyreview.com/s/614083/the-worlds-top-deepfake-artist-is-wrestling-with-the-monster-he-created/">increasingly seamless off-the-shelf deception</a>.”</p>
<h2>YouTube nation</h2>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/308508/original/file-20200105-11924-fpulko.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">In 2017, 90 per cent of Canadians aged 18 to 24 were active YouTube users.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>What if we wake up tomorrow to a deepfake of <a href="https://time.com/person-of-the-year-2019-greta-thunberg/">Greta Thunberg, <em>Time</em> magazine’s 2019 Person of the Year</a>, accusing a specific organization of being the major catalyst of climate change? Would young people be skeptical of the information?</p>
<p>We are living in a digital era when many people expect every answer to be found through a Google search, a YouTube or Vimeo video or a TED talk. Nearly <a href="https://www.pewresearch.org/global/2018/01/11/people-in-poorer-countries-just-as-likely-to-use-social-media-for-news-as-those-in-wealthier-countries/pg_2018-01-11_global-media-habits_3-02/">100 per cent of Canadian youth aged 15 to 24</a> use the internet on a daily basis. Most follow news and current affairs through social media platforms such as Facebook, Twitter and Instagram. </p>
<p>In 2017, <a href="https://www.statista.com/statistics/484416/canada-youtube-penetration-by-age/">90 per cent of Canadians aged 18 to 24</a> were active YouTube users. </p>
<p>According to Statista, a company that provides market and consumer data, “as of May 2019, more than 500 hours of video were uploaded to YouTube every minute,” equating to “<a href="https://www.statista.com/statistics/259477/hours-of-video-uploaded-to-youtube-every-minute/">approximately 30,000 hours of newly uploaded content per hour</a>.” The company reports that between 2014 and 2019 “the number of video content hours uploaded every 60 seconds grew by around 40 percent.” </p>
<p>Many of today’s 18- to 24-year-old social media users recognize the agendas and algorithms behind the posts that pop up on their walls. In my PhD thesis research, I explored how 42 participants in this age group understood refugees in a context where ideas about refugees <a href="https://spectrum.library.concordia.ca/983399/">were deeply influenced by social media propaganda, fake news and disinformation</a>. I found that many aspired to become influencers and disrupt public commentary and media-generated messages in ways that resonate with <a href="https://www.theguardian.com/environment/2019/sep/27/climate-crisis-6-million-people-join-latest-wave-of-worldwide-protests">advocacy or activist campaigns today led by youth</a>.</p>
<p>The deepfake phenomenon is a new <a href="https://theconversation.com/zaos-deepfake-face-swapping-app-shows-uploading-your-photos-is-riskier-than-ever-122334">critical challenge</a> they, and all participants in our democracies, now face.</p>
<h2>Education for resilience</h2>
<p>In Canada, Journalists for Human Rights announced a new program, funded by Heritage Canada, <a href="https://jhr.ca/our-work/canada-combatting-misinformaton">to train journalists</a> and to enhance “<a href="https://www.newswire.ca/news-releases/launching-jhr-s-program-on-fighting-disinformation-through-strengthened-media-and-citizen-preparedness-in-canada--899686785.html">citizen preparedness against online manipulation and misinformation</a>.”</p>
<p><a href="https://www.cnn.com/interactive/2019/05/europe/finland-fake-news-intl/">Educators can play a key role</a> in fostering youth agency to detect deepfakes and reduce their influence. One challenge is ensuring youth learn critical media literacy skills while they continue to explore valuable resources online and build their capacities and knowledge to participate in democratic structures. </p>
<p>Following steps I have identified in the “<a href="https://theconversation.com/dont-be-a-bystander-five-steps-to-fight-cyberbullying-91440">Get Ready to Act Against Social Media Propaganda</a>” model — beginning with explaining stances on a controversial issue targeted through social media propaganda — educators can help youth discuss how they perceive and recognize deepfakes. They can explore the content’s origins, who it’s targeting, the reaction it’s trying to achieve and who’s behind it. </p>
<p>They can also discuss youth’s role and responsibility to respond and stand up to disinformation and potential digital strategies to pursue in this process. A well-equipped generation of digital citizens could be our best bet.</p>
<p class="fine-print"><em><span>Nadia Naffi receives funding from the National Bank to support the work of her Chair in Educational Leadership. This Chair focuses on training future experts in the field of educational technology, learning and development, and lifelong learning in the era of digital transformation and artificial intelligence. Nadia Naffi is affiliated with the Centre de recherche et d'intervention sur l'éducation et la vie au travail (CRIEVAT), the Observatoire international sur les impacts sociétaux de l'IA et du numérique (OBVIA), the Institut Technologies de l’information et Sociétés (ITIS), the Centre de recherche et d'intervention sur la réussite scolaire (CRIRES), and Milieux Institute for Arts, Culture and Technology.</span></em></p>The ability to detect and analyze deepfake videos is of the utmost urgency. Deepfakes are a serious threat to people’s security and our democratic institutions.Nadia Naffi, Assistant Professor, Educational Technology, Holds the Chair in Educational Leadership in the Sustainable Transformation of Pedagogical Practices in Digital Contexts, Université LavalLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1237672019-09-22T20:16:16Z2019-09-22T20:16:16ZUsers (and their bias) are key to fighting fake news on Facebook – AI isn’t smart enough yet<figure><img src="https://images.theconversation.com/files/293332/original/file-20190920-16165-xsg2z1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">On its own, human judgement can be subjective and skewed towards personal biases.</span> </figcaption></figure><p>The information we encounter online every day can be misleading, incomplete or fabricated. </p>
<p>Being exposed to “fake news” on social media platforms such as Facebook and Twitter can influence our thoughts and decisions. We’ve already seen misinformation <a href="https://www.vox.com/2017/4/28/15476142/facebook-report-trump-clinton-russia-us-presidential-election">interfere with elections</a> in the United States.</p>
<p>Facebook founder Mark Zuckerberg has repeatedly <a href="https://www.vox.com/2017/2/16/14640460/mark-zuckerberg-facebook-manifesto-letter">proposed artificial intelligence</a> (AI) as the <a href="https://techcrunch.com/2016/11/14/facebook-fake-news/">solution</a> to the fake news dilemma. </p>
<p>However, the issue likely requires high levels of human involvement, as many experts agree that AI technologies <a href="https://www.forbes.com/sites/charlestowersclark/2018/10/04/can-ai-put-an-end-to-fake-news-dont-be-so-sure/#352bc4532f84">need further advancement</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/we-made-deceptive-robots-to-see-why-fake-news-spreads-and-found-a-weakness-104776">We made deceptive robots to see why fake news spreads, and found a weakness</a>
</strong>
</em>
</p>
<hr>
<p>I and two colleagues have <a href="https://research.fb.com/programs/research-awards/proposals/the-online-safety-benchmark-request-for-proposals/">received funding</a> from Facebook to independently carry out research on a “human-in-the-loop” AI approach that might help bridge the gap. </p>
<p>Human-in-the-loop refers to the involvement of humans (users or moderators) to support AI in doing its job. For example, by creating training data or manually validating the decisions made by AI.</p>
<p>Our approach combines AI’s ability to process large amounts of data with humans’ ability to understand digital content. This is a targeted solution to fake news on Facebook, given its massive scale and subjective interpretation.</p>
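The division of labour described here can be sketched in a few lines of Python. Everything below – the classifier, the confidence threshold, the queue for human review – is a hypothetical illustration of the general human-in-the-loop pattern, not the actual research pipeline.

```python
# Minimal human-in-the-loop sketch (hypothetical names, not a real pipeline):
# the model decides clear-cut items at scale; uncertain ones go to people.

def route_item(item, model, confidence_threshold=0.9):
    """Return a decision for clear cases, or flag the item for human review."""
    label, confidence = model(item)  # e.g. ("fake", 0.97)
    if confidence >= confidence_threshold:
        return {"item": item, "label": label, "decided_by": "model"}
    return {"item": item, "label": None, "decided_by": "human_queue"}

# A toy "model" that is confident only about one telltale phrase.
def toy_model(text):
    if "miracle cure" in text:
        return ("fake", 0.95)
    return ("unclear", 0.5)

decisions = [route_item(t, toy_model)
             for t in ["miracle cure found!", "council meets on Tuesday"]]
# First item is decided by the model; second is routed to human moderators.
```

In practice, the humans' decisions on the queued items would flow back as new training data, closing the loop.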
<p>The dataset we’re compiling can be used to train AI. But we also want all social media users to be more aware of their own biases, when it comes to what they dub fake news. </p>
<h2>Humans have biases, but also unique knowledge</h2>
<p>Asking Facebook employees to make controversial editorial decisions in order to eradicate fake news is problematic, as <a href="https://doi.org/10.1145/3209978.3210094">our research found</a>. This is because the way people perceive content depends on their cultural background, political ideas, biases, and stereotypes.</p>
<p>Facebook has employed <a href="https://www.facebook.com/zuck/posts/10103695315624661">thousands</a> of people for content moderation. These moderators spend eight to ten hours a day looking at explicit and violent material such as pornography, terrorism, and beheadings, to decide which content is acceptable for users to see. </p>
<p>Consider them cyber janitors who clean our social media by removing inappropriate content. They play an integral role in shaping what we interact with.</p>
<p>A similar approach could be adapted to fake news, by asking Facebook’s moderators which articles should be removed and which should be allowed.</p>
<p>AI systems could do this automatically at a large scale by learning what fake news is from manually annotated examples. But even when AI can detect “forbidden” content, human moderators are needed to flag content that is controversial or subjective.</p>
<p>A famous example is the Napalm Girl image.</p>
<p>The Pulitzer Prize-winning photograph shows children and soldiers escaping from a napalm bomb explosion during the Vietnam War. The image was posted on Facebook in 2016 and <a href="https://www.theguardian.com/technology/2016/sep/09/facebook-reinstates-napalm-girl-photo">removed</a> because it showed a naked nine-year-old girl, contravening Facebook’s official <a href="https://www.facebook.com/communitystandards/">community standards</a>. </p>
<p>Significant community protest followed, as the iconic image had obvious historical value, and Facebook allowed the photo back on its platform.</p>
<h2>Using the best of brains and bots</h2>
<p>In the context of verifying information, human judgement can be subjective and skewed based on a person’s background and implicit bias. </p>
<p>In our <a href="https://www.damianospina.com/wp-content/uploads/2018/08/roitero2018how.pdf">research</a> we aim to collect multiple “truth labels” for the same news item from a few thousand moderators. These labels indicate the “fakeness” level of a news article.</p>
<p>Rather than simply collect the most popular labels, we also want to record moderators’ backgrounds and their specific judgements to <a href="https://doi.org/10.1145/3308560.3317307">track and explain ambiguity and controversy</a> in the responses.</p>
<p>We’ll compile results to generate a high-quality dataset, which may help us explain cases with high levels of disagreement among moderators. </p>
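One common way to quantify such disagreement is the entropy of the label distribution for each item: zero when moderators are unanimous, higher when their judgements split. The sketch below, with made-up labels, illustrates the idea; it is not the measure used in the research described above.

```python
from collections import Counter
from math import log2

def label_entropy(labels):
    """Shannon entropy of a list of 'fakeness' labels.
    0.0 means unanimous agreement; higher values mean more controversy."""
    counts = Counter(labels)
    total = len(labels)
    h = -sum((c / total) * log2(c / total) for c in counts.values())
    return max(0.0, h)  # clamp IEEE -0.0 to 0.0 in the unanimous case

# Hypothetical moderator judgements for two news items.
unanimous = ["fake"] * 10
contested = ["fake"] * 4 + ["misleading"] * 3 + ["true"] * 3

print(label_entropy(unanimous))  # 0.0 – everyone agrees
print(label_entropy(contested))  # ≈1.57 – high disagreement
```

Items with high entropy are exactly the controversial cases that a simple majority vote, or a binary comply/doesn't-comply rule, would paper over.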
<p>Currently, Facebook content is treated as binary – it either complies with the standards or it doesn’t.</p>
<p>The dataset we compile can be used to train AI to better identify fake news by teaching it which news is controversial and which news is plain fake. The data can also help evaluate how effective current AI is in fake news detection.</p>
<h2>Power to the people</h2>
<p>While benchmarks to evaluate AI systems that can detect fake news are significant, we want to go a step further.</p>
<p>Instead of only asking AI or experts to make decisions about what news is fake, we should teach social media users how to identify such items for themselves. We think an approach aimed at fostering information credibility literacy is possible.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/most-young-australians-cant-identify-fake-news-online-87100">Most young Australians can’t identify fake news online</a>
</strong>
</em>
</p>
<hr>
<p>In our ongoing <a href="https://www.uq.edu.au/news/article/2019/09/figuring-out-fake-news">research</a>, we’re collecting a vast range of user responses to identify credible news content. </p>
<p>While this can help us build AI training programs, it also lets us study the development of human moderator skills in recognising credible content, as they perform fake news identification tasks. </p>
<p>Thus, our research can help design online tasks or games aimed at training social media users to recognise trustworthy information.</p>
<h2>Other avenues</h2>
<p>The issue of fake news is being tackled in different ways across online platforms. </p>
<p>It’s quite often removed through a bottom-up approach, where users report inappropriate content, which is then reviewed and removed by the <a href="https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona">platform’s employees</a>.</p>
<p>The approach Facebook is taking is to <a href="https://www.theguardian.com/technology/2019/jul/31/facebook-says-it-was-not-our-role-to-remove-fake-news-during-australian-election">demote unreliable content</a> rather than remove it. </p>
<p>In each case, the need for people to make decisions on content suitability remains. The work of both users and moderators is crucial, as humans are needed to interpret guidelines and decide on the value of digital content, especially if it’s controversial. </p>
<p>In doing so, they must try to look beyond cultural differences, biases and borders.</p>
<p class="fine-print"><em><span>Gianluca Demartini receives funding from the Australian Research Council and Facebook. </span></em></p>Sometimes it feels like everybody on social media is fighting about what’s “right” and what’s “wrong”. Well, figuring out why we all have such unique opinions is now helping experts tackle fake news.Gianluca Demartini, Associate professor, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1189872019-06-18T20:53:28Z2019-06-18T20:53:28ZWith cryptocurrency launch, Facebook sets its path toward becoming an independent nation<figure><img src="https://images.theconversation.com/files/280095/original/file-20190618-118530-1j9hjdk.jpg?ixlib=rb-1.1.0&rect=94%2C12%2C756%2C465&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The world's newest country?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/video/clip-24607922-facebook-company-flag-waving-slow-motion-against">railway fx/Shutterstock.com</a></span></figcaption></figure><p>Facebook has announced a plan to launch <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">a new cryptocurrency named the Libra</a>, adding another layer to its efforts to dominate global communications and business. <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">Backed by huge finance and technology companies</a> including Visa, Spotify, eBay, PayPal and Uber – plus a ready-made <a href="https://newsroom.fb.com/company-info/">user base of 2 billion people</a> around the world – Facebook is positioned to pressure countries and central banks to cooperate with its <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">reinvention of the global financial system</a>.</p>
<p>In my view as a <a href="https://newhouse.syr.edu/faculty-staff/jennifer-grygiel">social media researcher and educator</a>, Facebook CEO Mark Zuckerberg is clearly seeking to give his company even more <a href="https://www.jstor.org/stable/41375924">political power on a global scale</a>, despite the potential dangers to society at large. In a sense, he is declaring that he wants Facebook to become a virtual nation, populated by users, powered by a self-contained economy, and headed by a CEO – Zuckerberg himself – who is <a href="https://www.vox.com/recode/2019/5/30/18644755/facebook-stock-shareholder-meeting-mark-zuckerberg-vote">not even accountable to his shareholders</a>.</p>
<p>Facebook <a href="https://theconversation.com/technology-giants-didnt-deserve-public-trust-in-the-first-place-106989">hasn’t behaved responsibly</a> in the past, and is still wrestling with significant public concerns – and investigations – about its <a href="https://www.cnbc.com/2019/06/13/facebook-investigations-by-eu-ireland-regulator-nearing-conclusions.html">privacy practices</a>, <a href="https://www.wsj.com/articles/in-facebooks-effort-to-fight-fake-news-human-fact-checkers-play-a-supporting-role-1539856800">information accuracy</a> and <a href="https://www.theguardian.com/technology/2019/apr/17/eu-tells-facebooks-nick-clegg-to-rethink-ad-funding-rules">targeted advertising</a>. Therefore, it’s important to see through the hype. People must <a href="https://www.jstor.org/stable/586053?seq=1">consider who is reshaping the world</a>, and whether they are doing it in the best interests of humankind – or whether they are just seeking to benefit the new class of elite technology executives. </p>
<p>Humanity needs ethical leadership, and time to think through the potential repercussions of rapid technological change. That’s why, in my view, Facebook’s cryptocurrency <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">should be blocked</a> by financial regulators until its design has been proved to be safe for all of global society.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=410&fit=crop&dpr=1 600w, https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=410&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=410&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=515&fit=crop&dpr=1 754w, https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=515&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/280098/original/file-20190618-118526-1jvkige.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=515&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">You might not want to trust this man.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Mark_Zuckerberg_F8_2018_Keynote_(41118893354).jpg">Anthony Quintano/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Understanding Libra</h2>
<p>Technology companies are interested in a <a href="https://qz.com/1642172/jack-dorsey-on-bitcoin-facebooks-crypto-and-the-end-of-cash/">global currency that is native to the internet</a>. That could allow companies like Facebook and Twitter to bring in more users to their platforms, and <a href="https://www.coindesk.com/nouriel-roubini-says-facebooks-globalcoin-has-nothing-to-do-with-crypto">collect money</a> from <a href="https://techcrunch.com/2019/06/18/facebook-libra/">businesses who want to join</a> the new system. They also want to <a href="https://www.barrons.com/articles/what-facebooks-cryptocurrency-push-means-51560539185">siphon off business from the existing financial services industry</a>. That sector is worth <a href="https://www.investopedia.com/ask/answers/030515/what-percentage-global-economy-comprised-financial-services-sector.asp">trillions of dollars</a>, is enormously profitable, and yet has <a href="http://fortune.com/2018/06/07/blockchain-firm-r3-is-running-out-of-money-sources-say/">struggled to implement its own digital currency</a>.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1140750182373568514&quot;}"></div></p>
<p>The technical details of Facebook’s plans are still emerging, but it seems that the company is not seeking to compete with <a href="https://www.capmktsreg.org/wp-content/uploads/2019/04/CCMR_statement_Blockchain_Securities_Settlement-Final.pdf">Bitcoin</a> or other <a href="https://www.nytimes.com/2019/02/28/technology/cryptocurrency-facebook-telegram.html">cryptocurrencies</a>. Rather, Facebook is looking to replace the existing <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">global financial system</a> with an all-new setup, with Libra at its center.</p>
<p>The company may be counting on increased public interest in <a href="https://www.forbes.com/sites/geraldfenech/2019/01/17/what-is-plaguing-the-cryptocurrency-market/#7e7a0abd4edf">cryptocurrencies and financial technologies</a>, and its market strength, to overcome objections. However, I don’t believe Facebook should be allowed to <a href="https://www.barrons.com/articles/what-facebooks-cryptocurrency-push-means-51560539185">wreck the global financial system</a> the way it has, as many see it, <a href="https://theconversation.com/how-facebook-went-from-friend-to-frenemy-110130">wrecked global communications</a>.</p>
<h2>Speeding global exchange</h2>
<p>There is definitely a need for <a href="https://techcrunch.com/2019/06/18/facebook-libra/">smoother, faster and cheaper</a> ways to <a href="https://www.worldbank.org/en/news/press-release/2019/04/08/record-high-remittances-sent-globally-in-2018">send money around the world</a>, and to provide access to financial services to the many <a href="https://www.theguardian.com/technology/2019/jun/18/facebook-libra-launch-cryptocurrency">people who do not have formal bank accounts</a>. Libra has real potential, but there are likely ways to improve on it and develop a payment system that better serves the world as a whole.</p>
<p>At least at the moment, the Libra is being designed as a form of <a href="https://www.coindesk.com/icelandic-regulators-approve-startups-plan-for-fiat-payments-on-ethereum">electronic money</a> <a href="https://www.ft.com/content/d4c1e00c-8dd6-11e9-a24d-b42f641eca37">linked to many national currencies</a>. That has <a href="https://www.bloomberg.com/news/articles/2019-06-18/france-calls-for-central-bank-review-of-facebook-cryptocurrency">raised fears</a> that Libra might someday be recognized as a sovereign currency, with Facebook acting as a “<a href="https://www.businessinsider.com/libra-pushback-against-facebook-cryptocurrency-begins-2019-6">shadow bank</a>” that could compete with the central banks of countries around the world.</p>
<p>It doesn’t help that Facebook is already positioning itself to evade <a href="https://www.coindesk.com/facebooks-new-crypto-faces-scrutiny-from-european-authorities">regulatory scrutiny</a> by <a href="https://www.theverge.com/2019/6/18/18682838/facebook-digital-wallet-calibra-libra-cryptocurrency-kevin-weil-david-marcus-interview">creating a corporate subsidiary</a> that will join an <a href="https://www.swissinfo.ch/eng/calibra_swiss-role-in-facebook-cryptocurrency-project-revealed/45038626">ostensibly independent governing body</a> for the Libra.</p>
<p>To protect consumers, regulators should look carefully at whether the new system supporting the Libra is sound. It may be that an entirely new set of financial rules and regulations is needed to shield the existing financial system from harm if the Libra becomes more popular than national currencies. At the very least, governments need to proceed slowly and carefully when new products may introduce systemic risks into our environment. Even the <a href="https://www.cnn.com/videos/business/2019/06/14/google-ceo-sundar-pichai-poppy-harlow-zw-orig.cnn">CEO of Google</a> has acknowledged that. In my opinion, Libra’s planned launch in 2020 does not allow enough time to fully vet this technology and its risks.</p>
<h2>Protecting the global financial system</h2>
<p>Financial regulations have developed over time to encourage <a href="https://corpgov.law.harvard.edu/2016/02/07/fincen-know-your-customer-requirements/">trust between unknown parties</a>, and to protect regular customers from fraudsters and corporate greed. There are also rules that help governments prevent and detect <a href="https://www.finra.org/industry/anti-money-laundering">transactions that support crime and terrorism</a>.</p>
<p>This is not to say that all payments and purchases should be tied to a <a href="https://corpgov.law.harvard.edu/2016/02/07/fincen-know-your-customer-requirements/">known entity online or in real life</a>. <a href="https://www.nber.org/papers/w20126">Cash and anonymity are also civil rights</a> and are key to privacy and personal freedoms. </p>
<p>As new digital financial services, methods of electronic payment and currencies develop and become popular, they should not be allowed to undermine longstanding <a href="https://www.jstor.org/stable/24357502">financial safety systems</a>, even in the name of smoother, cheaper transactions. </p>
<p>My concern is not just about large-volume transactions. Facebook has shown how even small amounts of money can buy <a href="https://www.newyorker.com/magazine/2018/09/17/can-mark-zuckerberg-fix-facebook-before-it-breaks-democracy">microtargeted ads</a> with the power to influence public opinion and election outcomes in the U.S. and around the world.</p>
<h2>Product design and risk assessment</h2>
<p>Facebook has a long history of <a href="https://www.cnbc.com/2019/06/13/facebook-investigations-by-eu-ireland-regulator-nearing-conclusions.html">questionable business models and privacy practices</a>. The public, and their representatives in government – including elected officials, financial regulators and <a href="https://www.theverge.com/2019/6/14/18678785/facebook-libra-cryptocurrency-visa-mastercard-uber-paypal-stripe-association-consortium">central bank authorities</a> – should carefully scrutinize all aspects of Facebook’s cryptocurrency plans. </p>
<p>This concern is especially urgent because Facebook also has a long history of launching products and services, like political ads and <a href="https://theconversation.com/livestreamed-massacre-means-its-time-to-shut-down-facebook-live-113830">live-streaming video</a>, without fully considering their potential to damage democracy and the global society at large.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/EgI_KAkSyCw?wmode=transparent&start=56" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Mark Zuckerberg didn’t think enough about how people could use Facebook for ill.</span></figcaption>
</figure>
<p>The company has demonstrated its inability to serve society beneficially – and it <a href="https://doi.org/10.1016/j.telpol.2018.12.003">may not even be interested in trying</a>. All the signals suggest that customers and regulators alike should carefully examine whether Facebook’s Libra is <a href="http://www.people.hbs.edu/rmerton/Financial%20System%20and%20Economic%20Peformance.pdf">truly innovative</a> or just a way to avoid restrictions on a potentially hazardous financial product.</p>
<h2>Defending democracy</h2>
<p>Facebook’s entrance into the financial industry is a threat to democracies and their citizens around the world, on the same scale as disinformation and information warfare, which also depend on social media for their effectiveness.</p>
<p>It may be hard for world leaders to understand that this is an emergency, as they cannot see the virtual powers aligning against them. But they must huddle quickly to <a href="https://www.jstor.org/stable/24357502?seq=1">ensure they have</a> – and keep – the <a href="https://doi.org/10.1016/j.telpol.2018.12.003">power to protect their people</a> from technology companies’ greed.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1140941085885308929&quot;}"></div></p>
<p>It will be key to understand whether Facebook’s future cryptocurrency will ultimately function more like anonymous cash or more like a traceable credit card transaction. Facebook has the blockchain and encryption technology to create an anonymous digital cash-like system, or a private digital currency – <a href="https://techcrunch.com/2019/03/03/regarding-facebooks-cryptocurrency/">something that has not been created yet</a>. Anonymity would heighten the risks of abuse such as money laundering, so it’s worth watching out for a cash-like Facebook cryptocurrency that mirrors the central banks’ cash system.</p>
<p>In addition, I cannot help but reflect on the name that Facebook chose for this, the Libra, which is a <a href="https://www.nytimes.com/2019/06/18/technology/facebook-cryptocurrency-libra.html">reference to the Roman measurement for a pound</a>, once used to mint coins. In many ways the company that Mark Zuckerberg is building is beginning to look more like a Roman Empire, now with its own central bank and currency, than a corporation. The only problem is that this new nation-like platform is a controlled company and is run more like a <a href="https://www.cnbc.com/2018/05/10/mark-zuckerbergs-control-of-facebook-is-like-a-dictatorship-calstrs.html">dictatorship</a> than a sovereign country with democratically elected leaders. Even now, the company may have <a href="https://www.vox.com/the-big-idea/2018/4/9/17214752/zuckerberg-facebook-power-regulation-data-privacy-control-political-theory-data-breach-king">as much power</a> as some countries – and <a href="https://foreignpolicy.com/2016/03/15/these-25-companies-are-more-powerful-than-many-countries-multinational-corporate-wealth-power/">more than others</a>.</p>
<p>In the wake of the not too distant <a href="https://www.economist.com/schools-brief/2013/09/07/crash-course">global financial crisis</a>, and the “fake news” and disinformation culture that is developing, people must slow down and fully evaluate disruptive technology of this magnitude. Society cannot withstand a launch of a cryptocurrency in Facebook’s infamous “<a href="https://hbr.org/2019/01/the-era-of-move-fast-and-break-things-is-over">move fast and break things</a>” style.</p>
<p class="fine-print"><em><span>Jennifer Grygiel owns a small number of shares in the following social media companies: Facebook, Google, Twitter, Alibaba, LinkedIn, YY and Snap. Grygiel also owns nominal amounts of the following cryptocurrencies: Bitcoin, Litecoin and Ethereum.</span></em></p>With the launch of the Libra cryptocurrency, Mark Zuckerberg reveals his dreams of building a new virtual country, perhaps inspired by the Roman Empire.Jennifer Grygiel, Assistant Professor of Communications (Social Media) & Magazine, News and Digital Journalism, Syracuse UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1131442019-03-14T10:38:09Z2019-03-14T10:38:09ZFacebook’s ‘pivot’ is less about privacy and more about profits<figure><img src="https://images.theconversation.com/files/263496/original/file-20190312-86696-1o8mzvk.jpg?ixlib=rb-1.1.0&rect=329%2C30%2C3744%2C2919&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Facebook CEO Mark Zuckerberg is trying to bolster his embattled company.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Facebook-Privacy-Vision/d6cc8112a3b74ebcb23cba1a9c5ba8ce/15/0">AP Photo/Andrew Harnik</a></span></figcaption></figure><p>Facebook’s founder and CEO Mark Zuckerberg’s latest promise is that his social media conglomerate will become a “<a href="https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/">privacy-focused</a>” one. By turns <a href="https://thehill.com/policy/technology/432894-zuckerberg-says-facebook-will-shift-to-privacy-oriented-platform">lauded</a> and <a href="https://www.technologyreview.com/s/613084/zuckerbergs-new-privacy-essay-shows-why-facebook-needs-to-be-broken-up/">lambasted</a>, this move does not quite address users’ primary problems with the company. </p>
<p>His move is the pragmatic shift of a CEO toward where the market is already headed. Ironically, Zuckerberg’s announcement provides more evidence for <a href="https://www.nytimes.com/2019/03/07/opinion/zuckerberg-privacy-facebook.html">Facebook critics</a> who say the company doesn’t understand even the <a href="https://slate.com/technology/2019/03/facebook-privacy-pivot-zuckerberg-messaging-whatsapp-instagram-merging.html">concept of user privacy</a>.</p>
<p>Zuckerberg has chosen an interesting metaphor to describe this change. He claims that people are shifting away from publicly broadcasting their activities and views in a digital “<a href="https://www.nytimes.com/2019/03/06/technology/facebook-privacy-blog.html">town square</a>” and toward discussing issues in a more secure, privacy-protective online “<a href="https://www.nytimes.com/2019/03/06/technology/facebook-privacy-blog.html">living room</a>.” His company already owns platforms representing both venues: Facebook is the town square, the <a href="https://www.statista.com/statistics/398136/us-facebook-user-age-groups/">largest platform</a> for sharing widely, and WhatsApp is the living room, the <a href="https://www.statista.com/statistics/730306/whatsapp-status-dau/">largest platform</a> for sharing in small groups.</p>
<p>As a former partner in McKinsey’s strategy practice and now, as a scholar of strategy at Tufts’ Fletcher School studying the <a href="https://sites.tufts.edu/digitalplanet/">effects of digital technologies in 80 countries</a>, I have been analyzing Facebook’s changing strategies for several years. I see Zuckerberg’s latest move as Strategy 101: a market-driven shift of focus. That, by itself, is welcome. What is not so laudable is trying to package the move as a revolutionary solution to his company’s widespread problems with privacy, with facilitating fake news and with underhanded deals to share user data.</p>
<p>Worse, the changes will be difficult to execute and will not happen soon – or at least not soon enough for many users. </p>
<h2>Driven by market forces</h2>
<p>The writing is already on Zuckerberg’s wall: <a href="https://www.slideshare.net/webby2001/infinite-dial-2019">Users are leaving the town square</a> and filling living rooms. <a href="https://www.wsj.com/articles/four-charts-that-show-why-mark-zuckerberg-is-overhauling-facebook-11552070444">U.S. Facebook users’ activity</a> dropped in late 2018, while WhatsApp and Messenger activity grew. </p>
<p>Zuckerberg is merely preparing to shift resources to follow users from one platform to the other. However, Facebook’s business model dictates that the company cannot make a true pivot away from the town square and toward the living room. As the CEO of a publicly traded company, Zuckerberg has a fiduciary responsibility to shareholders to deliver returns on their investments. The town square makes all the money – and a lot of it. </p>
<p>Despite its troubles, Facebook <a href="https://investor.fb.com/investor-news/press-release-details/2019/Facebook-Reports-Fourth-Quarter-and-Full-Year-2018-Results/default.aspx">made more money in 2018</a> than in previous years, and made more profit too. Most of the company’s billions were generated by the town square version of Facebook.</p>
<p>Zuckerberg hasn’t yet shown a plan for making the WhatsApp living room platform even remotely as profitable. Currently, his company makes <a href="https://www.statista.com/statistics/271258/facebooks-advertising-revenue-worldwide/">98 percent</a> of its revenues from advertisers. It’s had only <a href="https://techcrunch.com/2017/07/11/facebooks-messenger-ads-are-bad-and-must-be-destroyed/">limited success</a> with advertisements in the Messenger app and <a href="https://www.business2community.com/social-media/marketers-get-ready-whatsapp-ads-are-on-the-way-02171780">hasn’t even tested</a> the concept in WhatsApp. </p>
<p>Moreover, the advertising revenue comes from companies that want to <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">target the extensively detailed subgroups</a> of Facebook’s social network users. WhatsApp collects far less data and is encrypted, which means its users are <a href="https://www.wired.com/story/facebook-messenger-whatsapp-instagram-chat-combined-encryption-identity/">harder to target</a> as effectively.</p>
<h2>Gradually shifting, not replacing</h2>
<p>It may take Facebook a very long time to figure out how to make money from its shift to more private messaging.</p>
<p>A plan to <a href="https://www.nytimes.com/2019/01/25/technology/facebook-instagram-whatsapp-messenger.html">integrate the technical infrastructure</a> of WhatsApp and Messenger with Instagram would allow users to seamlessly communicate across three platforms for the first time. More importantly for Zuckerberg, though, it would link billions of <a href="https://www.wired.com/story/facebook-messenger-whatsapp-instagram-chat-combined-encryption-identity/">detailed Facebook accounts with WhatsApp</a> users – opening stockpiles of data to mine for advertisers. </p>
<p>The integration could save the company money by letting it consolidate servers that handle messaging – but there is still risk: It could raise the hackles of regulators. <a href="https://www.theverge.com/2019/3/7/18254717/facebook-instagram-whatsapp-regulation-antitrust-mark-zuckerberg-klobuchar-hawley-blumenthal">Technical consolidation</a> could appear to be a preemptive move against calls to break up Facebook, including by prominent lawmakers, such as <a href="https://www.wsj.com/articles/elizabeth-warren-calls-for-breakup-of-amazon-google-facebook-11552065735">U.S. Sen. Elizabeth Warren</a>.</p>
<p>Another possibility is that Facebook intends to adopt a business model more like China’s <a href="https://www.technologyreview.com/s/608578/can-wechat-thrive-in-the-united-states/">immensely popular WeChat</a>. WeChat makes money from commissions on mobile payments for a whole range of services within the app, including shopping, games, meal deliveries and even utility bills. Facebook is working on a <a href="https://www.cnbc.com/2019/03/11/facebooks-cryptocurrency-could-be-a-19b-revenue-opportunity-barclays-says.html">new cryptocurrency</a> and on <a href="https://www.ft.com/content/e045cdd2-0503-11e9-99df-6183d3002ee1">handling payments through WhatsApp</a>. But those efforts are in early stages – and late in the marketplace. </p>
<p>An essential facet of WeChat’s business model is off-limits to Facebook. WeChat’s parent company is <a href="https://www.scmp.com/news/china/society/article/2156297/how-growing-privacy-fears-china-are-driving-wechat-users-away">widely believed to share user data</a> with the Chinese government, in exchange for regulatory protection. That may be part of the political reality of China, but would doom Facebook in the Western markets it currently dominates.</p>
<h2>A major problem still lingers in the living room</h2>
<p>All of this talk about changing the business model ignores Facebook’s real problem: its role in <a href="https://www.motherjones.com/politics/2019/02/facebook-disinformation-timeline/">spreading disinformation</a> and <a href="https://www.motherjones.com/politics/2019/03/facebook-amazon-smile-fundraising-hate-discrimination/">hate speech</a> in communities around the world. Moving users’ attention away from the town square – which needs to be monitored and moderated – to an encrypted, private living room is not a solution.</p>
<p>Private messaging might even make things worse. WhatsApp is already central to a trend of <a href="https://www.latimes.com/world/la-fg-india-whatsapp-2019-story.html">misinformation, fear and violence in India</a>: Rumors spread from town to town that strangers in the area might be there to kidnap children. <a href="https://www.latimes.com/world/asia/la-fg-india-whatsapp-2018-story.html">More than 20 innocent people</a> have been killed as a result of these terrifying – but false – rumors. WhatsApp has also been implicated in mob violence in <a href="https://qz.com/1223787/sri-lanka-shut-down-facebook-whatsapp-and-instagram-to-stop-anti-muslim-violence/">Sri Lanka</a>, and voter manipulation in <a href="https://www.pbs.org/newshour/science/whatsapp-skewed-brazilian-election-showing-social-medias-danger-to-democracy">Brazil</a> and <a href="https://www.washingtonpost.com/news/monkey-cage/wp/2019/02/15/its-nigerias-first-whatsapp-election-heres-what-were-learning-about-how-fake-news-spreads/">Nigeria</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=387&fit=crop&dpr=1 600w, https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=387&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=387&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=487&fit=crop&dpr=1 754w, https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=487&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/263498/original/file-20190312-86717-1aj1gfg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=487&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Facebook’s policies around user privacy and disinformation are under fire worldwide.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Facebook-Privacy-Scandal/88b6c011d0cc43dd883706087a8fdabc/9/0">AP Photo/Marcio Jose Sanchez</a></span>
</figcaption>
</figure>
<p>The company has <a href="https://thewire.in/tech/whatsapp-india-tracing-fake-news-encryption">claimed it can do little</a> about such falsehoods spreading, because they’re encrypted and sent from user to user, rather than posted more publicly for others to view. Without <a href="https://www.cjr.org/tow_center/whatsapp-doesnt-have-to-break-encryption-to-beat-fake-news.php">addressing the problems of misinformation</a>, shifting more communications to the living room will create more opportunities for fear, havoc and violence. This is especially true in the developing world, where <a href="https://theconversation.com/as-emerging-economies-bring-their-citizens-online-global-trust-in-internet-media-is-changing-95262">users tend to be more trusting of digital media</a> in general. </p>
<p>Zuckerberg also committed not to store data in countries with repressive governments, but that poses fresh problems. Many governments are discussing restrictions on free speech and data sharing, especially in <a href="https://www.statista.com/statistics/268136/top-15-countries-based-on-number-of-facebook-users/">countries where Facebook has some of the highest numbers of users</a>, including India, Brazil, Indonesia, Vietnam, the Philippines and Turkey. Facebook can’t afford to turn its back on these countries and their governments, so this promise, too, sounds a bit hollow.</p>
<p>Zuckerberg’s latest promise is, indeed, one taken from a strategy textbook: Preemptively announce the intent to remake the company without abandoning the core business that funds everything. But it will mean nothing unless Zuckerberg can find a way to genuinely respect the welfare of <a href="https://markets.businessinsider.com/news/stocks/facebook-stock-price-earnings-revenuewall-street-2019-1-1027913555">his 2.7 billion users</a> and improve the quality of social discourse, whether it takes place in town squares or in living rooms.</p>
<p class="fine-print"><em><span>Bhaskar Chakravorti has founded and directs the Institute for Business in the Global Context at Fletcher/Tufts that has received funding from Mastercard, Microsoft, the Gates Foundation and the Onassis Foundation. He is a Non-Resident Senior Fellow at Brookings India and a Senior Advisor on Digital Inclusion at the Mastercard Center for Inclusive Growth.</span></em></p>CEO Mark Zuckerberg’s claimed intent to focus on privacy will be hard to execute, will not happen soon and does not address major concerns about the company’s role in society.Bhaskar Chakravorti, Dean of Global Business, The Fletcher School, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1101302019-01-30T11:51:21Z2019-01-30T11:51:21ZHow Facebook went from friend to frenemy<figure><img src="https://images.theconversation.com/files/255883/original/file-20190128-108334-1b56cil.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2447%2C2205&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How do you feel about Facebook?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/minsk-belarus-february-9-2017-boy-578390869">AlesiaKan/Shutterstock.com</a></span></figcaption></figure><p><em>An updated version of this article was published on Oct. 6, 2021. <a href="https://theconversation.com/facebooks-scandals-and-outage-test-users-frenemy-relationship-169244">Read it here</a>.</em></p>
<p>As Facebook celebrates 15 years of virtual friendship, social science has compiled an <a href="https://doi.org/10.1177/1461444817695745">expansive body of research</a> that documents the public’s love-hate relationship with its best frenemy. </p>
<p>What many once viewed as a confidant has devolved into a messy codependence, mired in ambiguity and <a href="https://www.nbcnews.com/business/consumer/trust-facebook-has-dropped-51-percent-cambridge-analytica-scandal-n867011">mistrust</a>. It’s a relationship that’s taken for granted, yet extremely high-maintenance, leaving users to wonder whether they should just move on with healthier friends.</p>
<p>But it wasn’t always like this. </p>
<p><iframe id="Oq79e" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/Oq79e/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>Friendly beginnings</h2>
<p>At its launch, Facebook was one of the most authentic social networking partners. Existing online networks, like MySpace, had <a href="https://www.businessinsider.com/2009/1/myspace-yes-facebook-kills-our-traffic-but-at-least-we-make-money-nws">influential parent companies</a> that chaperoned their platforms, pestering users with ads and gimmicks. But Facebook promised something different: a genuine connection. It was an unexploited social space to live your best life – well before anyone hashbragged it. </p>
<p>Still today, a friendship with Facebook comes with plenty of perks. Most importantly, it is the friend who brings everyone together. Participating in this community has been shown to <a href="https://doi.org/10.1111/j.1083-6101.2007.00367.x">strengthen relationships</a> between close friends and casual acquaintances. Individuals can bond over community causes, shared identities and amusing videos. Facebook has been credited with helping organize coalitions that <a href="https://www.publicdeliberation.net/jpd/vol8/iss1/art11/">took down dictators</a> and <a href="https://www.nytimes.com/2016/07/28/health/the-ice-bucket-challenge-helped-scientists-discover-a-new-gene-tied-to-als.html">raised millions to fight disease</a>.</p>
<p>Adding to Facebook’s popularity, it lets users carefully curate a public image, <a href="http://doi.org/10.1089/cyber.2009.0411">emphasizing the best parts of their lives</a>. The site has become a central source not only for information about one another, but also the world. Social sharing is up, such that <a href="http://www.journalism.org/2018/09/10/news-use-across-social-media-platforms-2018/">two-thirds of U.S. Facebook users report consuming news on the platform</a>.</p>
<p>Academics friended Facebook, too. I led a study revealing that it is <a href="https://doi.org/10.1177/1461444817695745">the most researched subject</a> in the field of information and communication technology since 2005. This focus has led to advances in understanding <a href="https://doi.org/10.1016/j.ijinfomgt.2014.09.004">online interactions</a>, <a href="http://doi.org/10.1089/cyber.2009.0226">digital activism</a> and <a href="https://doi.org/10.1111/j.1083-6101.2007.00367.x">human psychology</a>. </p>
<h2>Undermining trust</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/255860/original/file-20190128-108370-nl4qrl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Facebook vacuums up users’ data.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/orangec-cartoon-vacuums-bits-isolated-on-80203072">Alexander Limbach/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>But Facebook’s stunning success has now <a href="http://time.com/5505441/mark-zuckerberg-mentor-facebook-downfall/">come at the expense</a> of the privacy of its virtual friends. Its “<a href="https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/">we sell ads</a>” business model <a href="https://www.wsj.com/articles/the-facts-about-facebook-11548374613">may sound benign</a>, but the platform <a href="https://www.nytimes.com/2019/01/25/opinion/mark-zuckerberg-wsj.html">collects more data and information</a> about users than they may consciously <a href="https://www.cnbc.com/2017/11/17/how-to-find-out-what-facebook-knows-about-me.html">know about themselves</a>. </p>
<p>By <a href="https://theconversation.com/how-cambridge-analyticas-facebook-targeting-model-really-worked-according-to-the-person-who-built-it-94078">sharing users’ data</a> and <a href="https://www.nytimes.com/2019/01/25/opinion/mark-zuckerberg-wsj.html">enabling disinformation campaigns</a> and election interference, Facebook has revealed its allegiances – and they don’t involve protecting users. Carelessness with user data, or what increasingly looks like intentional abuse of it, has made it <a href="https://theconversation.com/dont-quit-facebook-but-dont-trust-it-either-93776">difficult to trust the platform</a> with people’s most intimate relationships.</p>
<p>These scandals have consequences. Research finds that users can be <a href="https://doi.org/10.1073/pnas.1320040111">emotionally manipulated</a> by changes to Facebook’s algorithm. This has made the public more <a href="https://doi.org/10.1371/journal.pone.0159641">politically polarized</a> and <a href="http://doi.org/10.1177/1077699016630255">less likely to share minority views</a> – implications that may derail democracy. </p>
<p>Algorithms that foster day-to-day social comparison have also taken a toll on mental health. Recent research convincingly shows that Facebook use <a href="https://doi.org/10.1371/journal.pone.0069841">dampens individuals’ happiness</a> – both immediately and over the long term. Using Facebook has been linked to depression and so many other negative psychological outcomes that it inspired a <a href="https://doi.org/10.1016/j.chb.2018.02.009">summary report</a> of 56 studies on the topic.</p>
<h2>Frenemies for now</h2>
<p>Despite widespread calls to #DeleteFacebook in 2018, most users have maintained their profiles. Why? Because abstaining from Facebook means giving up a network that has social currency and value. The site boasts <a href="https://www.aljazeera.com/news/2018/04/number-active-facebook-users-increased-scandals-180426073628185.html">2.2 billion users</a>, nearly 30 percent of the global population. <a href="https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/">As members of Congress recently pointed out</a>, Facebook has few market competitors, meaning it serves as a primary, if not the only, way for large groups to connect. It holds users together (or sometimes hostage) by maintaining relationships with all their friends. </p>
<p><iframe id="QXA7K" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/QXA7K/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>For those who prefer Instagram or WhatsApp, know that Facebook owns those too, and is working to <a href="https://www.nytimes.com/2019/01/25/technology/facebook-instagram-whatsapp-messenger.html">consolidate the technology behind them</a>. Even people with the willpower to de-friend Facebook will <a href="https://gizmodo.com/i-cut-facebook-out-of-my-life-surprisingly-i-missed-i-1830565456">still find their data swept up</a> in content that others add to the platform and its affiliates. It’s nearly impossible to escape Facebook’s orbit. </p>
<p>In advance of its anniversary this month, Facebook attempted to restore fond memories by encouraging users to reminisce with the <a href="https://www.wired.com/story/facebook-10-year-meme-challenge/">#10YearChallenge</a>. The award for biggest transformation goes to Facebook itself – from altruistic friend to cagey frenemy. </p>
<p>Recapturing the public’s trust will require significant changes. Options for unaltered news feeds, transparent advertising, and user control of data and metadata would be good places to start. But currently, it’s unclear whether Facebook will make these changes to salvage its billions of friendships. </p>
<p>In the meantime, most of Facebook’s friends are <a href="http://www.pewresearch.org/fact-tank/2018/09/05/americans-are-changing-their-relationship-with-facebook/">updating their privacy settings</a> and just trying to coexist.</p>
<p class="fine-print"><em><span>Elizabeth Stoycheff has received grant funding from WhatsApp, an affiliate of Facebook. </span></em></p>Facebook users no longer see the site as a confidant. They’re struggling with how to deal with a messy codependence – and whether to just break up and move on with healthier friends.Elizabeth Stoycheff, Associate Professor of Communication, Wayne State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1084542019-01-11T11:47:02Z2019-01-11T11:47:02Z3 ways to be smart on social media<figure><img src="https://images.theconversation.com/files/253270/original/file-20190110-43532-7e4oq7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A smarter use of social media can improve your sense of well-being.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/social-media-networking-technology-innovation-concept-381680653?src=Fn-hLvJP8kg4pqBrfDFvKw-1-61">Rawpixel.com/Shutterstock.com</a></span></figcaption></figure><p>This past year, many people <a href="https://www.usatoday.com/story/news/2018/12/19/these-people-deleted-their-facebook-accounts-and-have-no-regrets/2365191002/">deleted</a> their social media accounts following revelations about <a href="https://thehill.com/opinion/technology/417049-hate-speech-fake-news-privacy-violations-time-to-rein-in-social-media">privacy violations on social media platforms</a> and other concerns related to hate speech. </p>
<p>As people adopt their resolutions for the year, it is likely that many more will reconsider their social media use. </p>
<p>However, as a <a href="https://scholar.google.com/citations?user=krMEDisAAAAJ&hl=en">scholar</a> of social media and religion, I’d argue that rather than just stop using social media, people could use it to improve their overall well-being. Here are three ways to do so.</p>
<h2>1. Be active</h2>
<p>Studies have shown that there is a big difference between passive social media use and active use. Scrolling through a newsfeed and merely looking at what others have posted is considered passive social media use. </p>
<p>Conversely, commenting on posts, sharing articles and creating posts constitute active social media use. Research has found that actively using social networking sites can contribute to <a href="http://psycnet.apa.org/doi/10.1111/sipr.12033">feelings of social connectedness</a>, which in turn supports a sense of overall well-being.</p>
<p>On the other hand, a study found that passive Facebook use <a href="https://doi.org/10.1037/xge0000057">increases feelings of envy</a>. Researchers asked participants to sit in a laboratory and use Facebook passively, only browsing without commenting, sharing or liking content. These participants reported heightened feelings of envy afterward. </p>
<h2>2. Focus on meaningful engagement</h2>
<p>Social media sites allow users to engage in various types of communication. There are impersonal forms of communication such as the single click “Like” button and more personal forms of communication such as direct messaging and comments. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/253271/original/file-20190110-43507-15mnrp0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Direct messaging and comments can help with a deeper level of engagement.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/alushta-russia-october-20-2016-woman-572264506?src=D9qEng1ZkoMRbn3Vw7tSnA-1-9">Denys Prykhodov/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Research has found that direct communication on Facebook can have a positive <a href="https://doi.org/10.1111/jcc4.12162">psychological impact on individuals</a>. A direct message can often lead to feelings of social support and encouragement. It has been found to be <a href="http://dx.doi.org/10.7903/ijecs.1391">particularly helpful</a> when people already share a connection. Direct messaging and personalized comments can provide a deeper level of engagement.</p>
<p>One of these studies showed that commenting on a post, instead of pressing the like button, could improve the mood of the person who made the original post. In one such example, a respondent in the study described how personalized comments, even trivial ones about funny cat videos, can <a href="https://doi.org/10.1111/jcc4.12162">result in feelings of support</a>. </p>
<p>Similarly, research has shown that social networking sites can <a href="https://doi.org/10.1145/2441776.2441936">provide social support</a> to those who have recently lost a job. </p>
<h2>3. Use social media for professional purposes</h2>
<p>According to researchers in Germany, <a href="https://www.iwm-tuebingen.de/www/personen/ma.html?uid=sutz">Sonja Utz</a> and <a href="https://www.uni-muenster.de/Kowi/en/personen/johannes-breuer.html">Johannes Breuer</a>, using social networking sites for professional purposes can result in “informational benefits” such as knowing what is happening in one’s field and developing professional connections.</p>
<p>For example, these scholars found that people who use social networking sites for professional purposes report having greater access to information about timely innovations in their field than nonusers. A <a href="https://doi.org/10.1080/0309877X.2015.1014321">similar study</a> of academics in the United Kingdom found that 70 percent of participants had gained valuable professional information through Twitter. </p>
<p>Researchers, however, have found that these professional benefits <a href="http://dx.doi.org/10.5817/CP2016-4-3">require active use</a> of social networking sites. “Frequent skimming of posts,” as Utz and Breuer explain, can lead to “short time benefits.” What is more important, however, are “active contributions to work-related discussions.” </p>
<p>Indeed, there are those who <a href="https://doi.org/10.1093/aje/kww189">recommend curtailing use of social media</a> and focusing instead on real-world relationships. But, as with everything else, moderation is vital.</p>
<p class="fine-print"><em><span>A. Trevor Sutton does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Don’t swear off social media. Use it to your advantage.A. Trevor Sutton, Ph.D. Student in Doctrinal Theology, Concordia SeminaryLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/939062018-05-01T22:15:35Z2018-05-01T22:15:35ZWhy do we stay on Facebook? It’s complicated<figure><img src="https://images.theconversation.com/files/216893/original/file-20180430-135851-10of5zx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A protester wears a mask with the face of Facebook founder Mark Zuckerberg, in between men wearing angry face emoji masks, during a protest against Facebook in London in April 2018. </span> <span class="attribution"><span class="source">(AP Photo/Alastair Grant)</span></span></figcaption></figure><p>Lately I can’t log onto Facebook without being asked to consider my own motivations for using the site. </p>
<p>As a researcher who focuses on online communities, I’m accustomed to this running meta-narrative about what it is I’m actually doing online — but usually, that narrative plays inside my head, not all the way down the feed I’m scrolling through. It’s like my research questions have sprung to life these days: What’s Facebook all about, anyway? Is this even fun? If it’s not fun … what is it, exactly? </p>
<p>This is an exciting time in the very short history of social media use. </p>
<p>Facebook’s users are becoming critical of the systems into which they’ve been conscripted. This is an important moment: Will public opinion follow the same well-worn cycle of outrage and acceptance, or will it jump the tracks and begin engaging Facebook on new, more challenging terms? </p>
<p>Researchers have been asking tough questions <a href="https://www.researchgate.net/publication/315063279_What_have_we_learned_about_social_media_by_studying_Facebook_A_decade_in_review">about Facebook for the past decade,</a> but even armed with the most prestigious credentials, they pose a much smaller threat than educated consumers. And without consumer outrage, government regulation seems unlikely to move forward.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-not-nationalize-facebook-93816">Why not nationalize Facebook?</a>
</strong>
</em>
</p>
<hr>
<h2>‘Sound and fury’</h2>
<p>So far, at least in my own feed, the same old script is being followed to the letter. The soul-searching is punctuated by passionate cris de coeur from the feed’s more opinionated characters: Wake up, sheeple! If you’re not paying for the product, you are the product — remember? <a href="https://theconversation.com/why-we-should-all-cut-the-facebook-cord-or-should-we-93929">Quit Facebook! Encrypt your data!</a> Smash your phone under the heel of your steel-toed boots!</p>
<p>Next, right on cue, the incisive social commentators swoop in to remind us that these calls are coming from inside the house. “Pretty ironic that you’re posting all this stuff on Facebook!” To which everyone silently rolls their eyes in resignation. Cue the gallows humor about how we’re all under constant surveillance, rinse and repeat. The human condition’s same old two-step. Sound and fury, signifying nothing. </p>
<p>That this discursive cycle was triggered by the revelations earlier this year that voter profiling company <a href="https://www.theglobeandmail.com/world/article-what-is-cambridge-analytica-and-what-did-it-do-a-guide/">Cambridge Analytica obtained the Facebook data</a> of 50 million American accounts is beside the point. </p>
<p>This is only the latest in a long series of such leaks about data mining. In 2017, approximately 200 million registered voters’ personal data stored by voter profiling company Deep Root Analytics was <a href="https://mashable.com/2017/06/19/200m-voters-exposed-gop-data-leak/#uFUmUS3_i8qD">accidentally made public</a>. The previous year, <a href="https://www.wired.com/2016/07/heres-know-russia-dnc-hack/">Russian hackers accessed</a> a large cache of voter information owned by the Democratic National Committee.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/216933/original/file-20180430-135837-csmtg4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Facebook CEO Mark Zuckerberg testifies on Capitol Hill on April 11, 2018 about the use of Facebook data to target American voters in the 2016 election and data privacy.</span>
<span class="attribution"><span class="source">(AP Photo/Andrew Harnik)</span></span>
</figcaption>
</figure>
<p>What this latest go-round is revealing is that these are industry practices that will carry on undisturbed, regardless of what Mark Zuckerberg <a href="http://www.cbc.ca/news/technology/facebook-zuckerberg-congress-election-1.4612495">says or does</a>. This is not a Zuckerberg problem anymore; it’s a problem with an advertising model that is the industry standard.</p>
<p>Most of us Facebook users have been on the platform for about a decade, and perhaps our outrage is our growing pains. </p>
<p>We’ve gained some critical distance through time spent on the platform. We are less easily distracted by the ostensible fun the platform offers. And we appear to be compelled to ask questions about Facebook we’ve never asked before. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-time-we-demanded-the-protection-of-our-personal-data-94960">It's time we demanded the protection of our personal data</a>
</strong>
</em>
</p>
<hr>
<h2>Must ask different questions</h2>
<p><a href="https://www.concordia.ca/artsci/coms/faculty.html?fpid=fenwick-mckelvey">Fenwick McKelvey,</a> co-director of the Media History Research Centre at Concordia University’s <a href="https://milieux.concordia.ca">Milieux Institute for Art, Culture and Technology</a>, wishes that the media would start asking different questions about how data is being used by platforms like Facebook. </p>
<p>“The media narrative still assumes that the goal of these platforms (like Facebook) is to expose people to information,” McKelvey told me. “But it’s less and less about that — the goal is to manage and control people’s behaviour.” </p>
<p>Among the urgent questions media commentators should be asking, McKelvey believes, is how online advertisers are deploying user data to subtly nudge people. He offers the illustrative example of Snapchat — <a href="https://www.snap.com/en-US/privacy/privacy-policy/">a company with relatively strong privacy settings in place</a> — which nonetheless leaks data to advertisers with a dizzying granularity that reflects the industry standard. </p>
<p>Through Snapchat’s protocols, your phone informs advertisers how much time passes between the moment you’re served one of their ads and the moment you make a purchase at their business, either online or in person. </p>
<p>Every time you walk into a retailer with your phone’s location services on, you are leaking data about your consumption habits. </p>
<p>Perhaps we should be burrowing even deeper into Facebook’s business practices. </p>
<p>Facebook tends to rely on the fact that most of its data collection practices are laid bare in its terms of service. But according to <a href="https://www.concordia.ca/research/lifestyle-addiction/about/team/french.html">Martin French, an assistant professor of sociology at Concordia,</a> Facebook’s notion of “consent” is flimsy at best.</p>
<h2>Most unaware of how their data is being used</h2>
<p>“Facebook reportedly changed its policies after 2015 to stop app developers accessing information on app users’ network. But for me the question is: Are Facebook users, in the real world, actually aware of the changing ways their data is being used, and the policies that purportedly govern these uses?” wonders French. </p>
<p>French posits that <a href="https://conferences.sigcomm.org/imc/2011/docs/p61.pdf">based on research</a> that has been <a href="http://maritzajohnson.com/publications/2012-sesoc.pdf">done on who reads and understands social media privacy policies,</a> most users are unaware of how their data is actually being used. The “consent” Facebook invokes when it refers to an agreement with its users does not conform to any dictionary definition of the term. </p>
<p>The consensus among social scientists who study life online is that whatever dynamics play out online have offline analogs. </p>
<p>We’ve had a decade to incorporate Facebook into our lives, and like any learning process, our success with it has been uneven. </p>
<p>We’re at a critical moment as users of Facebook. It’s our responsibility to educate ourselves about the implications of our participation. Deactivating our accounts won’t change how our personal data is valued by advertisers.</p>
<p>But perhaps, as we become mature users of social media, we can begin to demand that limits be set on how and when our data is bought and sold.</p>
<p class="fine-print"><em><span>Kathryn Jezer-Morton does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>We’re at a critical moment as users of Facebook. It’s our responsibility to educate ourselves about how our data is bought and sold.Kathryn Jezer-Morton, Doctoral student , Concordia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/950302018-04-27T10:44:32Z2018-04-27T10:44:32ZThe internet is designed for corporations, not people<figure><img src="https://images.theconversation.com/files/216545/original/file-20180426-175035-vyoh03.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Conversations on Facebook ethics are part of a bigger conversation about information architecture.</span> <span class="attribution"><span class="source">AP Photo/Alastair Grant</span></span></figcaption></figure><p>Urban spaces are often <a href="https://www.theatlantic.com/business/archive/2014/06/how-cities-use-design-to-drive-homeless-people-away/373067/">designed</a> to be subtly hostile to certain uses. Think about, for example, the seat partitions on bus terminal benches that make it harder for the homeless to sleep there or the decorative leaves on railings in front of office buildings and on university campuses that serve to make skateboarding dangerous. </p>
<p>Scholars call this <a href="https://www.theatlantic.com/business/archive/2014/06/how-cities-use-design-to-drive-homeless-people-away/373067/">“hostile urban architecture.”</a> </p>
<p>When a few weeks ago, news broke that <a href="https://www.nytimes.com/2018/04/04/technology/mark-zuckerberg-testify-congress.html">Facebook shared millions of users’ private information</a> with Cambridge Analytica, which then used it for political purposes, I saw the parallels. </p>
<p>As a <a href="https://scholar.google.com/citations?user=ZiL1i4kAAAAJ&hl=en&oi=ao">scholar</a> of the social and political implications of technology, I would argue the internet is designed to be hostile to the people who use it. I call it a “hostile information architecture.” </p>
<h2>The depth of the privacy problem</h2>
<p>Let’s start with Facebook and privacy. Sites like Facebook <a href="https://theconversation.com/fragmented-us-privacy-rules-leave-large-data-loopholes-for-facebook-and-others-94606">supposedly protect user privacy</a> with a practice called “notice and consent.” This practice is the business model of the internet. Sites fund their “free” services by <a href="https://www.cnn.com/2018/03/26/opinions/data-company-spying-opinion-schneier/index.html">collecting information</a> about users and <a href="https://www.nytimes.com/2018/03/19/opinion/facebook-cambridge-analytica.html">selling that information</a> to others. </p>
<p>Of course, these sites present privacy policies to users to notify them how their information will be used. They ask users to “click here to accept” them. The problem is that these policies are <a href="https://theconversation.com/nobody-reads-privacy-policies-heres-how-to-fix-that-81932">nearly impossible to understand</a>. As a result, no one knows what they have consented to. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/216550/original/file-20180426-175047-oc20oj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Users are also unable to protect themselves, as opting out of sites like Facebook and Google isn’t viable for most.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/privacy-settings-web-page-computer-screen-308763962?src=2f2sgR6d21LV5AkGj81wMQ-1-57">David M G/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>But that’s not all. The problem runs deeper than that. Legal scholar <a href="https://its.law.nyu.edu/facultyprofiles/index.cfm?fuseaction=profile.overview&personid=28509">Katherine Strandburg</a> has <a href="https://chicagounbound.uchicago.edu/uclf/vol2013/iss1/5/">pointed out</a> that the entire metaphor of a market where consumers trade privacy for services is deeply flawed. It is advertisers, not users, who are Facebook’s real customers. Users have no idea what they are “paying” and have no possible way of knowing the value of their information. Users are also unable to protect themselves, as opting out of sites like Facebook and Google isn’t viable for most. </p>
<p>As I have <a href="https://ssrn.com/abstract=2533057">argued in an academic journal</a>, the main thing notice and consent does is subtly communicate to users the idea that their privacy is a commodity that they trade for services. It certainly does not protect their privacy. It also hurts innocent people. </p>
<p>It’s not just that most of those whose data made it to Cambridge Analytica did not consent to that transfer, but it’s also the case that Facebook has vast troves of data even on those who <a href="https://www.aclu.org/blog/privacy-technology/internet-privacy/facebook-tracking-me-even-though-im-not-facebook">refuse to use</a> its services. </p>
<p>Not unrelated, news broke recently that thousands of Google Play apps – probably illegally – <a href="http://blogs.edweek.org/edweek/DigitalEducation/2018/04/android_mobile_apps_track_children_study.html">track children</a>. We can expect stories like this to surface again and again. The truth is there is too much money in personal information. </p>
<h2>Facebook’s hostile information architecture</h2>
<p>Facebook’s privacy problem is both a symptom of its hostile information architecture and an excellent example of it. </p>
<p>Several years ago, two of my colleagues, <a href="http://www.celinelatulipe.com/">Celine Latulipe</a> and <a href="https://webpages.uncc.edu/richter/">Heather Lipford</a> and I published <a href="https://ssrn.com/abstract=1427546">an article</a> in which we argued that many of Facebook’s privacy issues were problems of design. </p>
<p>Our argument was that these design elements violated ordinary people’s expectations of how information about them would travel. For example, Facebook allowed apps to collect information on users’ friends (this is why the Cambridge Analytica problem impacted so many people). But no one who signed up for, say, tennis lessons would think that the tennis club should have access to personal information about their friends. </p>
<p>The details have changed since then, but they aren’t better. Facebook still makes it very hard for you to control how much data it gets about you. Everything about the Facebook experience is very carefully curated. Users who don’t like it have little choice, as the site has a virtual monopoly on social networking. </p>
<h2>The internet’s hostile architecture</h2>
<p><a href="http://www.lessig.org/about/">Lawrence Lessig</a>, one of the leading legal scholars of the internet, <a href="http://codev2.cc/">wrote a pioneering book</a> that discussed the similarities between architecture in physical space and things like interfaces online. Both can regulate what you do in a place, as anyone who has tried to access content behind a “paywall” immediately understands.</p>
<p>In the present context, the idea that the internet is at least somewhat of a public space where one can meet friends, listen to music, go shopping, and get news is a complete myth. </p>
<p>Unless you make money by trafficking in user data, internet architecture is hostile from top to bottom. That the business model of companies like Facebook is based on targeted advertising is only part of the story. Here are some other examples of how the internet is designed by and for companies, not the public.</p>
<p>Consider first that the internet in the U.S. isn’t actually, in any legal sense, a public space. The hardware is all owned by telecom companies, and they have <a href="https://arstechnica.com/tech-policy/2014/02/isp-lobby-has-already-won-limits-on-public-broadband-in-20-states/">successfully lobbied</a> 20 state legislatures to ban efforts by cities to build out public broadband. </p>
<p>The Federal Communications Commission has recently declared its intention to undo Obama-era <a href="http://theconversation.com/understanding-net-neutrality-10-essential-reads-71848">net neutrality</a> rules. The rollback, which treats the internet as a <a href="http://ssrn.com/abstract=2117497">vehicle for delivering paid content</a>, would allow ISPs like the telecom companies to deliver their own content, or paid content, faster than (or instead of) everyone else’s. So advertising could come faster, and your blog about free speech could take a very long time to load. </p>
<p>Copyright law gives sites like YouTube very strong legal incentives to <a href="https://ssrn.com/abstract=1577785">unilaterally and automatically, without user consent, take down</a> material that someone says is infringing, and very few incentives to restore it, even if it is legitimate. These takedown provisions include content that would be protected free speech in other contexts; the campaigns of both Barack Obama and John McCain had material removed from their YouTube channels in the weeks prior to the 2008 elections. </p>
<p>Federal requirements that content-filtering software be installed in public libraries that receive federal funding <a href="https://ssrn.com/abstract=1288090">regulate</a> the only internet the poor can access. These privately produced programs are designed to block access to pornography, but they tend to sweep up other material, particularly if it is about LGBTQ+ issues. Worse, the companies that make these programs are under no obligation to disclose how or what their software blocks.</p>
<p>In short, the internet has enough seat dividers and decorative leaves to be a hostile architecture. This time, though, it’s a hostile information architecture.</p>
<h2>A broader conversation</h2>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=370&fit=crop&dpr=1 600w, https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=370&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=370&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=465&fit=crop&dpr=1 754w, https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=465&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/216553/original/file-20180426-175074-k4z9d8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=465&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">What should be included in today’s conversations about Facebook?</span>
<span class="attribution"><span class="source">AP Photo/Carolyn Kaster</span></span>
</figcaption>
</figure>
<p>So let’s do have a conversation about Facebook. But let’s make that part of a bigger conversation about information architecture, and how much of it should be ceded to corporate interests. </p>
<p>As the celebrated urban theorist and activist <a href="https://www.pps.org/article/jjacobs-2">Jane Jacobs</a> <a href="https://books.google.com/books/about/The_Death_and_Life_of_Great_American_Cit.html?id=P_bPTgOoBYkC">famously wrote</a>, the best public spaces involve lots of side streets and unplanned interactions. Our current information architecture, like our heavily surveilled urban architecture, is going in the opposite direction.</p>
<p class="fine-print"><em><span>Gordon Hull does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>An expert explains how Facebook’s privacy issues are linked to a bigger problem – a ‘hostile information architecture,’ largely controlled by corporate interests.Gordon Hull, Associate Professor of Philosophy, Director of Center for Professional and Applied Ethics, University of North Carolina – CharlotteLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/947772018-04-12T11:23:11Z2018-04-12T11:23:11ZFacebook is now a vital part of our democracy<p>In the wake of the scandal over the misuse of user data by <a href="https://theconversation.com/uk/topics/cambridge-analytica-51337">Cambridge Analytica</a>, Facebook has been branded a threat to democracy. But would democracy actually function better without the social networks upon which we’ve come to rely?</p>
<p>Facebook and other social media platforms play a major role in our political life. Many citizens receive their daily news from social media, while circulation numbers of newspapers and audience shares of news broadcasts are in <a href="http://www.digitalnewsreport.org/interactive-2017/">decline</a>. This is especially true for young citizens. They live in a world in which news is not something they actively have to seek out. Instead, they rely on news finding them, through <a href="http://journalismresearchnews.org/article-news-will-find-might-not-make-wiser/">push mechanisms</a> on social media platforms.</p>
<p>Social network sites are also an important space for political discussions and movements such as <a href="https://theconversation.com/march-for-our-lives-awakens-the-spirit-of-student-and-media-activism-of-the-1960s-93713">#MeToo or #MarchForOurLives</a>. Importantly, political actors use social media as well as citizens. Barack Obama was among the first politicians to embrace it to contact voters directly ahead of his election as US president in 2008 and it has since become the cornerstone of every serious campaign strategy.</p>
<p>In a <a href="http://journals.sagepub.com/doi/10.1177/1461444817745017">study</a> conducted at the <a href="https://twitter.com/cfjsdu">Centre for Journalism</a> at the University of Southern Denmark during the latest Danish national election in 2015, my colleagues and I found that social media played a major role in <a href="https://videnskab.dk/kultur-samfund/valgkamp-pa-sociale-medier-partierne-slar-nyhedsmedierne">informing citizens</a>. Unsurprisingly, 97% of Danish members of parliament were maintaining a <a href="http://journals.sagepub.com/doi/abs/10.1177/0163443715620924">Facebook profile</a> during the campaign. We found that first-time voters using social media encountered information shared directly by political actors as often as they did from the pages of more traditional news media.</p>
<h2>Users engage</h2>
<p>People who used social media as a source of campaign news during election times were more <a href="http://journals.sagepub.com/doi/10.1177/1461444817745017">actively engaged</a> in the campaign. They were more likely to attend a political rally, use a vote advice application or discuss politics with their peers. Our study suggests that first-time voters’ social media use increased their campaign participation and also helped them to become more certain about their vote – which is an important driver for voters actually turning out on election day.</p>
<p>Other <a href="https://www.tandfonline.com/doi/abs/10.1080/1369118X.2015.1008542">studies</a> have shown that using platforms such as Facebook, YouTube or Twitter can help citizens become more politically active. That’s particularly the case for Generation Z and millennials, whose political socialisation is tightly connected to the existence of social media networks. Their protest is organised on Facebook, their pictures of a zero-waste lifestyle are shared on <a href="https://www.instagram.com/explore/tags/zerowaste/">Instagram</a> – and if they ever contact a politician, they use Twitter to do it.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=408&fit=crop&dpr=1 600w, https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=408&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=408&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=512&fit=crop&dpr=1 754w, https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=512&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/214333/original/file-20180411-570-15lxzl8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=512&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Have you deleted yet?</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>So campaigns such as <a href="https://www.theguardian.com/commentisfree/2018/mar/27/pioneer-delete-facebook-addiction-social-life">#deletefacebook</a> are potentially creating a political information vacuum that we don’t know how to fill. We could hope for the clock to turn back to when single news outlets had greater importance in informing us. But is it realistic to think that generations who grew up with social media will find their way back to more cumbersome ways of seeking news once they get rid of Facebook? Or will they instead just tune out?</p>
<h2>What’s the alternative?</h2>
<p>Political micro-targeting is a powerful – and therefore dangerous – tool. It is extremely worrying that ads have been placed in users’ Facebook newsfeeds based on psychometric data that companies hold about them, with the aim of influencing their vote decision. Identifying the full scale of this misuse is of course a pressing concern. </p>
<p>But there are signs that the network will learn from its mistakes. Testifying before the US Congress, Facebook founder Mark Zuckerberg appeared rather more introspective than when he declared, in 2016, that the suggestion Facebook influenced the election outcome was a “<a href="https://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-us-election-mark-zuckerberg-donald-trump">pretty crazy idea</a>”.</p>
<p>His comments on Facebook’s openness to political regulations and his sympathy for the EU’s new General Data Protection Regulation <a href="https://techcrunch.com/2018/04/04/zuckerberg-gdpr/">(GDPR)</a> show a growing understanding of the role the network plays in the stability of a political system. Chief operating officer Sheryl Sandberg’s recent <a href="http://time.com/5230506/facebook-pay-ad-free/">remarks</a> about a paid version of Facebook also address the recent criticism of <a href="https://www.theguardian.com/technology/2018/mar/31/big-data-lie-exposed-simply-blaming-facebook-wont-fix-reclaim-private-information">data-capitalism</a> as social media’s revenue model. </p>
<p>But the self-healing powers of big corporations are limited. This willingness to improve needs to be judged by deeds. If a better Facebook is to become a reality, it will have to involve working with legislative authorities and others, including scientists, to give voice to users’ rights and needs.</p>
<p>Social media is still relatively young. Facebook and other platforms need a chance to learn from mistakes. Deleting Facebook from our lives may have more serious consequences for democracy than personalised political advertising.</p>
<p class="fine-print"><em><span>Jakob Ohme does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If you want us to delete social media, you need to fill the political news vacuum it creates.Jakob Ohme, Assistant Professor at the Centre for Journalism, University of Southern DenmarkLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/947362018-04-11T01:29:06Z2018-04-11T01:29:06ZMark Zuckerberg’s Facebook apology is the linguistic equivalent of ‘shit happens’<p>A corporate apology echoes the words we are so familiar with from our everyday lives – but it is a distinct beast. It happens under the glare of media and is issued by an office holder in a complex management structure, to a mass and impersonalised audience. </p>
<p>And its contents may be subject to legal proceedings. It may also be couched in words which create the veneer of an apology without a detailed admission of guilt. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-the-business-model-of-social-media-giants-like-facebook-is-incompatible-with-human-rights-94016">Why the business model of social media giants like Facebook is incompatible with human rights</a>
</strong>
</em>
</p>
<hr>
<p>This week, two high profile CEOs have issued public apologies on behalf of their corporations. Facebook CEO Mark Zuckerberg <a href="https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/?utm_term=.630cbe748f9d">apologised at a US Congress hearing</a> for failing to protect the personal data of millions of users in the Cambridge Analytica scandal.</p>
<p>The Commonwealth Bank of Australia’s newly ascended CEO, Matt Comyn, started his first day in the job with <a href="http://www.abc.net.au/news/2018-04-09/cba-commonwealth-bank-new-boss-sorry-for-past-failures-on-day-1/9633300">an internal email</a> apologising to the bank’s employees, and taking responsibility for the bank’s “mistakes”. </p>
<p>Meaning is a complex process, and not at the beck and call of individuals. It depends not only on what we say, but what we don’t say, and what we do or don’t do.</p>
<p>It also depends on who we are in the scheme of things. Corporate CEOs are required by law to act in the best interests of shareholders. </p>
<p>So a corporate apology is always connected to the benefits it brings to the company. It is not a personal apology, it is a form of institutional positioning.</p>
<h2>Say it like you mean it</h2>
<p><a href="https://www-sciencedirect-com.simsrad.net.ocs.mq.edu.au/science/article/pii/S0749597815000540?via%3Dihub">One small study of corporate apologies</a> focused on the relationship of facial expressions used during an apology to reactions from share markets. Using a sample of 29 corporate apologies, two researchers carefully analysed the minute muscle movements of the apologisers. </p>
<p>Apologies accompanied by the display of positive or neutral emotion were associated with decreased investor confidence (as expressed by negative stock market returns). The effects persisted up to three months after the apology. </p>
<p>This research provides some tips for corporate CEOs: make sure your emotional display shows sufficient remorse for your actions. Otherwise you and your company may have a price to pay.</p>
<p>So how about the Zuckerberg and Comyn apologies? Cleverly, both frame their statements in terms of a lack of action. Zuckerberg said “But it’s clear now that we didn’t do enough”, while CBA’s Matt Comyn told his employees “We have not done enough to protect our customers”. </p>
<p>But like that glass that can be either half-full or half-empty, this is just a linguistic trick. Facebook’s cover-up of the theft of data by Cambridge Analytica can be construed either as a failure to act or as a form of action. Ditto for the CBA.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/does-mark-zuckerberg-have-too-much-power-at-the-helm-of-facebook-94003">Does Mark Zuckerberg have too much power at the helm of Facebook?</a>
</strong>
</em>
</p>
<hr>
<p>Construing your failure as a lack of action affords an important rhetorical benefit: it means you don’t have to lay bare the details of what you have done. It allows you to apologise in vague and general terms, protecting yourself and your shareholders from the brutal details of your company’s transgressions. </p>
<p>So engaging in money laundering or funding terrorist organisations can just be “mistakes we made”. Generic apologies lack an essential part of the definition of an apology, the frank acknowledgement of the offence. </p>
<p>And upping the ante with more statements about taking responsibility, as Zuckerberg has just done before Congress, doesn’t fill this gap:</p>
<blockquote>
<p>…I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here. </p>
</blockquote>
<p>In fact, Zuckerberg is using a lovely linguistic trick, a grammatical option called “middle voice” which you shouldn’t fall for. In the grammar of middle voice, an event is construed as if it happens under its own steam. No-one has responsibility for it taking place.</p>
<p>Hypothetically, imagine he said “I’m responsible because I didn’t disclose the company’s complicity in the theft of people’s private data”. This is a frank acknowledgement. </p>
<p>Instead, Zuckerberg says he’s responsible for “what happens”. But “what happens”, like the expression “shit happens”, makes it seem as if things happened without anyone, Zuckerberg included, actually doing anything.</p>
<h2>What follows an apology</h2>
<p>Zuckerberg’s apology, <a href="https://www.wired.com/story/why-zuckerberg-15-year-apology-tour-hasnt-fixed-facebook/">one of many he’s made</a>, has more in common with the ancient Greek word <em>apologia</em> from which our word apology descends. An <em>apologia</em> was a speech in defence of one’s actions. </p>
<p>Zuckerberg is busy trying to rescue Facebook’s reputation by announcing actions the company will now take. </p>
<p>But his apology already has a stench about it. <a href="https://newsroom.fb.com/news/2018/04/new-elections-initiative/">Zuckerberg is commissioning</a> “independent research” on the role of social media in elections, as well as democracy more generally. The team to oversee the research includes a number of billionaires’ foundations, including the Charles Koch foundation. <a href="https://www.rollingstone.com/politics/news/inside-the-koch-brothers-toxic-empire-20140924">The Koch brothers</a> have their own reputations for interfering in US elections.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;983521376358236160&quot;}"></div></p>
<p>Zuckerberg has put the fox in charge of the social media henhouse, hardly the action of someone truly contrite. Meanwhile, Comyn’s apology was quickly overshadowed <a href="http://www.afr.com/business/banking-and-finance/financial-services/austrac-steals-comyns-thunder-20180409-h0yj5p?logout=true">by an AUSTRAC allegation</a> that the company knowingly dealt with customers it suspected of money laundering.</p>
<p>Saying sorry is not so hard, but meaning it is another story altogether.</p><img src="https://counter.theconversation.com/content/94736/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Annabelle Lukin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A corporate apology is always connected to the benefits it brings to the company. It is not a personal apology, it is a form of institutional positioning.Annabelle Lukin, Associate Professor in Linguistics, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/940782018-03-30T11:03:22Z2018-03-30T11:03:22ZHow Cambridge Analytica’s Facebook targeting model really worked – according to the person who built it<figure><img src="https://images.theconversation.com/files/212677/original/file-20180329-189824-1lbooac.jpg?ixlib=rb-1.1.0&rect=7%2C1197%2C4971%2C3002&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How accurately can you be profiled online?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/laptop-shooting-target-arrows-on-screen-795280663">Andrew Krasovitckii/Shutterstock.com</a></span></figcaption></figure><p>The researcher whose work is at the center of the <a href="https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html">Facebook-Cambridge Analytica data analysis and political advertising uproar</a> has revealed that his method worked much like the one <a href="https://medium.com/netflix-techblog/netflix-recommendations-beyond-the-5-stars-part-1-55838468f429">Netflix uses to recommend movies</a>. </p>
<p>In an email to me, Cambridge University scholar Aleksandr Kogan explained how his statistical model processed Facebook data for Cambridge Analytica. The accuracy he claims suggests it works about as well as <a href="https://www.cambridge.org/core/books/hacking-the-electorate/C0D269F47449B042767A51EC512DD82E">established voter-targeting methods</a> based on demographics like race, age and gender.</p>
<p>If confirmed, Kogan’s account would mean the digital modeling Cambridge Analytica used was <a href="https://www.youtube.com/watch?v=APqU_EJ5d3U">hardly the virtual crystal ball</a> <a href="https://techcrunch.com/2018/03/23/facebook-knows-literally-everything-about-you/">a few have claimed</a>. Yet the numbers Kogan provides <a href="https://civichall.org/civicist/will-the-real-psychometric-targeters-please-stand-up/">also show</a> what is – and isn’t – <a href="https://www.washingtonpost.com/news/monkey-cage/wp/2018/03/23/four-and-a-half-reasons-not-to-worry-that-cambridge-analytica-skewed-the-2016-election/">actually possible</a> by <a href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/">combining personal data</a> <a href="https://www.nbcnews.com/politics/politics-news/cambridge-analytica-s-effectiveness-called-question-despite-alleged-facebook-data-n858256">with machine learning</a> for political ends.</p>
<p>Regarding one key public concern, though, Kogan’s numbers suggest that information on users’ personalities or “<a href="https://www.vox.com/science-and-health/2018/3/23/17152564/cambridge-analytica-psychographic-microtargeting-what">psychographics</a>” was just a modest part of how the model targeted citizens. It was not a personality model strictly speaking, but rather one that boiled down demographics, social influences, personality and everything else into a big correlated lump. This soak-up-all-the-correlation-and-call-it-personality approach seems to have created a valuable campaign tool, even if the product being sold wasn’t quite as it was billed.</p>
<h2>The promise of personality targeting</h2>
<p>In the wake of the revelations that Trump campaign consultants Cambridge Analytica used <a href="https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html">data from 50 million Facebook users</a> to target digital political advertising during the 2016 U.S. presidential election, Facebook has <a href="https://www.nasdaq.com/symbol/fb/stock-report">lost billions in stock market value</a>, governments on <a href="https://www.theverge.com/2018/3/19/17141138/facebook-cambridge-analytica-uk-authorities-warrant-data-breach">both sides of the Atlantic</a> have <a href="https://www.pbs.org/newshour/politics/federal-trade-commission-to-investigate-facebook-as-companys-stock-value-sinks">opened investigations</a>, and a nascent <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">social movement</a> is calling on users to <a href="https://twitter.com/search?q=%23deletefacebook">#DeleteFacebook</a>.</p>
<p>But a key question has remained unanswered: Was Cambridge Analytica really able to effectively target campaign messages to citizens based on their personality characteristics – or even their “<a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">inner demons</a>,” as a company whistleblower alleged? </p>
<p>If anyone would know what Cambridge Analytica did with its massive trove of Facebook data, it would be Aleksandr Kogan and Joseph Chancellor. It was <a href="https://www.reuters.com/article/us-facebook-cambridge-analytica/trump-consultants-harvested-data-from-50-million-facebook-users-reports-idUSKCN1GT02Y">their startup Global Science Research</a> that collected profile information from <a href="https://www.wired.com/story/cambridge-analytica-50m-facebook-users-data/">270,000 Facebook users and tens of millions of their friends</a> using a personality test app called “thisisyourdigitallife.”</p>
<p>Part of <a href="https://scholar.google.com/citations?user=igL-0AsAAAAJ&hl=en">my own research</a> focuses on understanding <a href="https://doi.org/10.1177/0002716215570279">machine learning</a> methods, and <a href="https://www.amazon.com/Internet-Trap-Monopolies-Undermines-Democracy/dp/0691159262/">my forthcoming book</a> discusses how digital firms use recommendation models to build audiences. I had a hunch about how Kogan and Chancellor’s model worked.</p>
<p>So I emailed Kogan to ask. Kogan is still a <a href="https://www.bloomberg.com/news/articles/2018-03-20/meet-the-psychologist-at-the-center-of-facebook-s-data-scandal">researcher at Cambridge University</a>; his collaborator <a href="https://www.theguardian.com/news/2018/mar/18/facebook-cambridge-analytica-joseph-chancellor-gsr">Chancellor now works at Facebook</a>. In a remarkable display of academic courtesy, Kogan answered. </p>
<p>His response requires some unpacking, and some background.</p>
<h2>From the Netflix Prize to “psychometrics”</h2>
<p>Back in 2006, when it was still a DVD-by-mail company, Netflix offered a <a href="https://www.netflixprize.com/">reward of $1 million</a> to anyone who developed a better way to make predictions about users’ movie rankings than the company already had. A surprise top competitor was an <a href="https://www.kdnuggets.com/news/2007/n08/3i.html">independent software developer using the pseudonym Simon Funk</a>, whose basic approach was ultimately incorporated into all the top teams’ entries. Funk adapted a technique called “<a href="http://www.aclweb.org/anthology/E06-1013">singular value decomposition</a>,” condensing users’ ratings of movies into a <a href="https://www.youtube.com/watch?v=P5mlg91as1c">series of factors or components</a> – essentially a set of inferred categories, ranked by importance. As Funk <a href="http://sifter.org/simon/journal/20061027.2.html">explained in a blog post</a>,</p>
<blockquote>
<p>“So, for instance, a category might represent action movies, with movies with a lot of action at the top, and slow movies at the bottom, and correspondingly users who like action movies at the top, and those who prefer slow movies at the bottom.”</p>
</blockquote>
<p>Factors are artificial categories, which are not always like the kind of categories humans would come up with. The <a href="http://sifter.org/simon/journal/20061027.2.html">most important factor in Funk’s early Netflix model</a> was defined by users who loved films like “Pearl Harbor” and “The Wedding Planner” while also hating movies like “Lost in Translation” or “Eternal Sunshine of the Spotless Mind.” His model showed how machine learning can find correlations among groups of people, and groups of movies, that humans themselves would never spot.</p>
<p>Funk’s general approach used the 50 or 100 most important factors for both users and movies to make a decent guess at how every user would rate every movie. This method, often called <a href="https://en.wikipedia.org/wiki/Dimensionality_reduction">dimensionality reduction</a> or matrix factorization, was not new. Political science researchers had shown that <a href="https://en.wikipedia.org/wiki/NOMINATE_(scaling_method)">similar techniques using roll-call vote data</a> could predict the votes of members of Congress with 90 percent accuracy. In psychology the “<a href="https://doi.org/10.1037/0003-066X.48.1.26">Big Five</a>” model had also been used to predict behavior by clustering together personality questions that tended to be answered similarly.</p>
<p>Still, Funk’s model was a big advance: It allowed the technique to work well with huge data sets, even those with lots of missing data – like the Netflix dataset, where a typical user rated only a few dozen films out of the thousands in the company’s library. More than a decade after the Netflix Prize contest ended, <a href="https://doi.org/10.1145/1401890.1401944">SVD-based methods</a>, or <a href="https://doi.org/10.1109/ICDM.2008.22">related models for implicit data</a>, are still the tool of choice for many websites to predict what users will read, watch, or buy. </p>
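<p>Funk’s scheme can be sketched in a few lines. Below is a minimal, illustrative version of Funk-style matrix factorization trained with stochastic gradient descent; the toy ratings matrix, factor count and learning rate are invented for the example and are not the Netflix data or Funk’s actual parameters.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ratings matrix: 4 users x 5 movies, 0 means "no rating".
R = np.array([
    [5, 4, 0, 1, 0],
    [4, 0, 4, 1, 2],
    [1, 1, 0, 5, 4],
    [0, 2, 1, 4, 5],
], dtype=float)
observed = [(u, m) for u in range(4) for m in range(5) if R[u, m] > 0]

k = 2                # number of latent factors (inferred categories)
lr, reg = 0.05, 0.02 # learning rate and regularisation, chosen for the toy
P = 0.1 * rng.standard_normal((4, k))   # user factor scores
Q = 0.1 * rng.standard_normal((5, k))   # movie factor scores

def mse():
    return np.mean([(R[u, m] - P[u] @ Q[m]) ** 2 for u, m in observed])

initial_error = mse()
# SGD over observed ratings only; missing entries never enter the loss.
for _ in range(500):
    for u, m in observed:
        err = R[u, m] - P[u] @ Q[m]
        P[u], Q[m] = (P[u] + lr * (err * Q[m] - reg * P[u]),
                      Q[m] + lr * (err * P[u] - reg * Q[m]))
final_error = mse()
```

<p>Because the updates touch only the observed ratings, missing entries impose no cost at all, which is exactly what lets this family of methods cope with a matrix that is mostly blank.</p>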
<p>These models can predict other things, too.</p>
<h2>Facebook knows if you are a Republican</h2>
<p>In 2013, Cambridge University researchers Michal Kosinski, David Stillwell and Thore Graepel published an article on the <a href="https://doi.org/10.1073/pnas.1218772110">predictive power of Facebook data</a>, using information gathered through an online personality test. Their initial analysis was nearly identical to that used on the Netflix Prize, using SVD to categorize both users and things they “liked” into the top 100 factors. </p>
<p>The paper showed that a factor model made with users’ Facebook “likes” alone was <a href="https://doi.org/10.1073/pnas.1218772110">95 percent accurate</a> at distinguishing between black and white respondents, 93 percent accurate at distinguishing men from women, and 88 percent accurate at distinguishing people who identified as gay men from men who identified as straight. It could even correctly distinguish Republicans from Democrats 85 percent of the time. It was also useful, though not as accurate, for <a href="https://doi.org/10.1073/pnas.1218772110">predicting users’ scores</a> on the “Big Five” personality test. </p>
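<p>The pipeline described above can be imitated on synthetic data: condense a binary “likes” matrix into its top factors with SVD, then fit a simple classifier on the factor scores. The group structure, factor count and classifier below are illustrative assumptions, not the researchers’ actual code or data.</p>

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "likes" matrix: 200 users x 40 pages, 1 = liked.
# By construction the first 100 users favor the first 20 pages and the
# other 100 favor the last 20, a stand-in for two demographic groups.
n_users, n_pages = 200, 40
likes = (rng.random((n_users, n_pages)) < 0.1).astype(float)
likes[:100, :20] += rng.random((100, 20)) < 0.4
likes[100:, 20:] += rng.random((100, 20)) < 0.4
likes = np.clip(likes, 0.0, 1.0)
labels = np.array([1.0] * 100 + [-1.0] * 100)   # the attribute to predict

# Condense each user's likes into their top factor scores
# (the factor count of 10 here is a guess, not the study's 100).
U, s, Vt = np.linalg.svd(likes, full_matrices=False)
components = U[:, :10] * s[:10]

# A plain least-squares linear classifier on the factor scores.
w, *_ = np.linalg.lstsq(components, labels, rcond=None)
accuracy = float(np.mean(np.sign(components @ w) == labels))
```

<p>When the groups “like” systematically different things, the factor scores separate them almost perfectly, which is why nothing more exotic than likes was needed to guess demographics with high accuracy.</p>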
<p>There was <a href="https://psmag.com/economics/big-data-big-brother-and-the-like-button-53894">public outcry</a> <a href="https://www.theatlantic.com/technology/archive/2013/03/armed-with-facebook-likes-alone-researchers-can-tell-your-race-gender-and-sexual-orientation/273963/">in response</a>; within weeks Facebook had <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">made users’ likes private</a> by default.</p>
<p>Kogan and Chancellor, also Cambridge University researchers at the time, were starting to use Facebook data for election targeting as part of a collaboration with Cambridge Analytica’s parent firm SCL. Kogan invited Kosinski and Stillwell to join his project, but it <a href="https://www.theguardian.com/education/2018/mar/24/cambridge-analytica-academics-work-upset-university-colleagues">didn’t work out</a>. Kosinski reportedly suspected Kogan and Chancellor might have <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">reverse-engineered the Facebook “likes” model</a> for Cambridge Analytica. Kogan denied this, saying his project “<a href="https://www.theguardian.com/education/2018/mar/24/cambridge-analytica-academics-work-upset-university-colleagues">built all our models</a> using our own data, collected using our own software.” </p>
<h2>What did Kogan and Chancellor actually do?</h2>
<p>As I followed the developments in the story, it became clear Kogan and Chancellor had indeed collected plenty of their own data through the thisisyourdigitallife app. They certainly could have built a predictive SVD model like that featured in Kosinski and Stillwell’s published research.</p>
<p>So I emailed Kogan to ask if that was what he had done. Somewhat to my surprise, he wrote back. </p>
<p>“We didn’t exactly use SVD,” he wrote, noting that SVD can struggle when some users have many more “likes” than others. Instead, Kogan explained, “The technique was something we actually developed ourselves … It’s not something that is in the public domain.” Without going into details, Kogan described their method as “a multi-step <a href="https://www.quora.com/What-is-a-co-occurrence-matrix">co-occurrence</a> approach.” </p>
<p>However, his message went on to confirm that his approach was indeed similar to SVD or other matrix factorization methods, like in the Netflix Prize competition, and the Kosinki-Stillwell-Graepel Facebook model. Dimensionality reduction of Facebook data was the core of his model. </p>
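<p>Kogan did not disclose the details of that multi-step method, so it cannot be reproduced here. The snippet below only shows what a single co-occurrence step looks like on a tiny, made-up “likes” matrix; everything beyond that first step is left unspecified, as in Kogan’s description.</p>

```python
import numpy as np

# Binary user x page "likes" matrix: rows are users, columns are pages.
L = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
])

# One co-occurrence step: C[i, j] counts how many users liked both
# page i and page j, so pages liked by the same people group together.
C = L.T @ L
```

<p>Pages 0 and 1 are liked together by two users, pages 0 and 3 by none; repeated, weighted versions of this counting are one plausible route to the factor-like structure Kogan describes.</p>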
<h2>How accurate was it?</h2>
<p>Kogan suggested the exact model used doesn’t matter much, though – what matters is the accuracy of its predictions. According to Kogan, the “correlation between predicted and actual scores … was around [30 percent] for all the personality dimensions.” By comparison, a person’s previous Big Five scores are about <a href="https://doi.org/10.1016/j.jrp.2014.06.003">70 to 80 percent accurate</a> in predicting their scores when they retake the test. </p>
<p>Kogan’s accuracy claims cannot be independently verified, of course. And anyone in the midst of such a high-profile scandal might have incentive to understate his or her contribution. In his <a href="https://www.youtube.com/watch?v=APqU_EJ5d3U">appearance on CNN</a>, Kogan explained to an increasingly incredulous Anderson Cooper that, in fact, the models had actually not worked very well. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/APqU_EJ5d3U?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Aleksandr Kogan answers questions on CNN.</span></figcaption>
</figure>
<p>In fact, the accuracy Kogan claims seems a bit low, but plausible. Kosinski, Stillwell and Graepel reported comparable or slightly better results, as have several <a href="https://doi.org/10.1016/j.paid.2017.12.018">other academic studies</a> using digital footprints to predict personality (though some of those studies had more data than just Facebook “likes”). It is surprising that Kogan and Chancellor would go to the trouble of designing their own proprietary model if off-the-shelf solutions would have been just as accurate.</p>
<p>Importantly, though, the model’s accuracy on personality scores allows comparisons of Kogan’s results with other research. Published models with equivalent accuracy in predicting personality are all much more accurate at guessing demographics and political variables.</p>
<p>For instance, the similar Kosinski-Stillwell-Graepel SVD model was 85 percent accurate in guessing party affiliation, even without using any profile information other than likes. Kogan’s model had similar or better accuracy. Adding even a small amount of information about friends or users’ demographics would likely boost this accuracy above 90 percent. Guesses about gender, race, sexual orientation and other characteristics would probably be more than 90 percent accurate too.</p>
<p>Critically, these guesses would be especially good for the most active Facebook users – the people the model was primarily used to target. Users with less activity to analyze are likely not on Facebook much anyway. </p>
<h2>When psychographics is mostly demographics</h2>
<p>Knowing how the model is built helps explain Cambridge Analytica’s apparently contradictory statements about <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">the role</a> – or <a href="https://www.c-span.org/video/?420077-1/google-hosts-post-election-review&start=6905">lack thereof</a> – that personality profiling and psychographics played in its modeling. They’re all technically consistent with what Kogan describes.</p>
<p>A model like Kogan’s would give estimates for every variable available on any group of users. That means it would automatically <a href="https://www.bloomberg.com/news/features/2015-11-12/is-the-republican-party-s-killer-data-app-for-real-">estimate the Big Five personality scores</a> for every voter. But these personality scores are the output of the model, not the input. All the model knows is that certain Facebook likes, and certain users, tend to be grouped together. </p>
<p>With this model, Cambridge Analytica could say that it was identifying people with low openness to experience and high neuroticism. But the same model, with the exact same predictions for every user, could just as accurately claim to be identifying less educated older Republican men. </p>
<p>Kogan’s information also helps clarify the confusion about whether Cambridge Analytica <a href="https://www.youtube.com/watch?v=MepM_YXZdYg">actually deleted its trove</a> of Facebook data, when models built from the data <a href="https://www.channel4.com/news/revealed-cambridge-analytica-data-on-thousands-of-facebook-users-still-not-deleted">seem to still be circulating</a>, and even <a href="https://gizmodo.com/aggregateiq-created-cambridge-analyticas-election-softw-1824026565">being developed further</a>. </p>
<p>The whole point of a dimension reduction model is to mathematically represent the data in simpler form. It’s as if Cambridge Analytica took a very high-resolution photograph, resized it to be smaller, and then deleted the original. The photo still exists – and as long as Cambridge Analytica’s models exist, the data effectively does too.</p><img src="https://counter.theconversation.com/content/94078/count.gif" alt="The Conversation" width="1" height="1" />
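<p>The photograph analogy can be made concrete with a truncated SVD on toy data: keep the top factors, discard the original matrix, and the factors alone recover it almost exactly. The matrix sizes and factor count below are arbitrary illustrations.</p>

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in "data": a 100 x 50 matrix driven by 5 hidden factors plus
# a little noise (the "high-resolution photograph").
original = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 50))
original += 0.01 * rng.standard_normal((100, 50))

# Keep only the top 5 factors (the "resized" copy).
U, s, Vt = np.linalg.svd(original, full_matrices=False)
user_factors, weights, item_factors = U[:, :5], s[:5], Vt[:5]

# Even if the original matrix is deleted, the retained factors
# reconstruct it almost exactly.
reconstruction = (user_factors * weights) @ item_factors
relative_error = (np.linalg.norm(original - reconstruction)
                  / np.linalg.norm(original))
```

<p>The reconstruction error is a fraction of a percent, so deleting the raw data while keeping the model deletes almost nothing.</p>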
<p class="fine-print"><em><span>Matthew Hindman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>An email from Aleksandr Kogan sheds light on exactly how much your Facebook data reveals about you, and what data scientists can actually do with that information.Matthew Hindman, Associate Professor of Media and Public Affairs, George Washington UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/910242018-02-05T14:21:30Z2018-02-05T14:21:30ZExplainer: how Facebook has become the world’s largest echo chamber<figure><img src="https://images.theconversation.com/files/204616/original/file-20180202-162082-1nk3qoi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Is there an echo here?</span> <span class="attribution"><span class="source">Reuters/Benoit Tessier</span></span></figcaption></figure><p>I began my research career in the last century with an analysis of how news organisations were adapting to this strange new thing called “the Internet”. Five years later I signed up for Twitter and, a year after that, for Facebook. </p>
<p>Now, as it celebrates its 14th birthday, Facebook is becoming ubiquitous, and its usage and impact <a href="https://herts.academia.edu/MeganKnight">is central</a> to my (and many others’) research. </p>
<p>In 2017 the social network had <a href="https://www.statista.com/statistics/241552/share-of-global-population-using-facebook-by-region">2 billion members</a>, by its own count. Facebook’s relationship with news content is an important part of this ubiquity. Since 2008 the company has courted news organisations with features like “Connect”, “Share” and “Instant Articles”. As of 2017, 48% of Americans <a href="http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/">rely primarily</a> on Facebook for news and current affairs information. </p>
<p>Social networks present news content in a way that’s integrated into the flow of personal and other communication. Media scholar <a href="http://alfredhermida.com/research/projects/">Alfred Hermida</a> calls this “<a href="http://alfredhermida.com/2010/05/03/ambient-journalism-paper-published/">ambient news</a>”. It’s a trend that has been considered promising for the development of civil society. Social media – like the Internet before it – has been hailed as the new “public sphere”: a place for civic discourse and political engagement among the citizenry. </p>
<p>But, unlike the Internet, Facebook is not a public space in which all content is equal. It is a private company. It controls what content you see, according to algorithms and commercial interests. The new public sphere is, in fact, privately owned, and this has far-reaching implications for civic society worldwide. </p>
<p>When a single company is acting as the broker for news and current affairs content for a majority of the population, the possibility for abuse is rife. Facebook is not seen as a “news organisation”, so it falls outside of whatever regulations countries apply to “the news”. And its content is provided by myriad third parties, often with little oversight and tracking by countries’ authorities. So civic society’s ability to address concerns about Facebook’s content becomes even more constrained.</p>
<h2>Getting to know all about you</h2>
<p>Facebook’s primary goal is to sell advertising. It does so by knowing as much as possible about its users, then selling that information to advertisers. The provision of content to entice consumers to look at advertising is not new: it’s the entire basis of the commercial media. </p>
<p>But where newspapers can only target broad demographic groups based on language, location and, to an extent, education level and income, Facebook can narrow its target market down to the individual level. How? Based on demographics – and everything your “likes”, posts and comments have told it.</p>
<p>This ability to fine-tune content to subsets of the audience is not limited to advertising. Everything on your Facebook feed is curated and presented to you by an algorithm seeking to maximise your engagement by only showing you things that it thinks you will like and respond to. The more you engage and respond, the better the algorithm gets at predicting what you will like.</p>
<p>When it comes to news content and discussion of the news, this means you will increasingly only see material that’s in line with your stated interests. More and more, too, news items, advertisements and posts by friends are blurred in the interface. This all merges into a single stream of information. </p>
<p>And because of the way your network is structured, the nature of that information becomes ever more narrow. It is inherent in the ideals of democracy that people be exposed to a <a href="http://www.expo98.msu.edu/innerindex.html?ideas">plurality of ideas</a>; that the public sphere should be open to all. The loss of this plurality creates a society made up of extremes, with little hope for consensus or bridging of ideas. </p>
<h2>An echo chamber</h2>
<p>Most people’s “friends” on Facebook tend to be people with whom they have some real-life connection – actual friends, classmates, neighbours and family members. Functionally, this means that most of your network will consist largely of people who share your broad demographic profile: education level, income, location, ethnic and cultural background and age. </p>
<p>The algorithm knows who in this network you are most likely to engage with, which further narrows the field to people whose worldview aligns with your own. You may be Facebook friends with your Uncle Fred, whose political outbursts threaten the tranquillity of every family get-together. But if you ignore his conspiracy-themed posts and don’t engage, they will start to disappear from your feed. </p>
<p>Over time this means that your feed gets narrower and narrower. It shows less and less content that you might disagree with or find distasteful.</p>
<p>These two responses, engaging and ignoring, are both driven by the invisible hand of the algorithm. And they have created an echo chamber. This isn’t dissimilar to what news organisations have been trying to do for some time: <a href="http://journals.sagepub.com/doi/abs/10.1177/107769906704400301">gatekeeping</a> is the expression of the journalists’ idea of what the audience wants to read. </p>
<p>Traditional journalists had to rely on their instinct for what people would be interested in. Technology now makes it possible to know exactly what people read, responded to, or shared. </p>
<p>For Facebook, this process is now run by a computer: an algorithm which reacts instantly to provide the content it thinks you want. But this fine-tuned and carefully managed algorithm is open to manipulation, especially by political and social interests.</p>
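<p>Since the real algorithm is secret, the feedback loop described above can only be illustrated with a toy model. In the sketch below, an assumed ranker multiplies a topic’s weight up when the user engages and down when they ignore it; the specific rates are invented, but the narrowing dynamic is the point.</p>

```python
import random

random.seed(0)

# Two topics: the user engages with topic "A" 90% of the time shown
# and with topic "B" only 10% of the time. The toy ranker boosts
# whatever earned engagement and buries what was ignored; the 1.05
# and 0.95 rates are invented for illustration.
weights = {"A": 1.0, "B": 1.0}
engagement_rate = {"A": 0.9, "B": 0.1}

def share_of_b():
    return weights["B"] / (weights["A"] + weights["B"])

before = share_of_b()
for _ in range(1000):
    shown = random.choices(["A", "B"],
                           weights=[weights["A"], weights["B"]])[0]
    if random.random() < engagement_rate[shown]:
        weights[shown] *= 1.05   # engaging promotes more of the same
    else:
        weights[shown] *= 0.95   # ignoring makes it disappear
after = share_of_b()
```

<p>Run long enough, the ignored topic’s share of the feed collapses towards zero, even though the user never asked to stop seeing it.</p>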
<h2>Extreme views confirmed</h2>
<p>In the last few years Facebook users have unwittingly become part of a massive social experiment – one which may have contributed to the equally surprising <a href="https://www.theguardian.com/technology/2017/oct/26/cambridge-analytica-used-data-from-facebook-and-politico-to-help-trump">election of Donald Trump</a> as president of the US and the UK <a href="https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy">electing to leave</a> the European Union. We can’t be sure of this, since Facebook’s content algorithm is secret and most of the content is shown only to specific users. </p>
<p>It’s practically impossible for a researcher to see all of the content distributed on Facebook; the company explicitly prevents that kind of access. Researchers and journalists instead construct model accounts (fake ones, violating Facebook’s terms of use) and attempt to trick the algorithm into showing them what the social network’s most extreme political users see.</p>
<p>What they’ve <a href="https://medium.com/@richgor/why-every-american-should-look-at-blue-feed-red-feed-and-why-the-nation-needs-someone-to-build-f455ef17a0f2">found</a> is that the <a href="https://arxiv.org/pdf/1509.00189.pdf">more extreme the views</a> a user had already endorsed, the more extreme the content they were shown. People who liked or expressed support for leaving the EU were shown content reflecting that desire, but in a more extreme form. </p>
<p>If they liked that content, they’d be shown even more of it, and so on, with the group becoming ever smaller and more insular. This is similar to how extremist groups have traditionally identified and courted potential members, enticing them with progressively more radical ideas and watching their reaction. That sort of personal interaction was a slow process; Facebook’s algorithm works at lightning speed, and the pace of radicalisation is dramatically accelerated.</p><img src="https://counter.theconversation.com/content/91024/count.gif" alt="The Conversation" width="1" height="1" />
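The feedback loop described above can be sketched as a small simulation. This is a hedged toy model with invented numbers, not a description of any real platform: each round, only the users willing to engage with the current content remain in the audience, and the model responds by serving them more extreme content, so the group shrinks as the material intensifies.

```python
import random

random.seed(1)

# Toy feedback loop: content extremity ratchets up, audience narrows.
# Purely illustrative; all thresholds and multipliers are invented.

def radicalisation_loop(n_users=1000, rounds=6):
    # Each user has a tolerance: the most extreme content they engage with.
    users = [random.uniform(0, 1) for _ in range(n_users)]
    extremity = 0.1          # extremity of the content shown this round
    history = []
    for _ in range(rounds):
        # Only users whose tolerance meets the content's extremity engage...
        users = [t for t in users if t >= extremity]
        history.append((round(extremity, 2), len(users)))
        # ...and the model responds by serving them more extreme content.
        extremity *= 1.5
    return history

for extremity, audience in radicalisation_loop():
    print(f"extremity={extremity:<5} audience={audience}")
```

Each round the surviving audience is a subset of the previous one, so the group monotonically shrinks while the content it sees grows more extreme – the “smaller and more insular” dynamic the article describes, compressed into a few lines.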
<p class="fine-print"><em><span>Megan Knight does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>More and more, news items, adverts and posts by friends are blurred in Facebook’s interface. This all merges into a single stream of information.Megan Knight, Associate Dean, University of HertfordshireLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/903672018-02-01T11:38:55Z2018-02-01T11:38:55ZHow Facebook could really fix itself<figure><img src="https://images.theconversation.com/files/204064/original/file-20180130-107706-1ahgam8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Under fire: Facebook founder and CEO Mark Zuckerberg.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Facebook-Publishers/d401f9e4e5fe4fdabe15923bf17bafad/6/0">AP Photo/Jeff Roberson</a></span></figcaption></figure><p>Facebook has a world of problems. Beyond <a href="https://www.cnbc.com/2018/01/25/facebook-tells-senate-its-software-recommended-russian-propaganda.html">charges of Russian manipulation and promoting fake news</a>, the company’s signature social media platform is under fire for <a href="http://www.foxnews.com/tech/2017/12/29/facebook-cocaine-opioids-how-addictive-is-social-network.html">being addictive</a>, causing <a href="http://fortune.com/2017/12/16/facebook-admits-social-media-can-harm-your-mental-health/">anxiety and depression</a>, and even instigating <a href="https://www.theatlantic.com/technology/archive/2017/12/could-facebook-be-tried-for-war-crimes/548639/">human rights abuses</a>. </p>
<p>To make matters even worse, Facebook has faced huge backlash in recent days after it was revealed that Trump campaign consultant Cambridge Analytica harvested the data of up to 50 million users without their permission.</p>
<p>Company founder and CEO Mark Zuckerberg says <a href="https://www.facebook.com/zuck/posts/10104413015393571">he wants to win back users’ trust</a>. But his company’s efforts so far have ignored the root causes of the problems they intend to fix, and even risk making matters worse. Specifically, they ignore the fact that personal interaction isn’t always meaningful or benign, leave out the needs of users in the developing world, and seem to compete with the company’s own business model.</p>
<p>Based on <a href="https://sites.tufts.edu/digitalplanet/">The Digital Planet</a>, a multi-year global study of how digital technologies spread and how much people trust them, which I lead at Tufts University’s Fletcher School, I have some ideas about how to fix Facebook’s efforts to fix itself.</p>
<h2>Face-saving changes?</h2>
<p>Like many technology companies, Facebook must balance the convergence of digital dependence, digital dominance and digital distrust. Over <a href="https://www.facebook.com/zuck/posts/10103831654565331">2 billion people</a> worldwide check Facebook each month; <a href="http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/">45 percent of American adults</a> get their news from Facebook. Together with Google, it captures <a href="http://fortune.com/2017/07/28/google-facebook-digital-advertising/">half of all digital advertising revenues</a> worldwide. Yet <a href="https://www.theverge.com/2017/10/27/16552620/facebook-trust-survey-usage-popularity-fake-news">more people say they greatly distrust Facebook</a> than any other member of the big five – Amazon, Apple, Google or Microsoft.</p>
<p>In March 2017 Facebook started taking responsibility for quality control as a way to restore users’ trust. The company hired <a href="https://www.engadget.com/2017/03/06/facebook-now-flags-fake-news/">fact-checkers</a> to verify information in posts. Two months later the company changed its algorithms to <a href="https://newsroom.fb.com/news/2017/05/update-on-trending/">help users find diverse viewpoints</a> on current issues and events. And in October 2017, it imposed new <a href="https://newsroom.fb.com/news/2017/10/update-on-our-advertising-transparency-and-authenticity-efforts/">transparency requirements</a> to force advertisers to identify themselves clearly. </p>
<p>But Zuckerberg led off 2018 in a different direction, committing to “<a href="https://www.facebook.com/zuck/posts/10104380170714571">working to fix our issues together</a>.” That last word, “together,” suggests an inclusive approach, but in my view, it really says the company is shifting the burden back onto its users. </p>
<p>The company began by overhauling its crucial News Feed feature, giving less priority to third-party publishers, whether traditional media outlets like The New York Times and The Washington Post or newer online publications such as Buzzfeed and Vox. That will leave more room for posts from family and friends, which Zuckerberg has called “<a href="https://www.facebook.com/zuck/posts/10104413015393571">meaningful social interactions</a>.” </p>
<p>However, Facebook will <a href="https://www.buzzfeed.com/alexkantrowitz/this-is-facebooks-news-survey">rely on users to rate</a> how trustworthy groups, organizations and media outlets are. Those ratings will <a href="https://www.facebook.com/zuck/posts/10104445245963251">determine which third-party publishers</a> do make it to users’ screens, if at all. Leaving trustworthiness ratings to users without addressing online political polarization risks making civic discourse even more divided and extreme.</p>
<h2>Personal isn’t always ‘meaningful’</h2>
<p>Unlike real-life interactions, online exchanges can exacerbate both <a href="https://techcrunch.com/2017/12/14/active-vs-passive-social-media/">passive</a> and <a href="http://dx.doi.org/10.1111/jopy.12305">narcissistic</a> tendencies. It’s easier to be invisible online, so people who want to avoid attention can do so without facing peer pressure to participate. By contrast, though, people who are active online can see their friends like, share and comment on their posts, motivating them to seek even more attention.</p>
<p>This creates two groups of online users, broadly speaking: disengaged observers and those who are competing for attention with ever more extreme efforts to catch users’ eyes. This environment has helped outrageous, untrue claims with clickbait headlines <a href="https://www.appnexus.com/en/company/whitepapers/inventory-quality-whitepaper">attract enormous amounts</a> <a href="https://newrepublic.com/article/136888/science-going-viral">of attention</a>.</p>
<p>This phenomenon is further complicated by two other elements of social interaction online. First, news of any kind – including fake news – gains credibility when it is forwarded by a personal connection. </p>
<p>And social media tends to group like-minded people together, creating <a href="https://press.princeton.edu/titles/10935.html">an echo chamber effect</a> that reinforces messages the group agrees with and resists outside views – including more accurate information and independent perspectives. It’s no coincidence that <a href="http://www.journalism.org/2014/10/21/political-polarization-media-habits/">conservatives and liberals trust very different</a> news sources.</p>
<p>Users of Facebook’s instant-messaging subsidiary WhatsApp have shown that even a technology focusing on individual connection isn’t always <a href="https://www.wsj.com/articles/the-internet-is-filling-up-because-indians-are-sending-millions-of-good-morning-texts-1516640068">healthy or productive</a>. WhatsApp has been identified as a <a href="https://www.ft.com/content/64fdb23e-badc-11e6-8b45-b8b81dd5d080">primary carrier of fake news</a> and divisive rumors in India, where its users’ messages have been described as a “<a href="http://indianexpress.com/article/opinion/editorials/do-not-forward-this-whatsapp-fake-news-videos-5000140/">mix of off-color jokes</a>, doctored TV [clips], wild rumors and other people’s opinions, mostly vile.” Kenya has identified <a href="https://www.nation.co.ke/news/21-WhatsApp-groups-spreading-hate-identified/1056-4018322-ix7vff/index.html">21 hate-mongering WhatsApp groups</a>. WhatsApp users in the U.K. have had to <a href="http://www.independent.co.uk/life-style/whatsapp-scam-tesco-asda-vouchers-how-to-spot-avoid-messages-links-crime-a8041096.html">stay alert for scams</a> in their personal messages.</p>
<h2>Addressing the developing world</h2>
<p>Facebook’s actions appear to be responding to public pressure from the <a href="https://www.nytimes.com/2017/10/31/us/politics/facebook-twitter-google-hearings-congress.html">U.S.</a> and <a href="https://www.politico.eu/article/facebook-battle-of-five-armies-antitrust-regulators-publishers-privacy-security-politics/">Europe</a>. But Facebook is experiencing <a href="https://www.usatoday.com/story/tech/news/2017/06/27/status-update-facebook-has-2-billion-users-can-reach-3-billion/103104200/">its fastest growth</a> in Asia and Africa.</p>
<p>Research I have conducted with colleagues has found that users in the developing world are <a href="https://sites.tufts.edu/digitalplanet/dei17/">more trusting</a> of online material, and therefore more vulnerable to manipulation by false information. In Myanmar, for instance, Facebook is <a href="https://www.nytimes.com/2017/10/27/world/asia/myanmar-government-facebook-rohingya.html">the dominant internet site</a> because of its <a href="https://www.mmtimes.com/business/technology/20685-facebook-free-basics-lands-in-myanmar.html">Free Basics program</a>, which lets mobile-phone users connect to a few selected internet sites, including Facebook, without paying extra or using up allotted data in their mobile plans. In 2014, Facebook had 2 million users in Myanmar; after Free Basics arrived in 2016, that number climbed to 30 million.</p>
<p>One of the effects has been devastating. Rumor campaigns against the Rohingya ethnic group in Myanmar were, in part, <a href="https://www.buzzfeed.com/meghara/how-fake-news-and-online-hate-are-making-life-hell-for">spread on Facebook</a>, sparking violence. At least <a href="http://www.msf.org/en/article/myanmarbangladesh-msf-surveys-estimate-least-6700-rohingya-were-killed-during-attacks">6,700 Rohingya Muslims were killed</a> by Myanmar’s security forces between August and September 2017; <a href="https://www.usatoday.com/story/news/world/2017/12/14/aid-group-rohingya-killed-myanmar/951035001/">630,000 more have fled</a> the country. Facebook did not stop the rumors, and at one point actually <a href="https://www.theguardian.com/technology/2017/sep/20/facebook-rohingya-muslims-myanmar">shut down responding posts</a> from a Rohingya activist group.</p>
<p>Facebook’s Free Basics program is in <a href="https://info.internet.org/en/story/where-weve-launched/">63 developing countries and municipalities</a>, each filled with people new to the digital economy and potentially vulnerable to manipulation.</p>
<h2>Fighting against the business model</h2>
<p>Facebook’s efforts to promote what might be called “corporate digital responsibility” run counter to the company’s business model. Zuckerberg himself declared that the upcoming changes would cause people to <a href="https://news.vice.com/en_us/article/paqxxb/why-facebook-wants-you-to-spend-less-time-on-facebook">spend less time</a> on Facebook.</p>
<p>But the company makes <a href="http://fortune.com/2017/05/05/facebook-digital-advertising-business-model/">98 percent of its revenues from advertising</a>. That is only possible if users keep their attention focused on the platform, so the company can <a href="http://time.com/5112847/facebook-fake-news-unstoppable/">analyze their usage data</a> to generate more targeted advertising.</p>
<p>Our research finds that companies <a href="http://fletcher.tufts.edu/InclusionInc/Content-Library/Original-Research/Inclusive-Innovators">working toward corporate social responsibility</a> will only succeed if their efforts align with their core business models. Otherwise, the responsibility project will become unsustainable in the face of pressure from the stock market, competitors or government regulators, as happened to Facebook with <a href="https://techcrunch.com/2018/01/29/facebook-starts-polishing-its-privacy-messaging-ahead-of-gdpr/">European privacy rules</a>.</p>
<h2>Real solutions</h2>
<p>What can Facebook do instead? I recommend the following to fix Facebook’s fix:</p>
<ol>
<li><p>Own the reality of Facebook’s enormous role in society. It’s a primary source of news and communication that influences the beliefs and assumptions driving citizen behavior around the world. The company cannot rely on users to police the system. As a media company, Facebook needs to <a href="https://www.wired.com/story/should-facebook-and-twitter-be-regulated-under-the-first-amendment/">take responsibility for the content it publishes</a> and republishes. It can combine both human and artificial intelligence to sort through the content, labeling news, opinions, hearsay, research and other types of information in ways ordinary users can understand. </p></li>
<li><p>Establish on-the-ground operations in every location where it has large numbers of users, to ensure the company understands local contexts. Rather than a virtual global entity operating from Silicon Valley, Facebook should engage with the nuances and complexities of cities, regions and countries, using local languages to customize content for users. Right now, Facebook <a href="https://www.facebook.com/fbsafety/photos/a.197686146935898.42079.125459124158601/912369255467580/?type=3&theater">passively publishes educational materials</a> on digital safety and community standards, which are easily ignored. As Facebook adds users in developing nations, the company must pay close attention to the unintended consequences of explosive growth in connectivity.</p></li>
<li><p>Reduce the company’s dependence on advertising revenue. As long as Facebook is almost entirely dependent on ad sales, it will be forced to hold users’ attention as long as possible and gather their data to analyze for future ad opportunities. Its strategy for expansion should go beyond building and buying other apps, like WhatsApp, Instagram and Messenger, all of which still feed the core business model of monopolizing and data-mining users’ attention. Taking inspiration from Amazon and <a href="https://www.fool.com/investing/2018/01/29/amazon-and-disney-better-pay-attention-to-these-ne.aspx">Netflix</a> – and even <a href="http://ww2.cfo.com/financial-performance/2017/01/googles-non-advertising-revenue/">Google parent company Alphabet</a> – Facebook could use its <a href="https://www.fastcodesign.com/1669551/how-companies-like-amazon-use-big-data-to-make-you-love-them">huge trove</a> of user data responsibly to identify, design and deliver new services that people would pay for. </p></li>
</ol>
<p>Ultimately, Zuckerberg and Facebook’s leaders have created an enormously powerful, compelling and potentially addictive service. This unprecedented opportunity has developed at an unprecedented pace. Growth may be the easy part; being the responsible grown-up is much harder.</p><img src="https://counter.theconversation.com/content/90367/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bhaskar Chakravorti directs the Institute for Business in the Global Context at Tufts Fletcher School. The Institute has received funding from Mastercard, Microsoft and the Gates Foundation. </span></em></p>A scholar of digital trust evaluates Facebook’s current efforts and proposes some improvements the company could make.Bhaskar Chakravorti, Senior Associate Dean, International Business & Finance, The Fletcher School, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/772302017-06-28T01:42:17Z2017-06-28T01:42:17ZWhy it’s important to understand social media’s dark history<figure><img src="https://images.theconversation.com/files/175890/original/file-20170627-24756-c1emre.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/download/confirm/516548275?src=jfFfFpTAd5CTSLKufab5xw-1-56&size=huge_jpg">www.shutterstock.com</a></span></figcaption></figure><p>It was in April 2016 that Facebook founder Mark Zuckerberg announced that the social media platform was providing its <a href="http://money.cnn.com/2017/02/01/technology/facebook-earnings/">nearly two billion users</a> the opportunity to <a href="https://www.facebook.com/zuck/posts/10102764095821611">livestream content</a>. The move was viewed as a natural extension of the platform’s primary goal: providing a space for the average person to share their daily experiences, from <a href="https://www.facebook.com/bowmanspartan/videos/vb.2328858/10110128600086054/?type=2&theater">the mundane</a> to <a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fwww.facebook.com%2Fatlbuzz%2Fvideos%2F10155052739929832%2F&display=popup&ref=plugin&src=video">the meaningful</a>. </p>
<p>Almost as quickly, users found ways to live-broadcast <a href="http://abc7chicago.com/news/hate-crime-charges-filed-against-4-in-facebook-live-torture-case/1687517/">the worst of their nature</a>, including the <a href="http://www.news.com.au/world/north-america/man-kills-victim-live-on-facebook-and-goes-on-the-run-after-posting-easter-day-slaughter-video/news-story/63ebe5845760d68942be0807d4a040f5">“Easter Day slaughter”</a> in which the fatal shooting of a 74-year-old Cleveland grandfather was livestreamed. </p>
<p>In response, calls have increased for Facebook to either shutter the service or find a way to better regulate its content. Rev. Jesse Jackson, for example, remarked that Facebook Live is being used by people “<a href="https://www.usatoday.com/story/news/2017/04/21/jesse-jackson-chicago-officials-call-facebook-live-moratorium/100772144/">as a platform to release their anger, their fears and their foolishness.”</a></p>
<p>Many have referred to these behaviors as <a href="http://fortune.com/2017/04/17/facebook-killing/">Facebook’s “dark side”</a> and demanded that the company find a solution to prevent such antisocial behavior. </p>
<p>However, a brief look through the history of social media shows us that dark behaviors are neither unique to Facebook nor something new to today’s users. </p>
<h2>A dark history</h2>
<p>Poet and technology author Judy Malloy wrote about the <a href="https://mitpress.mit.edu/books/social-media-archeology-and-poetics">earliest precursors to social media networks as places of creativity and community</a>. For example, programs such as Berkeley’s <a href="http://www.computerhistory.org/atchm/community-memory-precedents-in-social-media-and-movements/">Community Memory</a> allowed 1970s users a digital space to post content and share stories for others in the community to read, with popular content including personal ads and short stories. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/175910/original/file-20170627-24741-33l5ut.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">An early French Minitel terminal. Early social media days had their dark moments as well.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/cheindel/3272350519/in/photolist-5ZaDNx-7zApPV-a7XzWj-7bd3g8-iNpH9k-7be69Q-7bbtrz-oswaWP-6T2Wt9-7baHmM-aVpWkt-7bgwUA-7bhSPn-9M2QU4-aUXTD8-7beDa7-eVLBmw-aUXTLk-2EYR3H-aUXTYX-aUXTW8-aUXUaX-aUXU8a-4kvY6K-azT8dp-aUXTHg-aUXUkp-aUXUz8-aUXU5g-aUXTTr-aUXUg6-aUXUpZ-6iJm2j-aUXUwr-aUXUtg-aUXUnR-aUXTNr-4BNSep-aUXTQi-ai6w7F-8Pkt2n-jz4ju-7bgWNy-7bdBz2-fhR1s9-73up42-btXnoC-7bicJb-azXNym-5YKvdQ">Christian Heindel</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Yet even those halcyon days had their dark moments. In 1985, author <a href="http://lindsyvangelder.com/">Lindsy Van Gelder</a> wrote about her experiences with the <a href="https://www.wired.com/2009/09/0924compuserve-launches/">CompuServe CB Simulator</a>, one of the world’s first online chat rooms. Among the popular channels in CB Simulator were those devoted to romance and relationships, which were of particular interest to LGBTQ individuals who found it difficult to discuss gender identity and sexual preferences in public. While many users found love online – <a href="http://boxchronicles.com/cb-simulator/">a 1991 wedding hosted in CB Simulator is thought to be the first online wedding</a> – in Van Gelder’s case, she was <a href="http://lindsyvangelder.com/sites/default/files/Plinkers.org%20-%20Electronic%20Lover.htm_.pdf">deceived into an intimate online romantic relationship</a> by a man posing as a disabled woman. </p>
<p>Stories of sexual aggression took a perhaps darker turn in 1993, when technology journalist Julian Dibbell wrote about a <a href="http://www.juliandibbell.com/articles/a-rape-in-cyberspace/">sexual assault</a> that took place in a text-based online world called <a href="http://www.moo.mud.org/">LambdaMoo</a>. The notion of a sexual assault online might seem odd given that users have no physical contact with one another. Yet a LambdaMoo user named “Mr. Bungle” hacked the program in a way that allowed him to have complete control over other users’ behaviors, such as their conversations and descriptions of their movements. </p>
<p>He used this hack to cause users to engage in obscene and violent sexual acts with their own bodies, having the players describe where and how they were touching themselves and others, but without consent, according to Dibbell’s account. Mr. Bungle claimed that his actions were just a prank, despite his victims’ insistence that they had been humiliated by his actions (or at least the actions that he forced them to perform or describe while performing). The story is notable, given that <a href="http://journals.sagepub.com/doi/abs/10.1177/009365096023001001">online relationships can be just as intimate and important as offline ones</a>. </p>
<p>Fast forward to early 2006, and the story of Evan Guttmann and his friend’s stolen Motorola Sidekick mobile phone captivated the internet. What started as a <a href="http://www.evanwashere.com/stolensidekick/original/">simple blog</a> about a teenager who refused to return the phone to its rightful owner turned into a story of a growing internet mob – <a href="http://www.nytimes.com/2006/06/21/nyregion/21sidekick.html">followers of Evan’s blog tracked down the teen’s home address and harassed the family</a>. </p>
<p>Later in 2006, users of MySpace would hear the tragic story of Megan Meier, a Missouri teenager who <a href="http://www.nytimes.com/2006/06/21/nyregion/21sidekick.html">took her own life</a> after the boy she met online (a MySpace user named “Josh”) shunned her. It was only later, after investigations were done, that Megan’s family found out that the boy “Josh” was really the mother of a girl that Megan had recently gotten into a fight with. That incident led to the passage of the United States’ first <a href="http://tucson.com/news/mo-begins-prosecuting-under-cyberbullying-law/article_5178236b-7989-5913-a8b6-c9b8b915575b.html">cyberbullying laws</a>. </p>
<h2>Understanding social media</h2>
<p>These stories are examples of what can happen when a single user discovers ways to use a technology that weren’t intended by designers: using the anonymity of CompuServe to deceive, using clever programming scripts to alter other users’ behaviors, using blogs to draw attention to a minor offense, and using social media to create a false identity. In each case, deceptions and actions <a href="https://books.google.com/books?id=sK62qU0Fbz0C&pg=PA100&lpg=PA100&dq=ethics+and+lambdamoo&source=bl&ots=mu0-YxVE2i&sig=uufrSoD1-I6_kJikUF7dt2FFdFI&hl=en&sa=X&ved=0ahUKEwiTtaqbst7TAhWJ4iYKHXjFDucQ6AEINzAD#v=onepage&q=ethics%20and%20lambdamoo&f=false">had dramatic real-life consequences for those involved</a>. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/175897/original/file-20170627-24798-176ju2j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">How can we understand today’s social media?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/bangkok-thailand-november-26-2016-man-527447305?src=fhw4S5O_rDrZZcCOrViWYg-1-69">Vasin Lee/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Most importantly, these stories serve as examples of how to understand Facebook specifically, and social media in general. It is important that users realize that the ethics of Facebook communication are no different from the <a href="https://www.natcom.org/sites/default/files/pages/2013_Public_Statements_Credo_for_Free_and_Responsible_Use_of_Electronic_Communication_Networks_Approved.pdf">ethics of any other form of human communication</a>. Rather than dismissing social media as wasteful and distracting <a href="http://www.sciencedirect.com/science/article/pii/S1096751616300379">and passing this perspective on to our children</a>, users need to recognize that the enterprise of human communication <a href="http://journals.sagepub.com/doi/pdf/10.1177/009365096023001001">is as meaningful online as it is offline</a>. </p>
<p>Commentators have blasted Facebook’s livestreaming option as “<a href="https://theringer.com/facebook-live-violence-crime-moderation-policy-9a3ae1fefb07">essentially barrierless broadcasting system</a>,” but such critiques ignore the benefits of that “barrierless” broadcasting, <a href="http://onlinelibrary.wiley.com/doi/10.1111/jcc4.12069/pdf">such as connecting families separated by oceans</a> and <a href="http://www.tandfonline.com/doi/abs/10.1080/14680777.2013.838369">providing voice to persecuted groups</a>. Even violent footage can, at times, be beneficial: The <a href="http://www.cnn.com/2016/07/07/us/facebook-live-video-minnesota-police-shooting/">Facebook Live broadcast of a July 2016 police shooting in Minnesota</a> served as a powerful reminder <a href="https://daily.jstor.org/how-do-i-not-look/?preview_id=24392">about social injustice and policing in the United States</a>. Counterterrorism forces have come to <a href="https://www.usatoday.com/story/tech/news/2017/06/15/facebook-using-artificial-intelligence-to-crack-down-on-terrorism/102887032/">rely on social media posts to track and better understand terrorist activities online</a>. </p>
<p>To combat misuse of livestreaming, Facebook recently announced the <a href="http://www.popsci.com/Facebook-hiring-3000-content-monitors">hiring of an additional 3,000 monitors to screen live videos</a>. However, in my view, ultimately, the responsibility for the content of social media falls to the <a href="https://dash.harvard.edu/handle/1/4455262">digital citizens</a> who create and interact in the space on a daily basis.</p><img src="https://counter.theconversation.com/content/77230/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nicholas Bowman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Facebook has been used at times for antisocial behavior. However, such behaviors are neither unique nor new.Nicholas Bowman, Associate Professor of Communication Studies, West Virginia UniversityLicensed as Creative Commons – attribution, no derivatives.