
Facebook is all for community, but what kind of community is it building?

There are many of us who stand for bringing people together and connecting the world. I hope we have the focus to take the long view and build the new social infrastructure to create the world we want for generations to come.

– Mark Zuckerberg, Facebook post

Facebook CEO Mark Zuckerberg talks a good game about wanting to build a better global community, but a community is only as good as the standards that underpin it. Recent events suggest that Facebook’s idea of community isn’t really the kind we want to build “for generations to come”. And if we want Facebook to do better, we as users need to push for change.

Read more: The law that made Facebook what it is today

The problem with InfoWars

Facebook is currently copping criticism from both sides of the US political spectrum for its decision to ban the InfoWars page – a far-right media organisation known for spreading conspiracy theories – from its platform. As of July this year, Facebook was holding the line that InfoWars’ content did not breach its community standards guidelines despite persistent evidence that the site was peddling fake news and hate speech. Last week, Facebook became the latest in a string of social platforms – starting with YouTube at the end of July – to ban the organisation.

Why does the ban matter?

Facebook is the world’s largest social network, with more than 2.2 billion active users a month, and wields an enormous amount of influence in the way that we view the world. Its attempts to better deal with fake news in the wake of Russian tampering in the US election and the UK Brexit referendum are evidence that it knows we’re watching and we’re not happy.

Understanding online community governance

We all have to follow the Facebook policy. No matter what our personal opinions are, we all have to follow this.

– Facebook moderator trainer, “Inside Facebook”

It’s easy to point the finger at a moderator when you disagree with their assessment of a comment or a post, but it is important to understand that moderation is used to enforce the standards of an online community. Moderation decisions are framed by an organisation’s community standards policy, and individual moderators rarely if ever have any say in formulating the overall policy.

Read more: Facebook turns to real people to fix its violent video problem

Community standards set the tone for the community that is built on them.

In fact, we all follow a range of community standards every day in our offline lives. Some of these are legal – defamation laws, anti-discrimination laws, and laws around hate speech, for example – and some are social codes, like saying please or thank you, or giving up our seats on public transport to those who need them.

Online communities are no different. Research has shown that the online space, particularly the large social platforms like Facebook, can reinforce problematic social hierarchies and prejudices around gender, sexuality and race.

It’s OK – everyone else says it

When they say “freedom of speech”, what they’re really saying is “we really want to permit people to do whatever they want on this platform and we will do the bare minimum to make that socially acceptable”.

– Roger McNamee, former mentor to Mark Zuckerberg, “Inside Facebook”

In 1974 Elisabeth Noelle-Neumann wrote about a phenomenon she described as the “spiral of silence”. Essentially, it amounts to the silencing of minority opinion for fear of reprisal.

Social media has turned Noelle-Neumann’s theory on its head, becoming a place where the “vocal minority” push out more restrained voices. They generate what Mark Zuckerberg’s former mentor Roger McNamee referred to in the recent documentary Inside Facebook: Secrets of a Social Network as “the crack cocaine of [Facebook]”, attracting “the most highly engaged people on the platform”. This content is also more likely to be shared, meaning that more people will see it, and more often.

Problematically, the more we are exposed to something, the more we are inclined to believe it’s true. This is what is known as the illusory truth effect.

By allowing groups that are, by any commonsense standard, hate groups – be they racial, sexual or gender-based – or pages such as InfoWars that demonstrably peddle fake news, to proliferate across their site, Facebook is in effect telling its users that those views are not only acceptable but normal. Which leads us to…

‘The standard you walk by, is the standard you accept’

In 2013, the then head of the Australian Army, Lieutenant-General David Morrison, sent an impassioned video message about the culture of sexual harassment in the Australian armed forces, and the need to do better. His assessment that “the standard you walk by, is the standard you accept” applies equally to Facebook.

Facebook and other social platforms can change their policies as a result of public pressure – and have done so, particularly where that pressure is highly visible and threatens their bottom line.

Read more: Why do we stay on Facebook? It's complicated

For example, in 2013 the activist group Women, Action and Media sent emails to more than 5,000 companies that advertise on Facebook. Their campaign prompted Nissan to withdraw spending on Facebook advertising until the platform could assure the company that its ads would not be placed near offensive content. Given that YouTube’s decision to axe InfoWars was largely driven by advertiser pressure, one can assume that this was front of mind for the social behemoth as well.

So we need to be, as Roger McNamee suggested in the documentary, “more focused, more persistent and [to] stop accepting [Facebook’s] excuses, stop accepting their assurances”.

Facebook claims to want to help build a better global community. But if it isn’t as interested in doing that in practice as it is in theory, then we’re going to need to apply some pressure.
