
Human writes

The responsibilities of social media companies over free speech

Protests, like this one in Pakistan, have spread over the Muslim world in response to an incendiary video depicting Muhammad that was posted on YouTube. EPA/Rahat Dar

The global uproar over the YouTube trailer for Innocence of Muslims may have subsided, but the controversy lingers. Last week I wrote about the US's obligations regarding that video, and concluded that the US was under no international obligation to censor the movie. In fact, its own domestic Constitution forbids it from doing so.

The (non) blocking of access by YouTube

But what about YouTube’s obligations? YouTube, a subsidiary of Google, operates the platform upon which the video was placed. It has the right, if it wishes, to remove the video. Doing so would not breach US law, as YouTube is a private body and is not bound by the US Constitution. Indeed, the White House requested that Google remove the video, but Google has refused.

It is tempting to say that YouTube should take the video down, given the mayhem that the video has sparked, and the almost universal view that it is a grubby film that adds little of value to the sum of human experience. However, a take-down would shut the gate after the horse has bolted. Copies of the video have likely proliferated, including on sites beyond Google. In any case, I doubt many of the rioters have viewed the movie: many are simply angry at its very existence.

Google has blocked access to the video in some countries, including Indonesia and India, as the video’s content breaches the law of those States. It also chose to block the video in Egypt and Libya, due to the volatile situation in those countries. I will return to that decision below.

Censorship and social media companies

It is trite to note that the internet and social media are incredibly powerful forces. After all, social media played a key role in the overthrows of long-standing dictators in Tunisia and Egypt in early 2011. Now a YouTube video has sparked protests and riots across the globe, with the latter leading, tragically, to several deaths. It is fair to say that internet and social media companies have, perhaps unwittingly, come to exercise significant power over political and social developments.

The most popular and influential social media platforms are run by private companies, namely Google, Facebook and Twitter. What responsibilities should these companies have with regard to the content they host? In particular, should they exercise censorship powers beyond those mandated by the laws of the States they operate in? Should, for example, YouTube have had a “tougher” censorship policy and clamped down on Innocence of Muslims before it went uncontrollably viral?

A starting point for discussion is to examine the companies’ censorship policies. YouTube’s Community Guidelines prohibit hate speech, as well as other “bad stuff” like abuse of animals or instructions on bomb making. Its policy on violent videos was amended in 2007 after it was criticised for removing videos showing police abuse in Egypt. YouTube clearly embraced the potential for its site to highlight human rights abuses (which sometimes requires the presentation of violence).

Facebook’s Statement of Rights and Responsibilities provides that users cannot post content that “is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence”.

Both YouTube and Facebook have the unilateral right to remove content in accordance with their policies, and there is no avenue to appeal their decisions. After all, the sites are theirs.

Certain take-down decisions have attracted controversy. For example, Facebook eventually removed a site promoting a Third Intifada in Israel and the Occupied Territories after initially resisting requests to do so, as well as a site associated with the Syrian military. But what is Facebook’s criterion for “taking sides” regarding the political and revolutionary messages that might turn up on its site? Given the contemporary importance of Facebook pages in promoting political activism, Facebook must tread a fine line between allowing its platform to be used for the organisation of peaceful protests (which may easily contain comments that are far from peaceful) and pages that promote violence and hate. But how does Facebook determine its own political red lines?

Social media companies have enormous potential power over the true global extent of the enjoyment of free expression, which is not commensurate with their expertise or their accountability. After all, they are not traditional media companies well-versed in making editorial decisions on what to and what not to include in the limited space of a newspaper or a half hour broadcast. Rather, they are the hosts of unlimited amounts of information created and disseminated by others.

Given those realities, the best policy is probably that adopted by Twitter, which exercises less discretion over the removal of content. Instead, it will “withhold” tweets in accordance with valid legal orders. It will endeavour to inform the user concerned and publicise the fact of the withdrawal. It will also presumably still remove tweets which clearly breach its “content boundaries”, but these are significantly narrower than those of YouTube or Facebook.

Twitter has largely knocked the free speech ball back into the arena of States, which is probably where it should be. It, like YouTube and Facebook, lacks the credentials to fairly and consistently arbitrate free speech.

Innocence of Muslims: too incendiary to leave up?

Returning to Innocence of Muslims, YouTube has blocked the video in Egypt and Libya of its own accord, due to the “difficult” situations in those countries rather than any identified breach of its guidelines. I said in last week’s post that I was not “perturbed” by those actions. Without wishing to impugn Google’s humanitarian motives in taking such action, I now believe that I may have spoken too soon.

First of all, the upload could have been removed in accordance with YouTube’s normal policy by legal order in either Egypt or Libya. Perhaps surprisingly, no such order appears to have been made. Secondly, as noted above, YouTube’s measures were probably too late to make much of a difference to the protests and riots on the ground. Thirdly, YouTube risks setting a precedent whereby it rewards the unreasonable and unjustified reactions of a violent few by giving in to their demands. Such an action may simply legitimise more violent reactions the next time incendiary content is placed online. Free speech must not be held hostage in that way.
