
Censorship through the millennia, and trying to locate it in the 21st century


Once upon a time we all knew what censorship was, who the good and bad guys were, and what could be done to make the world a better place. Look up the noun “censor” in the Oxford English Dictionary and you’ll find an outline of a much-told story under definition 2 (b):

An official in some countries whose duty it is to inspect all books, journals, dramatic pieces, etc, before publication, to secure that they shall contain nothing immoral, heretical, or offensive to the government.

Attributing the first instance of this usage to the English poet John Milton, the lexicographers illustrated it with a quotation from his anti-censorship pamphlet, Areopagitica (1644):

He (the author) … must appear in print like a punie (i.e. a new schoolboy) with his guardian, and his censors hand on the back of his title, to be his bayl and suretye that he is no idiot, or seducer.

Following Milton’s gendered rendering, the story went something like this: the censor was the bad guy (Milton’s “temporising and extemporising licencer” with his “cursory eyes”). The writer was the good guy (Milton’s “learned” champion of “free writing and free speaking”). And the plot involved the struggle of the latter against the former, not just in his own interests as a member of the “Republic of Letters”, but in the interests of creating a freer and more grown-up commonwealth for all.

True, the odds were stacked in favour of the all-powerful, infantilising state. Yet no matter how often the struggle played out, the outcome was assured: the seemingly puny champions of freedom and truth would prevail in the end.

There wasn’t much room for us so-called “ordinary readers” in all this. We were either the innocents the paternalistic-repressive state was supposedly trying to protect, or the voiceless fellow citizens on whose behalf the writers were supposedly fighting. But, if we wanted to make the world a better place, it was clear who we needed to support.

Messiness of history

For about three centuries, that is, for the greater part of what we could call the “age of print”, this story had some currency and even some plausibility. I’ll gloss over the messiness of the actual history, which all too often throws up inconvenient facts. It reveals, for example, censors who were not cursory or paranoid state bureaucrats but “learned men” in Milton’s sense, who believed they were making the world a better place.

And the canonical story hardly dated overnight. Even in the early years of the digital revolution, it looked like it had plenty of time to run.

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.

So wrote John Perry Barlow, former lyricist for the American rock band the Grateful Dead, in the opening of his 1996 “Declaration of the Independence of Cyberspace”, an Areopagitica for the digital age.

Barlow was being quixotic. Far from showing any signs of weariness, the old state giants were already gearing up to make the most of the opportunities the new technologies afforded for extending their sovereignty, whether repressively (think of China), defensively (think of the UK) or aggressively (think of Russia).

The complication was that the emerging tech giants of the post-industrial world were themselves poised to become the new disrupters in ways Barlow did not anticipate.

Over the course of the next decade the likes of Google, Facebook and Twitter — the “private superpowers” as historian and commentator Timothy Garton Ash dubs them — turned Barlow’s brave new cyber world into a vast profit-making machine effectively run on surveillance algorithms. At the same time they created the conditions for other actors, whether of the state (think of Donald Trump), allied to it (think of India’s social media vigilantes), or outside it (think of the worldwide population of trolls), to wield new forms of “temporising and extemporising” power.

Sometimes adding the threat of violence to the mix, these new enemies of free expression act like a novel breed of self-appointed censor, deforming, infantilising or closing down public debate at every opportunity.

Freedom to hold opinions

Opening the digital Pandora’s box may have spelt the end of the old story, but developments in international law had already introduced other complications well before the 1990s:

Everyone has the right to freedom of opinion and expression. This right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

In the aftermath of World War II and amid the gathering shadows of the Cold War, Article 19 of the Universal Declaration of Human Rights (1948) represented a major turning point in the long story. It marked the moment the battle-scarred “giants of flesh and steel” collectively agreed, if not to curb their powers, then at least to affirm freedom of expression as a shared ideal.

Only six years later, however, another key UN instrument, the International Covenant on Civil and Political Rights (drafted 1954, signed 1966), added some significant qualifications. The first came under its own Article 19(3), which covers the “rights and reputations of others” as well as “national security”, “public order” and “public health or morals”. Then in Article 20 it prohibited “propaganda for war” and “any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence”.

As the Oxford English Dictionary notes, the phrase “hate speech”, the origins of which it traces to a report about Hitler in the New York Syracuse Herald for 29 September 1938, now encompasses

hatred or intolerance, esp towards a particular social group on the basis of ethnicity, religious beliefs, sexuality, etc.

At the same time Article 15(3) of the companion Covenant on Economic, Social and Cultural Rights embellished the Universal Declaration. It specifically required states,

to respect the freedom indispensable for scientific research and creative activity.

Taken together, these legal, cultural and technological developments made the canonical story look less and less tenable in the new millennium. They have also reopened the most basic questions: What is censorship? If thinking in simple binaries still makes any sense, then who is on the side of the good and who the bad? And what can we ordinary citizen-netizens do to make the world a better place?
