Here’s how Australia can act to target racist behaviour online

Racists take advantage of social media algorithms to find people with similar beliefs. from www.shutterstock.com

Although racism online feels like an insurmountable problem, there are legal and civil actions we can take right now in Australia to address it.

Racism expressed on social media sites provided by Facebook and the Alphabet stable (which includes Google and YouTube) ranges from advocacy of white power and support for the extermination of Jews, to calls for political action against Muslim citizens because of their faith. Increasingly it occurs within the now “private” pages of groups that “like” racism.

The Simon Wiesenthal Center 2017 Digital Terrorism and Hate Report card. Simon Wiesenthal Center

At the heart of the problem is the clash between the commercial goals of social media companies (built around creating communities, building audiences, and publishing and curating content to sell to advertisers) and the ethical responsibilities those companies ascribe to themselves towards their users.

Although some platforms show growing awareness of the need to respond more quickly to complaints, automating that response remains a very slow process.

Australia should focus on laws that protect internet users from overt hate, and civil actions to help balance out power relationships.


Read more: Tech companies can distinguish between free speech and hate speech


Three actions on the legal front

First, at the global level, Australia could withdraw its reservation to Article 4 of the International Convention to Eliminate All Forms of Racial Discrimination. Such a move has been flagged in the past, but stymied by opposition from an alliance of free speech and social conservative activists and politicians.

The convention is a global agreement to outlaw racism and racial discrimination, and Article 4 commits signatories to criminalise race hate speech. Australia’s reservation reflected successive conservative governments’ reluctance to use the criminal law, a stance echoed in the 2016–17 civil law debate over section 18C of the Racial Discrimination Act.

New data released by the eSafety Commissioner showed young people are subjected to extensive online hate. Amongst other findings, 53% of young Muslims said they had faced harmful online content; Indigenous people and asylum seekers were also frequent targets of online hate. Perhaps this could lead governments and opposition parties to a common cause.


Read more: Australians believe 18C protections should stay


Second, while Australian law has adopted the European Convention on Cyber Crime, it could move further and adopt the additional protocol. This outlaws racial vilification and the advocacy of xenophobia and racism.

The impact of these international agreements would be to make serious cases of racial vilification online criminal acts in Australia, and to make executives of platforms that refuse to remove such material personally criminally liable. This situation has emerged in Germany, where Facebook executives have been threatened with the use of such laws. Mark Zuckerberg visited Germany to pledge opposition to anti-immigrant vilification in 2016.

Finally, Australia could adopt a version of New Zealand’s approach to harmful digital communication. Here, platforms are held ultimately accountable for the publication of online content that seriously offends, and users can challenge the failure of platforms to take down offensive material in the realm of race hate. Currently complaints via the Australian Human Rights Commission do elicit informal cooperation in some cases, but citizen rights are limited.

Taken together, these elements would mark out to providers and users of internet services that there is a shared responsibility for reasonable civility.

Digital platforms can allow racist behaviour to be anonymous. from www.shutterstock.com

Civil strategies

In addition to legal avenues, civil initiatives can empower those who are the targets of hate speech, and disempower those who are the perpetrators of race hate.

People who are targeted by racists need support and affirmation. This approach underpins the eSafety Commissioner’s development of a Young and Safe portal, which offers stories and scenarios designed to build confidence and grow skills in young people. The portal is being extended to address the concerns of women and children, racism, and other forms of bullying.

The Online Hate Prevention Institute (OHPI) has become a reservoir of insights and capacities to identify and pursue perpetrators. As proposed by OHPI, a CyberLine could be created for tipping and reporting race hate speech online, for follow up and possible legal action. Such a hotline would also serve as a discussion portal on what racism looks like and what responses are appropriate.

Anti-racism workshops (some already run by the eSafety Commissioner) aim to push back against hate, and to build structures where people can come together online. Modelling and disseminating best practice against race hate speech offers resources to wider communities that can then be replicated elsewhere.

The Point magazine (an online youth-centred publication for the government agency Multicultural New South Wales) reported two major events where governments sponsored industry/community collaboration to find ways forward against cyber racism.

What makes a diverse Australia?

The growth of online racism marks the struggle between a dark and destructive social movement that wishes to suppress or minimise the recognition of cultural differences, confronted by an emergent social movement that treasures cultural differences and egalitarian outcomes in education and wider society.

Advocacy organisations can play a critical role in advancing an agenda of civility and responsibility through the state, the economy and civil society. The social movements of inclusion will ultimately put pressure on the state and the economy to ensure the major platforms do in fact accept full responsibility for the consequences of their actions. If a platform refuses to publish hate speech, or removes it on receiving valid complaints, such views remain a private matter for the individuals who hold them rather than a corrosive force undermining civil society.

We need to rebalance the equation between civil society, government and the internet industry, so that when the public confronts the industry and demands answers, we will begin to see responsibility emerge.

Governments also need to see their role as more strongly ensuring a balance between the right to civil discourse and the profitability of platforms. Currently the Australian government seems not to accept that it has such a role, even though a number of states have begun to act.


The Cyber Racism and Community Resilience Project (CRaCR) explores why cyber racism has grown in Australia and globally, and what concerned communities have done, and can do, about it. This article summarises the recommendations CRaCR made to industry partners.