As the federal government prepares to introduce legislation to regulate content on social media, Canadians have reason to be concerned about the effectiveness of its approach and the poor example we are about to set for countries that don’t share our commitment to human rights.
Heritage Minister Steven Guilbeault has hinted in recent weeks that Canada’s forthcoming legislation will be modelled after Germany’s NetzDG law. The law allows social media companies to be fined up to 50 million euros for failing to remove what the legislation calls “obviously illegal” content from their sites within 24 hours of being notified.
The details of the government’s approach remain unknown since no meaningful public consultations were held about the development or drafting of this legislation. What we do know about the upcoming bill should concern all Canadians for at least two reasons.
‘Lawful but awful’
The first is that it won’t be effective in dealing with the bulk of the harmful content we find on the internet today.
Social media companies are not perfect at removing content that violates Canadian law, such as child sexual exploitation material or terrorist propaganda, but they’ve improved considerably in recent years. Where they struggle, however, is in dealing with “lawful but awful” content that is legal under the laws of most democracies, including Canada’s, but is known to create real-world harm.
Consider the vast amounts of pandemic-related misinformation on YouTube and Facebook, or the casually racist or misogynistic memes that populate many Instagram feeds.
The broad protections that Canada’s Charter of Rights and Freedoms provides for the right to free expression make it difficult for governments to ban such content outright, or even to restrict the expression of such harmful and distasteful ideas in public spaces. Correspondingly, a new law that seeks to penalize technology companies for failing to promptly remove illegal material will only scratch the surface of the problems with harmful content online.
More troubling, however, is the example that the forthcoming legislation will set for countries that don’t share our respect for human rights.
Authoritarian governments around the world are adopting social media laws that are similar to the one set to be unveiled here in Canada. Those laws impose draconian penalties on social media companies that fail to take down content that is illegal under national laws.
The problem, however, is that the laws in many authoritarian countries criminalize forms of expression that are protected under international human rights law, from voices dissenting against the regime in power to the cultural and religious expression of minority communities.
Pakistan provides a stark example of this trend. Last year, the country enacted a law strikingly similar to what Ottawa is considering, but in the context of a legal system where blasphemy can be punished by death and where it’s a crime to violate “religious, cultural or ethnic sensibilities.”
In Poland, the increasingly authoritarian government of Andrzej Duda introduced similar legislation in parliament last month, while Viktor Orbán’s administration in Hungary is also reported to be considering a similar measure.
Internet at risk
Canadians should be concerned about the enactment of such laws in faraway places not only because we value human rights, but because this type of legislation puts the future of a global internet at risk.
As governments seek to regulate the online sphere according to their own peculiar national laws — regardless of whether those laws comply with international human rights standards — there is a risk that the internet will splinter into a series of national networks. That has profound implications for all of us.
Against this bleak international backdrop, Canada needs to think carefully about its approach to regulating online harm. Rather than going it alone by enforcing national laws aimed at social media companies, Canada should work with other rights-respecting democracies to develop a multilateral approach to addressing harmful online content.
This is precisely what was done to deal with terrorist and violent extremist content online following the 2019 Christchurch massacre, when a coalition of governments led by New Zealand and France worked with industry and civil society stakeholders to devise the Christchurch Call to Action.
A multilateral approach grounded in the shared language of human rights can help keep the internet free and open while moderating its worst excesses. It will also deny authoritarians around the world the argument that what’s good for Canada is good for them too.