Video streaming platform Rumble promotes itself as anti-censorship. (Shutterstock)

Meet Rumble, Canada’s new ‘free speech’ platform — and its impact on the fight against online misinformation

On June 23, the Canadian government tabled a bill that would expand existing hate speech policy to better address online hate speech.

Over the coming weeks, the government will consult with the public on devising a regulatory framework around how to make social media platforms more transparent and accountable with their content moderation practices. But the growing popularity of Rumble, a Canadian video-sharing platform championing online free speech, foreshadows a tumultuous road to implementation.

Canada’s attorney general and minister of justice, David Lametti, announces Bill C-36 to address online hate speech.

Rumble is a relatively small video-sharing platform (compared to YouTube) that gained over 30 million monthly users almost overnight. Its CEO, Chris Pavlovski of Brampton, Ont., has credited much of the platform’s success to a surge in interest among American conservatives frustrated by big tech’s crackdown on hate speech and misinformation.

Rumble’s lenient approach towards harmful and fallacious speech has attracted the likes of Dan Bongino, the popular American right-wing political pundit known for spewing COVID-19 and 2020 presidential election conspiracy theories.

Another right-wing American political commentator, Dinesh D'Souza, has also acquired a large audience on the platform. D'Souza’s political commentary includes racist and conspiracy-driven content, such as attributing the 9/11 attacks to “the cultural left.”

Recommended misinformation

Rumble doesn’t just accept harmful content; it amplifies it. Journalists reviewing the platform’s recommendation system found that a person searching the term “vaccine” on Rumble would be three times more likely to be recommended videos containing COVID-19 misinformation than accurate information.

Rumble launched in 2013 with the goal of creating a video-sharing platform that could compete with YouTube. Pavlovski’s goal was initially too risky for investors, but the growing animosity towards big tech sparked new interest in the alternative streaming service.

Pavlovski recently received investments from Silicon Valley’s leading conservative billionaires, including author J.D. Vance, venture capitalist Peter Thiel and former Trump adviser Darren Blanton.

Pavlovski said that he never intended for Rumble to become a conservative hotspot. Rather, Rumble was developed for “dedicated content creators who are being stifled elsewhere.”

Content without censorship

Pavlovski has described Rumble as different from YouTube and Facebook because it uses far fewer algorithms for recommending and reviewing content. It’s promoted as a streaming service where creators can gain exposure without fear of suppression or censorship.

On Rumble, videos are displayed to users in chronological order based on who they follow on the platform. Algorithms are not used to filter high-risk videos; that content is subject to human review. Algorithms are mainly involved when “trying to figure out which videos are viral and which videos we need to put humans on to look at to distribute,” Pavlovski explains.
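Rumble has not published the details of how its feed is assembled, but as a rough illustration only, the sketch below (in Python, with hypothetical data structures that are not Rumble’s actual schema) contrasts the follow-based chronological feed Pavlovski describes with the engagement-weighted ranking typical of larger platforms:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Set

# Hypothetical data model for illustration -- not Rumble's actual schema.
@dataclass
class Video:
    creator: str
    title: str
    uploaded_at: datetime
    engagement_score: float  # e.g. watch time, likes, shares


def chronological_feed(videos: List[Video], follows: Set[str]) -> List[Video]:
    """Follow-based, reverse-chronological feed: no ranking model,
    just the newest uploads from creators the user follows."""
    followed = [v for v in videos if v.creator in follows]
    return sorted(followed, key=lambda v: v.uploaded_at, reverse=True)


def engagement_ranked_feed(videos: List[Video]) -> List[Video]:
    """Engagement-weighted ranking of the kind larger platforms use,
    which can surface content from creators a user never followed."""
    return sorted(videos, key=lambda v: v.engagement_score, reverse=True)
```

The difference matters for moderation: in the first model, a video’s reach is largely bounded by a creator’s existing followers, while in the second, a ranking system decides what gets amplified to everyone.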

Rumble also has a more lenient approach to content moderation. Users are prohibited from posting videos that include illegal activities such as pornography, child exploitation or harassment. However, videos claiming election fraud and coronavirus conspiracies remain permissible on the streaming platform.

Pavlovski has remarked that “Rumble will never censor political discussion, opinion or act like the arbiters of truth.”

Neutral platforms

Pavlovski’s criticism of big tech’s reliance on AI to amplify certain content over others is not unfounded. National governments are increasingly concerned by how the discoverability of content impacts their local creators.

In addition, new research indicates that deplatforming malicious content creators can lead them to alternative platforms that are more difficult to control. However, Pavlovski’s claims should be taken with a grain of salt.

Tarleton Gillespie, a principal researcher at Microsoft Research New England, argues that the notion of neutrality advocated by platforms is a mere distraction. The same can be said of Rumble.

“My goal is to keep it as fair as possible. We’re not interested in taking any position on any type of content, we just want to be a platform, and I believe that’s why we’ve seen so much growth,” Pavlovski said in an interview with FOX Business, adding that the company has “stuck to our core policies we started with in 2013.”

The streaming service may list its content in chronological order, but it is not a mere conduit for information. Rumble still tags, categorizes and sorts content; selects which content is trending or viral; and determines which content to license and distribute. These practices are integral to the service’s business model.

Marketing Rumble as a champion of free speech is strategic because it helps the video-sharing platform evade liability for the content appearing on its website. However, a platform that promotes content via algorithmic manipulation and/or receives notice of unlawful content is arguably a “publisher” according to Canadian common law and, therefore, liable for harmful content appearing on its platform.

To combat online misinformation, platforms like Rumble, YouTube and Facebook must be more transparent about how their algorithms organize and promote content. Disclosing substantial information about their content moderation practices is key to enabling accountability, public trust and democratic deliberation.

Whether the Liberals will be successful in implementing the new bill and enforcing transparency among platforms remains to be seen. In the meantime, “free speech” platforms like Rumble will continue to attract users frustrated by the content moderation practices of incumbent platforms.
