New EU rules require social media platforms to take down flagged posts within 24 hours – and modelling shows that’s fast enough to have a dramatic effect on the spread of harmful content.
Women need better protection from online hate and misogyny, both while using social media and when working for technology companies.
A key piece of federal law, Section 230, has been credited with fostering the internet – and blamed for allowing misinformation and hate speech to flourish. Here’s how it could be reformed.
The turmoil at Twitter has many people turning to an alternative, Mastodon. The social media platform does a lot of what Twitter and Facebook do, but there are key differences.
Moderating content is a box that still needs to be ticked.
Platforms have started a silent censorship war through this opaque (and often harmful) approach to content moderation.
Elon Musk said he wants to make Twitter a platform for free speech. Here is what research shows about claims of political bias and excessive moderation.
The age of the free speech free-for-all is over – but public online spaces are possible.
Musk has long touted Twitter’s potential as an open and inclusive ‘town square’ for public discourse – but the reality is social media platforms were never meant to fulfil this role.
Elon Musk’s attempt to take over Twitter uses free speech as the motivation, but research shows that unregulated online spaces result in increased harassment for marginalized users.
We found LGBTIQ+ groups are exposed to an unacceptable level of discrimination and intimidation, including death threats, targeting of Muslims, and threats of stoning or beheading.
The social media giant’s third-party review panel upheld Facebook’s ban on Donald Trump. A corporate governance expert explains why Facebook created the Oversight Board.
Imagine if Facebook’s content were hosted on a blockchain — across many thousands of ordinary computers — and governed equally by each of them, rather than by Mark Zuckerberg.
Facebook’s choice of profits over people is difficult to reconcile with its commitment to free speech.
Deplatformed groups can all too easily flock to alternative platforms to coordinate.
Clubhouse offers a rare experience of spontaneity and intimacy online. But as the new social platform grows, it may face problems of moderation and abuse.
New research suggests tech firms need to improve how they detect abuse in response to the evolving use of coded language.
A video purporting to show a suicide is reportedly circulating on TikTok, reigniting debate about content moderation on social media. Collaborating with competitors may be the key.
The platform also took down another 2,000 communities, including left-leaning groups. The move comes just months ahead of the 2020 US presidential election.
The order requires Facebook, Twitter and Google to remove certain content globally, based on it being defamatory under India’s local law.