We found LGBTIQ+ groups are exposed to unacceptable levels of discrimination and intimidation, including death threats, the targeting of Muslims, and threats of stoning or beheading.
The social media giant’s third-party review panel upheld Facebook’s ban on Donald Trump. A corporate governance expert explains why Facebook created the Oversight Board.
Imagine if Facebook’s content were hosted on a blockchain, spread across many thousands of ordinary computers and governed equally by each of them, rather than by Mark Zuckerberg.
Facebook’s choice of profits over people is difficult to reconcile with its commitment to free speech.
Deplatformed groups can all too easily flock to alternative platforms to coordinate.
Clubhouse offers a rare experience of spontaneity and intimacy online. But as the new social platform grows, it may face problems of moderation and abuse.
New research suggests tech firms need to improve how they detect abuse, as abusive users increasingly turn to coded language.
A video purporting to show a suicide is reportedly circulating on TikTok, reigniting debate about content moderation on social media. Collaborating with competitors may be the key.
The platform also took down 2,000 other communities, including left-leaning groups. The move comes just months ahead of the 2020 US presidential election.
The order requires Facebook, Twitter and Google to remove certain content globally, on the grounds that it is defamatory under India’s local law.
Mark Zuckerberg may try to minimise their concerns, but Facebook moderators and other online workers are beginning to organise for their own protection.
Australia’s latest defamation ruling has made publishing on Facebook a minefield, but there are strategies to ensure better social media outcomes for everyone.
The Facebook boss’s calls for outside help to draft new rules on what is acceptable behaviour online should be welcomed. So what’s his next step?
Taking effective action against online sharing of graphic content isn’t straightforward. But, yet again, the government’s inclination seems to be to legislate first and discuss later.
Children can’t handle watching livestreamed massacres, and adults shouldn’t have to.
It’s time for social media platforms to be more open about how livestreaming works, how it is moderated, and what should happen if or when the rules break down.
The news that a former moderator is suing Facebook over unsafe work practices suggests it’s time we finally took the mental health of moderators seriously.
Research has shown that large social platforms like Facebook can reinforce problematic social hierarchies and prejudices around gender, sexuality and race.
Comments like ‘little girl needs to keep to herself before daddy breaks her face’ get a free pass in the name of free speech.
Facebook wants to stop violent videos from appearing in its feeds, but we must ensure human moderators don’t suffer.