A video purporting to show a suicide is reportedly circulating on TikTok, reigniting debate about content moderation on social media. Collaborating with competitors may be the key to tackling it.
The platform also took down another 2,000 communities, including left-leaning groups. The move comes just months ahead of the 2020 US presidential election.
The order requires Facebook, Twitter and Google to remove certain content globally, based on it being defamatory under India's local law.
Mark Zuckerberg may try to minimise their concerns, but Facebook moderators and other online workers are beginning to organise for their own protection.
Australia's latest defamation ruling has made publishing on Facebook a minefield, but there are strategies to ensure better social media outcomes for everyone.
The Facebook boss's calls for outside help to draft new rules on what is acceptable behaviour online should be welcomed. So what's his next step?
Taking effective action against online sharing of graphic content isn't straightforward. But, yet again, the government's inclination seems to be to legislate first and discuss later.
Children can't handle watching livestreamed massacres – and adults shouldn't have to.
It's time for social media platforms to be more open about how livestreaming works, how it is moderated, and what should happen if or when the rules break down.
The news that a former moderator is suing Facebook over unsafe work practices suggests it's time we finally took the mental health of moderators seriously.
Research has shown that large social platforms like Facebook can reinforce problematic social hierarchies and prejudices around gender, sexuality and race.
Comments like 'little girl needs to keep to herself before daddy breaks her face' get a free pass in the name of free speech.
Facebook wants to stop violent videos appearing in its feeds, but we must ensure human moderators don't suffer.