
Logan Paul’s dead body video reinforces call for better child protection online

YouTube vlogger Logan Paul posted and then removed a video depicting an apparent suicide victim. Shutterstock

Logan Paul, a vlogger who has millions of subscribers, recently posted a video on YouTube depicting a man’s dead body found at a well-known suicide site in Japan. He later removed the demonetised video following a clamour of criticism.

It’s the latest in a long line of posts carrying disturbing images that routinely appear on social network sites, often for days, before being taken down in response to a public outcry.

The video remained on Paul’s YouTube channel for several hours and attracted more than 6.3m views. After complaints piled up, the vlogger – and not YouTube – removed the content from his channel and then apologised in a monetised video, which has had nearly 36m views to date.

The actions of Paul, whose online antics are popular among young people, bring into sharp focus questions about what can be done to protect children from such material online. The episode highlights the need for a robust and enforceable code requiring online services to design sites and apps in a way that is appropriate for different age groups.

Evidence is mounting about the harmful effect of social networking sites on the well-being of children, including sleeplessness, obesity, compulsive use, and vulnerability to advertising.

It’s in this climate that Facebook has announced its Messenger Kids app – a stripped down version of the company’s instant messaging service for children under 13 that is controlled by their parents.

But current strategies, such as parental controls or confiscating phones, are less effective than providing a broad framework of knowledge and competencies. Research shows that digitally literate children take more risks but come to less harm. When asked about their online experiences, children and young people often report that their parents and teachers aren’t equipped to provide the guidance and technological support needed for adequate control, such as network-level internet filtering or switching off website tracking.

It’s clear that digital services should adapt to the needs and capacities of people of different ages. Facebook’s effort to bring in a product purely for children is laudable, if executed in the correct way. But YouTube Kids – which at launch was billed by Google as YouTube “reimagined for families” with safety in mind – recently highlighted the pitfalls of marketing a service to children that can be easily exploited.

The kids aren’t alright

Social media firms use personalisation software to present content to the user in a relevant and engaging way. However, there is little transparency about the criteria used to rank information, or about the consequences the content could have for users – especially when violent and grotesque posts, tagged and disguised as child friendly, are systematically populating sites targeted at children.
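To make that weakness concrete, here is a minimal, purely illustrative sketch in Python of how a child-facing ranking pipeline could gate content on an independently verified age rating rather than on uploader-declared tags. The class, its field names and the “all_ages” label are our own assumptions for illustration, not any platform’s actual system:

```python
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class Post:
    post_id: str
    engagement_score: float         # the platform's relevance/engagement signal
    declared_tags: Set[str]         # self-declared by the uploader; easily gamed
    verified_age_rating: Optional[str] = None  # set only after independent review

def rank_for_child(posts: List[Post], limit: int = 20) -> List[Post]:
    """Rank posts for a child's account.

    Only posts whose age rating has been independently verified are
    eligible; uploader-declared "child friendly" tags are ignored,
    because they can be spoofed (the weakness described above).
    """
    eligible = [p for p in posts if p.verified_age_rating == "all_ages"]
    return sorted(eligible, key=lambda p: p.engagement_score, reverse=True)[:limit]
```

The point of the design is that the easily spoofed declared_tags field plays no part in eligibility; only a rating assigned after review does.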

Facebook and Google argue that – because they don’t alter posts, but instead reshape the way they are presented to the user – they aren’t media companies. The E-Commerce Directive (Article 15) states that providers that act as a “mere conduit”, “caching” or “hosting” service are not obliged “to monitor the information they transmit or store”.

Research shows that parents are too often unequipped to apply adequate online controls. Shutterstock

Because EU law fails to impose a monitoring obligation on such providers, a responsible research and innovation (RRI) approach to the design, implementation and use of personalisation algorithms becomes all the more essential. We suggest that social media platforms should take editorial responsibility and adopt a code of ethics to promote corporate responsibility. The UK’s Digital Economy Act 2017 makes provision for a code of practice for such providers, but it’s not clear whether there is enough appetite in Westminster and beyond to bring stringent guidelines to life.

Major social media networks are designed by behavioural psychologists and technologists to create an online environment that maximises user engagement, including through dopamine-driven feedback loops. These strategies, such as “likes”, can have a detrimental influence on childhood development, self-regulation, emotional well-being and relationships.

What can be done?

We have worked with 5Rights (a campaign group founded by the crossbench peer Baroness Beeban Kidron) to call for more accessible terms of service on websites, clearer information about how personal data is shared and tracked, and greater controls allowing users to delete their own content.

Late last year, 5Rights reviewed the UK government’s Internet Safety Strategy. The review makes 36 policy recommendations based on the developmental needs of children from infancy to maturity, concluding that industry and government must adapt the online environment to be fit for children by acting beyond commercial considerations. The majority of six-to-nine-year-olds, for example, are unaware that their web use is tracked for ad targeting.

The World Health Organisation recently classified gaming disorder as a mental health condition. Because of a child’s in-built need to seek rewards, the priming that precedes gaming, betting and gambling has a powerful effect, and a child’s ability to know when to stop and turn off a device may be compromised. Age-appropriate web and app design should anticipate independent use by children and include time-outs, easy exits and the ability to switch off auto-play and other excessive gamification techniques.

Baroness Kidron put forward an amendment to the Data Protection Bill for technology companies to be subject to “minimum standards of age-appropriate design”, such as hiding GPS locations, preventing data from being widely shared and ensuring high privacy settings are switched on by default when a user is under 16 years of age.
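As a rough sketch of what such defaults might look like in practice, the following Python fragment applies the amendment’s examples (and the auto-play switch-off discussed above) to accounts belonging to under-16s. The settings object, its field names, and the adult defaults are our own illustrative assumptions, not a real implementation:

```python
from dataclasses import dataclass

HIGH_PRIVACY_AGE_THRESHOLD = 16  # the age threshold proposed in the amendment

@dataclass
class PrivacySettings:
    share_gps_location: bool
    share_data_with_third_parties: bool
    public_profile: bool
    autoplay_enabled: bool

def default_settings_for(age: int) -> PrivacySettings:
    """Return default privacy settings for a new account.

    Users under 16 get the most protective configuration switched on
    automatically, as the amendment proposes; older users start with
    more permissive defaults they can change themselves.
    """
    if age < HIGH_PRIVACY_AGE_THRESHOLD:
        return PrivacySettings(
            share_gps_location=False,             # hide GPS locations
            share_data_with_third_parties=False,  # prevent wide data sharing
            public_profile=False,                 # high privacy on by default
            autoplay_enabled=False,               # curb engagement-maximising features
        )
    return PrivacySettings(
        share_gps_location=False,
        share_data_with_third_parties=True,
        public_profile=True,
        autoplay_enabled=True,
    )
```

The essential design choice is that protection is the default state rather than an option a child (or a distracted parent) must discover and enable.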

The Logan Paul debacle illustrates the present dangers children are exposed to online, but if parliament agrees to tougher regulations, we may yet see a brighter future in the digital world.
