Dec 17, 2024, 1:19 AM

Ofcom's safety code fails to address urgent suicide risks, advocates say

Highlights
  • Ofcom has introduced a new code for social media companies focusing on illegal online content as part of the Online Safety Act.
  • Child safety advocates expressed disappointment over the absence of measures addressing suicide and self-harm content.
  • The lack of targeted regulations raises concerns about ongoing risks to vulnerable individuals online and calls for immediate action.
Story

In November 2023, Ofcom, the UK communications regulator, unveiled a new code of practice requiring social media companies to take significant action against illegal and harmful online content. The code is part of the implementation of the Online Safety Act, which stakeholders hope will make digital platforms safer.

Child safety advocates, however, criticized the new rules for omitting specific measures targeting content related to suicide and self-harm. Andy Burrows, chief executive of The Molly Rose Foundation, established after teenager Molly Russell died by suicide linked to such content, voiced his organization's astonishment and disappointment, stressing that lives are at stake and that immediate, decisive action is needed.

Maria Neophytou, acting chief executive of the NSPCC, echoed these concerns. She warned that the current measures fail to adequately tackle serious categories of illegal online content, including child sexual abuse material, and fears that gaps in the rules could create loopholes allowing major platforms to evade regulation without facing enforcement action.

In response, Ofcom chief executive Dame Melanie Dawes urged tech firms to meet the established safety standards or face enforcement measures, including hefty fines or site restrictions. Technology Secretary Peter Kyle called the launch of the regulations a major step toward safer online environments. Although the rules cover various forms of online harm, including terrorism-related content and hate speech, critics question whether they go far enough to protect vulnerable people online and to prevent imminent threats to life.
