In a recent statement, UK Science Secretary Peter Kyle emphasized that social media platforms, particularly Facebook (operated by Meta) and X (formerly Twitter), are obliged to adhere to UK law. His remarks follow Meta's controversial decision to change its fact-checking policies, a move that suggests moderation of harmful material may be scaled back. Kyle said that while these companies may implement changes aimed primarily at American users, they must comply with British law when operating in the United Kingdom, particularly with respect to illegal content.
Mark Zuckerberg, Meta's CEO, announced earlier this week that the changes to its content moderation rules mean the company will "catch less bad stuff." The change, confined to the United States, is also expected to reduce the number of innocent posts unjustly removed. In a BBC interview with Laura Kuenssberg, Kyle made clear that such decisions reflect Meta's stance towards its American audience, but that the company must still abide by UK legal frameworks requiring the removal of illegal online content.
The urgency of addressing harmful online content is underscored by Ian Russell, whose daughter Molly took her own life at the age of 14 after encountering harmful content on social media. Russell urged the Prime Minister to introduce stricter internet safety regulations, expressing concern that the UK's stance on online safety is regressing. He criticized the approach taken by Zuckerberg and Elon Musk, who heads X, arguing that these tech leaders are moving towards a laissez-faire model that neglects user safety, particularly for vulnerable young people.
Despite these concerns, a spokesperson for Meta insisted there would be no change in how the company handles content related to suicide, self-injury, and eating disorders, emphasizing the continued use of automated systems to identify and act on high-severity content. Critics, however, argue that the UK's current internet safety laws are insufficient, especially on live streaming and the promotion of self-harm or suicide-related content.
Peter Kyle acknowledged the shortcomings of the UK's online safety laws, calling them "very uneven" and "unsatisfactory." The Online Safety Act, passed in 2023, was designed to strengthen protections against harmful content, but it faced significant backlash. Provisions that would have compelled social media companies to remove certain "legal-but-harmful" content, such as posts that could encourage eating disorders, were scrapped after critics raised concerns about censorship. The act ultimately shifted its focus to requiring platforms to give users the means to filter out unwanted content, with particular emphasis on protecting children from harmful material.
Kyle voiced his frustration with the amendments to the legislation but stopped short of saying whether he would press to reinstate the scrapped measures. He noted, however, that the act contains powers that could be used to address emerging safety issues, and indicated that measures would soon be introduced to ensure online platforms deliver age-appropriate content.
Companies that fail to comply with these legal requirements face "very strident" penalties, according to Kyle. He stressed the importance of legislative agility in keeping laws apace with evolving technologies and expressed openness to new legal frameworks that could raise online safety standards. As the digital landscape continues to evolve rapidly, effective regulation and safeguarding measures become ever more critical to protecting users, especially minors, from harmful online content.