Business

Britain Unveils Pioneering Codes of Practice for Online Safety

New Safety Regime for Social Media Platforms

Starting today, Britain's online safety regime mandates social media companies, including Meta's Facebook and ByteDance's TikTok, to address criminal activities on their platforms and enhance user safety by design.

Ofcom's First Codes of Practice

The media regulator, Ofcom, has published its initial codes of practice aimed at combating illegal harms such as child sexual abuse and the encouragement of suicide. Platforms have until March 16, 2025, to evaluate the risks posed by illegal content to both children and adults.

Implementation of Safety Measures

After the deadline, these companies must begin implementing measures to mitigate those risks, including enhanced moderation, easier-to-use reporting systems, and built-in safety checks, according to Ofcom.

Strict Standards and Potential Penalties

Ofcom Chief Executive Melanie Dawes said the regulator would closely monitor the industry to ensure platforms meet the strict safety standards. The Online Safety Act, passed last year, sets higher standards for platforms such as Facebook, YouTube, and TikTok, with a strong focus on child protection and the removal of illegal content.

Under the new codes, reporting and complaint functions must be more accessible and user-friendly. High-risk providers are required to deploy automated tools, such as hash-matching and URL detection, to identify child sexual abuse material. For non-compliance, Ofcom can impose fines of up to £18 million or 10% of a company's annual global turnover, whichever is greater.

Britain's Technology Secretary Peter Kyle views the new codes as a significant advancement in online safety, with full support for Ofcom to use its powers, including fines and court-ordered site blockades, against non-compliant platforms.