Ofcom Unveils Sweeping Online Safety Rules to Protect Children in the U.K.
LOS ANGELES (April 24, 2025) — The Association of Sites Advocating Child Protection (ASACP) wishes to notify its family of sponsors, members, stakeholders, and the broader internet community that the U.K.’s communications regulator, Ofcom, today finalized a groundbreaking set of safety measures requiring technology companies to take stronger steps to protect children online. ASACP continues to work with Ofcom on behalf of the industry and in the interest of protecting children through effective policy and technical measures.

The new rules, which will take effect this July, mandate that social media platforms, gaming services, and search engines used by children in the United Kingdom implement more than 40 safety requirements under the Online Safety Act. The regulations are designed to shield young users from dangerous online content, including material related to suicide, self-harm, eating disorders, pornography, and online abuse. Companies will also be obligated to address content that promotes misogyny, violence, and cyberbullying.

“These changes are a reset for children online,” said Dame Melanie Dawes, Ofcom’s Chief Executive. “They will lead to safer social media feeds, fewer encounters with harmful material, and tougher protections against unwanted contact. Our mission is to ensure a safer generation of children online — and we are ready to take enforcement action if companies fail to deliver.”

What’s Changing

The new Codes of Practice — shaped by input from more than 27,000 children and 13,000 parents, as well as civil society groups, child safety experts, and industry stakeholders — aim to embed a “safety-by-design” approach in digital services. Key measures include safer content feeds, robust age verification, rapid response to harmful content, greater user control, simplified reporting, and stronger accountability and oversight.

For example, tech companies must adjust their recommendation algorithms to filter harmful material out of children’s feeds if their platforms are deemed medium or high risk. High-risk services must also use advanced age assurance technology to distinguish children from adults, and platforms without strong age checks must assume that younger users are present and adjust their content accordingly.

All platforms must have systems in place to detect and respond swiftly to harmful content. They must also ensure that children can easily report harmful content and understand the platform’s terms of service in clear language. Children will receive new tools to manage their online experiences, such as muting users, blocking messages, turning off comments, and choosing who can invite them to group chats.

Every platform must designate a person responsible for managing its child safety protocols, and senior leadership must review risk management practices annually. These regulations add to existing obligations requiring services to prevent access to illegal content, including grooming and child sexual exploitation. They also reinforce specific mandates for adult websites to keep minors from viewing explicit content.

What’s Next

Digital services accessible to U.K. children must complete a child safety risk assessment by July 24, 2025, and begin implementing protective measures immediately thereafter. Full compliance with Ofcom’s Codes of Practice will be expected starting July 25, 2025. Companies that fail to comply could face significant penalties, including fines and, in extreme cases, legal action to block access to their platforms within the U.K.
Ofcom plans to continue consulting on additional safety requirements in the months ahead, further solidifying its long-term strategy to create a safer online environment for young users. More information on Ofcom’s regulations is available on the regulator’s website.

“ASACP supports thoughtful regulations that protect the innocence of youth while preserving the rights of adults,” said Executive Director Tim Henning. “Besides Ofcom’s requirements for the U.K., the association promotes establishing a global child safety standard that includes our free child protection resources for parents, Best Practices tailored to specific adult market segments, and our comprehensive Code of Ethics for all website and mobile app publishers.”

Among ASACP’s most notable achievements are its RTA (Restricted To Adults) meta-labeling system, which prevents children from accessing adult-oriented web pages and apps (a sample label appears below), and its CSAM Reporting Tipline, which has received and processed over 1.25 million reports since its inception and remains a vital global resource.

To learn more about how your business can help protect itself by protecting children, email [email protected].
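For publishers who wish to participate, the RTA system is intentionally simple: a site declares itself adult-oriented by adding a standardized meta tag to each restricted page, which parental control and filtering software can then detect and block. A minimal example of the label, as documented by ASACP at RTAlabel.org:

    <meta name="RATING" content="RTA-5042-1996-1400-1577-RTA" />

The same string can also be served as an HTTP response header (Rating: RTA-5042-1996-1400-1577-RTA) for sites that prefer server-level labeling; because the value is fixed and unique, filters can match it reliably regardless of how a page is generated.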
About ASACP

Founded in 1996, ASACP is a nonprofit organization dedicated to online child protection. ASACP comprises the Association of Sites Advocating Child Protection and the ASACP Foundation. ASACP is a 501(c)(4) social welfare organization that manages a membership program, providing resources to help companies protect minors online. The ASACP Foundation is a 501(c)(3) charitable organization responsible for the CSAM Reporting Tipline and the RTA (Restricted To Adults) website labeling system. ASACP has invested nearly 29 years in developing progressive programs to protect minors, and its assistance to the digital media industry’s child protection efforts is unparalleled. For more information, visit ASACP.org.
###