    Ofcom Finalizes Online Child Safety Rules to Protect UK’s Youngest Internet Users

    April 24, 2025

The United Kingdom’s communications regulator, Ofcom, has finalized a comprehensive set of child safety rules under the Online Safety Act, ushering in what it calls a “reset” for how children experience the internet.

    Announced Thursday, the new regulations require over 40 practical safeguards for apps, websites, and online platforms accessed by children in the UK. These range from filtering harmful content in social feeds to robust age checks and stronger governance requirements. The measures apply to platforms in social media, gaming, and search—any online service likely to be accessed by children under 18.

    “These changes are a reset for children online,” said Dame Melanie Dawes, Ofcom’s Chief Executive. “They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. If companies fail to act they will face enforcement.”

    The finalized Codes of Practice are the product of consultations with over 27,000 children, 13,000 parents, civil society organizations, child protection experts, and tech companies. The rules will be enforceable from July 25, 2025.

    Algorithmic Filters, Age Assurance, and Governance

A key focus of the reforms targets personalized recommendation algorithms, often the pathway through which children are exposed to harmful content. Under the new rules, platforms that use recommender systems and are assessed as posing a medium or high risk to children must filter harmful material out of children’s feeds.
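
To make the requirement concrete, here is a minimal sketch, in Python, of how a recommender pipeline might drop flagged material before assembling a child’s feed. The category labels, data model, and classifier output are hypothetical illustrations, not anything prescribed by Ofcom or used by a specific platform.

```python
# Minimal sketch: exclude content labelled as harmful before ranking a child's feed.
# HARMFUL_CATEGORIES and the Item model are hypothetical placeholders.
from dataclasses import dataclass

HARMFUL_CATEGORIES = {"self_harm", "suicide", "eating_disorder", "pornography"}

@dataclass
class Item:
    item_id: str
    score: float          # relevance score produced by the recommender
    categories: set[str]  # labels assigned by upstream content classifiers

def build_child_feed(candidates: list[Item], limit: int = 20) -> list[Item]:
    """Rank candidates by score, dropping anything carrying a harmful label."""
    safe = [item for item in candidates if not (item.categories & HARMFUL_CATEGORIES)]
    return sorted(safe, key=lambda item: item.score, reverse=True)[:limit]

if __name__ == "__main__":
    feed = build_child_feed([
        Item("a1", 0.92, {"sports"}),
        Item("a2", 0.97, {"self_harm"}),  # excluded despite the highest score
        Item("a3", 0.80, {"music"}),
    ])
    print([item.item_id for item in feed])  # ['a1', 'a3']
```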

The rules also impose mandatory age assurance on the highest-risk services. Platforms must verify users’ ages with a high degree of accuracy; if they cannot, they must assume children are present and provide an age-appropriate experience. In some cases, this may mean blocking children’s access to certain content, features, or services entirely.
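
The default-to-child logic can be expressed compactly. The sketch below assumes a hypothetical age-assurance provider that returns an estimated age and a confidence value; the numeric threshold is purely illustrative, standing in for the rules’ qualitative requirement of a high degree of accuracy.

```python
# Minimal sketch of the "assume children are present" default.
# The confidence threshold and provider output shape are illustrative assumptions.
from typing import Optional

HIGH_CONFIDENCE = 0.9  # illustrative, not a figure from Ofcom's codes

def effective_audience(estimated_age: Optional[int], confidence: float) -> str:
    """Return 'adult' only when 18+ is verified with high confidence;
    otherwise serve the age-appropriate (child) experience."""
    if estimated_age is not None and estimated_age >= 18 and confidence >= HIGH_CONFIDENCE:
        return "adult"
    return "child"

print(effective_audience(25, 0.95))   # adult
print(effective_audience(25, 0.60))   # child: verification not confident enough
print(effective_audience(None, 0.0))  # child: no age signal at all
```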

    In addition, all providers must maintain fast-action processes to quickly assess and remove harmful material once identified.

    “These reforms prioritize safety-by-design,” said a UK-based child safety policy expert. “The burden is finally shifting onto platforms to proactively assess and mitigate risks, rather than waiting for harm to happen.”

Child Safety Rules: More Control, Better Support for Children

Beyond content moderation, the rules require platforms to give children more control over their online environment. Required features include the following (illustrated with a brief sketch after the list):

    • The ability to decline group chat invites.

    • Tools to block or mute accounts.

    • The option to disable comments on their own posts.

    • Mechanisms to flag content they do not wish to see.
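
For illustration, these controls could be modelled as a small per-user settings object. Everything here, names included, is a hypothetical sketch rather than any platform’s actual API.

```python
# Hypothetical sketch of the per-user controls the codes require.
from dataclasses import dataclass, field

@dataclass
class ChildSafetyControls:
    blocked_accounts: set[str] = field(default_factory=set)
    muted_accounts: set[str] = field(default_factory=set)
    comments_disabled: bool = False

    def decline_group_invite(self, invite_id: str) -> str:
        # Children must be able to refuse group chat invitations.
        return f"invite {invite_id} declined"

    def block(self, account_id: str) -> None:
        self.blocked_accounts.add(account_id)

    def mute(self, account_id: str) -> None:
        self.muted_accounts.add(account_id)

    def flag_unwanted_content(self, content_id: str) -> dict:
        # Feedback signal: "I do not want to see content like this."
        return {"content_id": content_id, "action": "hide_similar"}

controls = ChildSafetyControls()
controls.block("spam_account_42")
controls.comments_disabled = True
print(controls.flag_unwanted_content("post_99"))
```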

Services must also provide supportive information to children who search for or encounter harmful material, including on topics such as self-harm, suicide, and eating disorders.

    Clear and accessible reporting and complaint tools are also mandatory. Ofcom requires platforms to ensure their terms of service are understandable to children and that complaints receive timely, meaningful responses.

    Accountability at the Top

    A standout requirement under the new framework is “strong governance.” Every platform must designate a named individual responsible for children’s safety, and senior leadership must annually review risk management practices related to child users.

“These aren’t just tech tweaks. This is a cultural shift in corporate responsibility,” said the child safety policy expert. “They [Ofcom] are holding leadership accountable for keeping children safe.”

    Also read: Australia Gives Online Industry Ultimatum to Protect Children from Age-Explicit Harmful Content

    Enforcement, Deadlines, and What’s Next

Tech firms have until July 24, 2025, to finalize risk assessments for services accessed by UK children. From July 25, 2025, they must implement the measures outlined in Ofcom’s Codes—or demonstrate alternative approaches that meet the same safety standards.

    Ofcom has the authority to issue fines or apply to the courts to block access to non-compliant sites in the UK.

    The child safety measures build upon earlier rules introduced under the Online Safety Act to prevent illegal harms, such as grooming and exposure to child sexual abuse material (CSAM). They also complement new age verification requirements for pornography websites.

    More regulations are expected soon. Ofcom plans to launch a follow-up consultation on:

    • Banning accounts found to have shared CSAM.

    • Crisis response protocols for real-time harms.

    • AI tools to detect grooming and illegal content.

• Hash matching to prevent the spread of non-consensual intimate imagery and terrorist material (a brief sketch of the technique follows this list).

    • Tighter controls around livestreaming, which presents unique risks for children.
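
Hash matching, mentioned above, works by comparing a digest of each uploaded file against a database of digests of previously identified illegal material. Production systems typically use perceptual hashes so that resized or re-encoded copies still match; the sketch below uses plain SHA-256 only to show the workflow, and the blocklist entry is simply the digest of a dummy byte string.

```python
# Minimal sketch of hash matching against a blocklist of known illegal content.
# Plain SHA-256 is used for simplicity; real systems favour perceptual hashing
# so that near-duplicates (resized, re-encoded copies) still match.
import hashlib

# Hypothetical blocklist; this entry is sha256(b"test") for the demo.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block_upload(file_bytes: bytes) -> bool:
    """Reject the upload if its digest matches a known-bad hash."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

print(should_block_upload(b"test"))           # True: matches the demo blocklist
print(should_block_upload(b"holiday photo"))  # False: upload proceeds
```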

    “Children deserve a safer internet. This framework lays the foundation, but we’re not stopping here,” Ofcom said in a statement.

    Resources for Parents and Children

    To accompany the new regulations, Ofcom published guidance for parents, including videos and answers to common safety questions. It also launched child-friendly content explaining what changes children can expect in their favorite apps and platforms.

    As the codes go before Parliament for final approval, stakeholders across the tech ecosystem will be watching closely. For many, this marks a critical test of how well regulatory bodies can compel tech giants to prioritize child safety over engagement metrics.
