What's On

Tech firms must meet July deadline for new Ofcom child safety rules

By News Room · 24 April 2025 (Updated: 24 April 2025)

Ofcom has finalised its new online child safety rules, with tech firms operating in the UK expected to meet the regulator’s requirements by 25 July.

Under the Online Safety Act, the digital watchdog has said that any provider operating personalised recommendations, for example via a social media feed, that poses a medium or high risk of harmful content must configure its algorithms to filter that content out.

The rules also state that the riskiest services must use “highly effective” age assurance to identify which users are children.

Ofcom said that services which set minimum age requirements but do not use strong age checks must assume that younger children are using their platform, and must therefore ensure those children have an age-appropriate experience.

Additionally, technology providers will be required to have processes in place to review, assess and quickly tackle harmful content when they become aware of it.

The regulator says that platforms must give children more control over their online experience, including by allowing them to indicate what content they don’t like; to accept or decline group chat invitations; to block and mute accounts; and to disable comments on their own posts.

There must also be supportive information for children who have come across or searched for harmful content.

From the July deadline, tech firms are also expected to make reporting and complaints easier, so that children find it straightforward to flag content or raise a complaint.

Finally, under the rules, these online services must have a named person accountable for children’s safety, with a senior body carrying out annual reviews of the management of risk to children.

Ofcom said that providers of services likely to be accessed by UK children now have until 24 July to finalise and record their assessment of the risk their service poses to children, which the regulator may request. Following this, they should implement safety measures to mitigate those risks.

From 25 July, companies are expected to apply the safety measures set out in Ofcom’s rules.

If firms fail to comply with these requirements, Ofcom now has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.

Dame Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”

Rachael Annear, a partner in the International Data Protection and Cyber Practice at law firm Freshfields, said that the new guidance highlights the close collaboration between Ofcom and the Information Commissioner’s Office on the “crucial intersection of data protection and age assurance for online services.”

“Although this joined-up approach offers welcome clarity, implementing robust age checks will still present significant challenges for services,” continued Annear. “A range of age assurance tools will be available, but choosing the most appropriate method will require careful consideration.

“Services must strike a delicate balance between ensuring children’s safety and respecting individuals’ privacy.”

