Ofcom has finalised its new online child safety rules, with tech firms operating in the UK expected to meet the regulator’s requirements by 25 July.
Under the Online Safety Act, the digital watchdog has said that any provider that operates personalised recommendations, for example via a social media feed, and whose service poses a medium or high risk of harmful content, must configure its algorithms to filter out harmful content.
The rules also state that the riskiest services must use “highly effective” age assurance to identify which users are children.
Ofcom said that services with minimum age requirements that do not enforce them through strong age checks must assume younger children are using their platform and therefore make sure they have an age-appropriate experience.
Additionally, technology providers will be required to have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
The regulator says that platforms must give children more control over their online experience, including by allowing them to indicate what content they don’t like; to accept or decline group chat invitations; to block and mute accounts; and to disable comments on their own posts.
There must also be supportive information for children who have come across or searched for harmful content.
From the July deadline, tech firms are also expected to make it straightforward for children to report content and submit complaints.
Finally, under the rules, these online services must have a named person accountable for children’s safety, with a senior body carrying out annual reviews of the management of risk to children.
Ofcom said that providers of services likely to be accessed by UK children now have until 24 July to finalise and record their assessment of the risk their service poses to children, which the regulator may request. Following this, they should implement safety measures to mitigate those risks.
From 25 July, companies are expected to apply the safety measures set out in Ofcom’s rules.
If firms fail to comply with these requirements, Ofcom now has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.
Dame Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
Rachael Annear, a partner in the International Data Protection and Cyber Practice at law firm Freshfields, said that the new guidance highlights the close collaboration between Ofcom and the Information Commissioner’s Office on the “crucial intersection of data protection and age assurance for online services.”
“Although this joined up approach offers welcome clarity, implementing robust age checks will still present significant challenges for services,” continued Annear. “A range of age assurance tools will be available, but choosing the most appropriate method will require careful consideration.
“Services must strike a delicate balance between ensuring children’s safety and respecting individuals’ privacy.”