Safe internet use for children – New guidelines in the UK
In the United Kingdom, online services will soon have to implement new rules for the protection of children.
The UK's media regulator surveyed thousands of children and parents about their experiences online and has published its findings.
(Image: Ofcom)
The British media regulator (Office of Communications, Ofcom) has published new guidelines for the protection of children on the internet. Providers of online services operating within the UK's jurisdiction will soon have to comply with them in order to better protect children and young people from harmful and illegal content. This includes misogynistic, violent, hateful or abusive content, bullying and grooming, as well as content relating to suicide, self-harm, eating disorders and pornography.
According to Ofcom, the new guidelines, or codes of conduct, build on and supplement existing rules that protect all users; more than 40 "practical measures" are proposed. They were developed in a consultation process with the companies concerned, child protection organizations, experts and families. Among other things, feedback from 27,000 children and young people and 13,000 parents fed into the authority's decision-making. The process began back in May 2024, when Ofcom published its initial proposals for measures.
More control over displayed content, clear points of contact and reporting channels
Among others, the following codes of conduct were defined:
- Safer feeds: Personalized feeds can funnel harmful content directly to children. Providers that operate a recommendation system and pose a medium or high risk of harmful content should therefore configure their algorithms so that harmful content is filtered out of children's feeds.
- Effective age checks: High-risk services should use "highly effective age verification" to identify which of their users are children. As a result, children may only be able to use certain parts of an app, or be unable to use it at all.
- Act quickly: All websites and apps should have processes in place to review, assess and quickly remove harmful content as soon as they become aware of it.
- More choice and support for children: Providers should give children more control over their online experience. For example, they should be able to indicate which content they do not like, block contacts, chats and comments, and get help more easily if they encounter harmful content or accounts.
- Easier reporting and complaints: The reporting and complaints system should be designed so that children can use it easily. Terms of use should also be written in language children can understand.
- Strong governance: Providers should name a person responsible for the safety of children and young people, and a senior body should review how risks to children are handled on an annual basis.
Implementation deadlines and penalties
Providers of online services that fall within the scope of the child protection regulations must now carry out and document risk assessments for children by July 24, 2025; Ofcom can request these assessments. If the new guidelines clear the still-pending parliamentary procedure by then, the corresponding safety measures must be implemented from July 25, 2025. The authority also points out that it will take enforcement action against providers that do not act promptly to reduce risks to children in their services. Sanctions range from fines to exclusion from the market. Ofcom also reserves the right to extend some of the regulations to further services and is accepting submissions on this until July 22.
Melanie Dawes, Ofcom's Chief Executive, commented on the publication: "These changes are a new start for children online. They mean safer social media feeds with less harmful and dangerous content, protection from being contacted by strangers and effective age controls for adult content. Ofcom has a role to play in creating a safer generation of children online. If companies fail to act, they will face enforcement action."
While the UK is relying on the new guidelines to improve the protection of children and young people online, Australia plans to enforce a social media ban for children under the age of 16 from the end of the year.
(kbe)