Data regulator issues stark child safety warning to tech firms

The ICO has reprimanded social media platforms for not using available age assurance

The UK’s data regulator has demanded tech platforms hosting social media and video sharing content act immediately to strengthen age assurance measures in the name of child safety.

In an open letter to applicable firms, the Information Commissioner’s Office (ICO) said that the technologies required to prevent children from accessing harmful content are “readily available”, yet many services still offer sub-standard measures.

“Age assurance technologies have rapidly advanced in recent years,” the ICO said. “We are concerned that services have not yet implemented the technology that is now available to protect young children.”

Most social media platforms currently set a minimum age of 13; however, the ICO notes that in many cases, the only age check in place is a self-declaration from the user.

“This means underage children can easily access services that have not been designed for them. This puts under-13s at risk by allowing their information to be collected and used unlawfully, without the protections they are entitled to.”

The ICO said risks to children are “growing” and that “now is the time to act” to ensure they can be kept safe.

The watchdog said technologies such as facial age estimation, digital IDs and one-time photo matching are far more effective than the easily circumvented self-declaration process.

It has, however, also warned that age assurance checks must not infringe on users’ data rights.

“Any age assurance technology you choose must comply with data protection law. It must be lawful, fair, proportionate, secure, collect the minimum data necessary, and be clearly explained to users in an age-appropriate way.”
