New rules under the Online Safety Act setting out digital platforms' responsibility to protect vulnerable users come into force today.
The act, which received Royal Assent in October 2023, is being enforced in phases.
In the spring, the portion of the act requiring digital platforms to remove illegal content took effect. The next phase, which concerns protecting children from harmful content, has now come into force.
From today, platforms hosting potentially inappropriate content, from pornography to hate speech and references to self-harm, suicide and violence, must implement effective age verification tools to prevent children from viewing it.
Where sites could previously just ask users their age and take responses at face value, stricter verification is now required.
There are a handful of approved methods for doing so, including credit card checks, asking users to submit photo ID and, in a major boost for age verification companies such as Yoti, facial age estimation scans.
Digital platforms that do not comply with the requirements could face an £18m fine, or 10% of the company’s global annual revenue, whichever is greater.
“Our lives are no longer split between the online and offline worlds – they are one and the same. What happens online is real. It shapes our children’s minds, their sense of self, and their future. And the harm done there can be just as devastating as anything they might face in the physical world,” said Technology Secretary Peter Kyle.
“The time for tech platforms to look the other way is over. They must act now to protect our children, follow the law, and play their part in creating a better digital world.
“And let me be clear: if they fail to do so, they will be held to account. I will not hesitate to go further and legislate to ensure that no child is left unprotected.”
According to Ofcom, a thousand platforms have already confirmed they have age checks in place, including the UK's most visited adult website, Pornhub.
The enforcement comes as figures from Ofcom reveal that children as young as eight have accessed online pornography.
Responding to the enforcement of the next phase of the act, Jake Moore, global cybersecurity advisor at ESET, said the age verification requirements would “likely have a few teething problems” but are nonetheless a “huge step towards online safety for children”.
“Although some of the ways to verify ages may sound like they pose potential privacy and security risks by collecting data such as ID uploads or financial information, there are methods in place to reduce further harm,” Moore said.
“Online privacy has been completely avoided since the birth of social media and other sites with harmful content but this is a move towards the classic adage of better late than never.”