New laws are set to come into effect requiring tech platforms to take down, within 48 hours, intimate images shared without the subject's consent.
Through an amendment to the Crime and Policing Bill, online platforms could face fines of up to 10% of their worldwide revenue, or even a total UK ban, for failing to comply with the 48-hour rule.
The government said its aim is to ensure that victims of non-consensual image sharing need to report an image only once for it to be removed across multiple platforms in one go and automatically deleted at every new upload.
Ofcom is considering plans to treat any kind of explicit image shared without consent with the same legal severity as child sexual abuse and terrorism content.
The move is part of the government’s pledge to tackle online harms, in particular those affecting women and girls. The rule follows the criminalisation of using AI technology to generate sexualised deepfakes of people.
Several UK regulatory bodies are currently investigating social media platform X over the use of its AI chatbot Grok to make such images.
“As director of public prosecutions, I saw firsthand the unimaginable, often lifelong pain and trauma violence against women and girls causes. As prime minister, I will leave no stone unturned in the fight to protect women from violence and abuse,” said Prime Minister Sir Keir Starmer.
“The online world is the frontline of the 21st century battle against violence against women and girls. That’s why my government is taking urgent action: against chatbots and ‘nudification’ tools.
“Violence against women and girls has no place in our society, and I will not rest until it is rooted out.”