UK Law Mandates 48-Hour Image Removal
Analysis based on 15 articles · First reported Feb 18, 2026 · Last updated Feb 19, 2026
The new regulatory framework in the United Kingdom will directly impact technology companies, particularly social media platforms and internet service providers, by imposing stricter content moderation requirements and potential financial penalties. This could lead to increased operational costs for compliance and a shift in how these companies manage user-generated content, potentially affecting their profitability and market valuations.
The United Kingdom government has introduced new laws, through an amendment to the Crime and Policing Bill, requiring tech platforms to remove non-consensual intimate images within 48 hours of being flagged. Failure to comply could result in fines of up to 10% of worldwide revenue or a ban from operating in the United Kingdom. The initiative, championed by Prime Minister Keir Starmer, Technology Secretary Liz Kendall, and Minister Alex Davies-Jones, aims to protect women and girls from online abuse. Ofcom, the UK communications regulator, is considering treating such images with the same severity as child sexual abuse and terrorism content, and will fast-track decisions on requiring hash-matching technology. The government also plans to publish guidance for internet providers on blocking access to rogue websites. This builds on previous actions, such as calling out the Grok chatbot for sharing non-consensual images and legislating against 'nudification' tools, bringing chatbots under the Online Safety Act.