The Online Safety Act UK – And how we’ve got it so wrong

TL;DR: The people making the laws are either too stupid or fail to understand that the Internet, and its users, don’t recognise borders. This is why the Online Safety Act will fail.

Peter Kyle MP

Monday 17th March 2025 sees the ‘Illegal Harms Codes of Practice’ come into force as part of the Online Safety Act (OSA), a piece of legislation years in the making that was given Royal Assent in October 2023. Yes, that’s right, 18 months ago – and that right there is a good indication of how poorly thought through this has been by the Government, and by Ministers like Peter Kyle MP.

What are the aims of the Act?

The OSA aims to make the UK “the safest place in the world to be online” by creating a new regulatory framework that places duties of care on online service providers to protect users, especially children, from harmful content.
Key aims include:

  • Protecting children from harmful content online
  • Holding tech companies accountable
  • Tackling illegal content
  • Establishing age verification requirements
  • Increasing transparency from tech companies

The bill itself states in its explanatory notes that it:

“establishes a new regulatory framework to address illegal and harmful content online, with the aim of making the UK the safest place in the world to be online.”

Whilst these are all very noble & just causes, and it’s imperative to protect children from the harmful content the internet can serve up, in practical terms the OSA just falls short in so many ways.

UK Online Safety Act: Flaws and Loopholes Analysis

After studying the OSA, it’s easy to play Devil’s advocate, but the loopholes & flaws are just too large to ignore. The Act contains several notable flaws and potential loopholes.

Here are its key weaknesses:

Definitional Ambiguities

“Legal but harmful” content: While the Act initially included provisions for “legal but harmful” content for adults, these were removed in later amendments. This creates a potential protection gap where harmful but legal content remains unregulated (Part 3, Chapter 2).

Vague definition of harm: Section 187 defines harm as “physical or psychological harm,” but provides limited specificity about psychological harm thresholds, potentially leaving interpretation too open-ended.

Enforcement Concerns

OFCOM’s capacity: The Act grants OFCOM extensive regulatory powers (Part 7) without necessarily ensuring proportionate funding or technical expertise for effective oversight of numerous platforms simultaneously.

Proactive technology requirements: Section 116 mandates the use of “accredited technology” to identify CSEA (Child Sexual Exploitation & Abuse) content, but lacks technical specificity about what constitutes adequate scanning, potentially creating uneven implementation.
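For illustration only, here is a toy sketch of the most common scanning approach: hash-matching uploads against a database of known material. Everything in it is hypothetical – real deployments match against databases maintained by bodies such as the IWF or NCMEC, and use perceptual hashes (e.g. PhotoDNA) rather than the plain SHA-256 used here:

```python
import hashlib

# Hypothetical blocklist. Real systems use perceptual hashes so that
# re-encoded or resized copies of known material still match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256(b"test")
}

def is_known_content(file_bytes: bytes) -> bool:
    """Return True if this exact file's hash appears in the blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_known_content(b"test"))   # True: exact match against the blocklist
print(is_known_content(b"test!"))  # False: one changed byte defeats exact hashing
```

A single altered byte defeats an exact hash, which is precisely why pinning down “adequate scanning” in statute is so difficult.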

Privacy and Free Expression Tensions

End-to-end encryption implications: The Act’s requirements for platforms to identify and remove illegal content (Part 3, Chapter 5) create tension with encrypted services, potentially undermining privacy protections without explicitly addressing this conflict.

Chilling effect on expression: While the Act includes duties to protect “content of democratic importance” (Section 15), the enforcement mechanisms may incentivise over-removal to avoid penalties, creating a disproportionate impact on legitimate speech.

Implementation Challenges

Risk assessment burdens: Sections 8-10 require platforms to conduct risk assessments without providing sufficiently detailed methodologies, potentially creating inconsistent safety standards across services.

SME exemptions but scaling concerns: The Act provides exemptions for smaller businesses (Part 3, Chapter 6), but lacks clear transition procedures for platforms experiencing rapid growth, creating potential regulatory gaps.

The burden of implementation is so large that OFCOM (which has received no additional funding or resources) has set a timetable for when each step of the OSA will require compliance (see the compliance timetable on the OFCOM website). This reaches into November 2025, although OFCOM say that date is to be confirmed.

International Jurisdiction Issues

Extraterritorial application complexity: While Section 181 addresses territorial scope, enforcement against non-UK companies remains challenging without adequate international cooperation frameworks.

Potential conflict with other regulatory regimes: The Act operates alongside other frameworks (EU Digital Services Act, US DMCA etc.) creating potential compliance conflicts for international platforms.

This analysis reveals that while the UK Online Safety Act attempts comprehensive regulation of online harms, its implementation may face significant challenges due to definitional ambiguities, practical enforcement limitations, and tensions with other rights and regulatory frameworks.

Why the Act will fail

Peter Kyle – Secretary of State for Science, Innovation and Technology – said on BBC Radio 4 on 27th February 2025 that he would:

“…use some disruption”

To stop people watching material he deems inappropriate. Yes, you read that right. That’s the level of delusion politicians have.

There’s no doubting that children watching porn is bad and harmful to them. In fact, Baroness Bertin’s 2025 report on pornography regulation pointed out that teenagers are starting to view erotic asphyxiation as ‘normal’.

Again, politicians are using their moral compass and proclivities to dictate to us all. This is from Baroness Bertin’s report:

“One study found that 13% of sexually active girls aged 14 to 17 had already been choked. Anecdotal evidence submitted to this Review also suggests that boys as young as ten are asking teachers how to choke girls during sex”

‘Anecdotal evidence’ – so not fact, a survey, or research; someone’s opinion.

Back to Peter Kyle’s ‘some disruption’. Because of the way the internet works, there’s very little any Government can legally do to police it. There are countless tools available to anyone who cares to use a search engine; the two main ones people will use are TOR (The Onion Router) or a VPN (Virtual Private Network).

For Mr Kyle it must be disheartening to learn that, according to a Forbes survey:

“More than three quarters of Brits (76%) are familiar with Virtual Private Networks (VPNs).”

and, of those using VPNs, the reasons given as to why include:

“Bypassing censorship or restrictions (20%): VPNs are a valuable tool for individuals living in countries with strict internet censorship or restrictions. These users rely on VPNs to access blocked websites and communicate freely online”

Using the TOR network (via the TOR browser) is also a viable alternative for accessing restricted material (& the dark net). What politicians are failing to grasp in a big way is that the stick doesn’t work when it comes to the internet. They need a new approach (more on that later).
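To show how low the technical bar is, here’s a minimal sketch of routing a web request over TOR from Python. It assumes a local Tor service is running on its default SOCKS port (9050) and that the requests library is installed with SOCKS support (pip install "requests[socks]"):

```python
import requests

# Tor's default local SOCKS5 proxy; 'socks5h' also resolves DNS through Tor,
# so the ISP never even sees which domains are being looked up.
TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# The request leaves the Tor network at an exit node, not the user's UK
# connection, so ISP-level blocks never come into play.
response = requests.get("https://check.torproject.org/",
                        proxies=TOR_PROXY, timeout=30)
print("Congratulations" in response.text)  # True when traffic went via Tor
```

That’s the entire ‘circumvention’: a dozen lines that any search engine will surface.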

Even Internet Service Providers (ISPs) have given up on restrictions: websites that would previously have been blocked are now easily circumvented by site owners using proxies, or ISPs have simply stopped bothering.

So, Peter Kyle, how are those ‘…some disruptions’ working out? Badly, I’d say.

Images

So let’s address images and how a moral compass sets the tone. Apologies if you watch Last Week Tonight with John Oliver, but this was too good an example not to use to show how hard a job OFCOM are going to have.

And that’s just one of the many issues with the Act. We’re relying on someone else’s proclivities & peccadillos to be the arbiter of what is acceptable.

Over-reach

Politicians all over the world think that their legislation is the global yardstick; it just isn’t. The Online Safety Act has literally no enforcement powers outside the UK & its territories – much like America’s DMCA (a great tool for reporting copyright infringement, for example) or its EU counterpart.

Instead, it relies on inter-country cooperation.

So let’s say OFCOM don’t like an app in the Play Store. They ask Google to remove it for UK users; Google honours the request and has complied with the Act.

But users still want the banned app, so they side-load the APK file (which is probably hosted in a non-UK-friendly country), and hey presto, they have the app. This isn’t technical stuff, and there isn’t really an awful lot OFCOM or the Government can do about it.
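As a purely hypothetical sketch (the URL is invented; it assumes Android’s adb tool is installed and USB debugging is enabled on the device), the whole ‘workaround’ is two steps: download the APK, then install it:

```python
import subprocess
import urllib.request

# Hypothetical mirror - banned APKs would typically be hosted
# outside UK jurisdiction
APK_URL = "https://example-mirror.example/banned-app.apk"
APK_PATH = "banned-app.apk"

# Step 1: download the APK from the non-UK host
urllib.request.urlretrieve(APK_URL, APK_PATH)

# Step 2: install it onto a connected Android device via adb
subprocess.run(["adb", "install", APK_PATH], check=True)
```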

Encryption

End-to-end encryption is the thorn in the side of the OSA. Section 49 requires providers to:

“operate a system or process for… reporting CSEA content present on the service to the NCA”

When combined with Section 116’s technology requirements, this creates an obligation that fundamentally conflicts with the technical limitations of end-to-end encryption.

“Section 116: CSEA Content – Use of Accredited Technology

Section 116(1) empowers the Secretary of State to require use of ‘accredited technology’ to:

(a) identify CSEA content present on a service, and
(b) swiftly take down that content once identified.”

This requirement fundamentally conflicts with end-to-end encryption, which prevents service providers from scanning message contents.
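To see the conflict concretely, here’s a minimal end-to-end encryption sketch using the PyNaCl library (an illustrative stand-in: WhatsApp, Signal and iMessage use their own implementations of the same principle). The relaying server only ever holds ciphertext, so there is nothing for ‘accredited technology’ to scan:

```python
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device;
# private keys never leave that device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# This opaque blob is all the service provider ever sees or stores.
print(bytes(ciphertext).hex())

# Only Bob, holding his private key, can decrypt it.
print(Box(bob_key, alice_key.public_key).decrypt(ciphertext))  # b'meet at noon'
```

To scan, a provider would have to either weaken this scheme or scan on the user’s device before encryption – which is why security researchers argue the requirement and end-to-end encryption can’t coexist.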

Nobody is defending CSEA material, but yet again politicians are using the law as a blunt instrument. As it currently stands, here are just a few of the mainstream apps & services that will be affected by the Act:

  • Banking Apps
  • WhatsApp
  • Apple iMessage
  • Facebook Messenger
  • Instagram DMs
  • Twitter DMs
  • LinkedIn messages
  • Signal
  • Telegram

The Act’s approach to encryption also reveals several key flaws:

  • No Absolute Protection: Unlike some proposed amendments that would have explicitly protected end-to-end encryption, the final Act contains no absolute prohibition on measures affecting encryption.
  • Balancing Framework: The Act creates a balancing test (Sections 104 and 121) rather than a bright-line rule protecting encryption.
  • Implementation Uncertainty: The practical implementation of provisions like Section 116 (CSEA detection) remains unclear for fully encrypted services.
  • Deferred Resolution: The Act effectively delegates the resolution of encryption conflicts to OFCOM’s implementation decisions rather than resolving them legislatively.

This analysis demonstrates that while the Act acknowledges encryption concerns, it creates a regulatory framework that could significantly impact encrypted services depending on implementation decisions by OFCOM and the Secretary of State, Peter Kyle.

Conflicts with Other Regulatory Regimes

The interaction between the UK Online Safety Act and other international regulatory frameworks creates substantial compliance challenges:

EU Digital Services Act (DSA) conflicts: Services operating across the UK and EU face potentially contradictory compliance requirements:

  • The DSA’s “country of origin” principle contrasts with the UK’s destination-based approach, creating dual regulatory burdens.
  • Content classification differences between the regimes may lead to scenarios where content permitted under one framework must be removed under another.
  • Timeline misalignments between implementation schedules create operational complexities for platforms serving both markets.

Data protection regime tensions: The Act’s provisions interact problematically with international data protection frameworks:

  • Requirements for content monitoring (particularly in Sections 116-119) may conflict with GDPR provisions in the EU and UK.
  • International data transfers necessary for compliance create additional regulatory burdens and legal uncertainties.
  • The lack of explicit provisions addressing these intersections leaves service providers in difficult compliance positions.

Free trade agreement implications: The Act’s requirements may potentially conflict with digital trade provisions in agreements like CPTPP:

  • Localisation requirements implied by some compliance measures could violate free trade principles.
  • The Act’s differential treatment of services based on user location may be challenged under non-discrimination provisions in trade agreements.

Risk of regulatory fragmentation: Without sufficient harmonisation provisions, the Act contributes to the growing international patchwork of digital regulations:

  • Service providers face increasing compliance costs navigating multiple regional regulatory regimes.
  • The Act’s Section 3 risk assessments may conflict with similar but non-identical requirements in other jurisdictions, forcing platforms to maintain multiple compliance systems.
  • This regulatory fragmentation disproportionately impacts smaller platforms lacking resources for multi-jurisdictional compliance teams.

Evidence gathering barriers: Section 87 grants OFCOM information-gathering powers, but these face practical limitations when applied to non-UK entities:

  • Foreign service providers may refuse compliance with information notices, citing jurisdictional conflicts.
  • The Act lacks sufficient provisions addressing situations where international judicial cooperation is unavailable.

These expanded points highlight how the international dimensions of the UK Online Safety Act create significant implementation challenges that may undermine its effectiveness while creating substantial compliance burdens for global platforms.

Have Governments learnt nothing?

In February 2011, a small marketplace called the Silk Road popped up and showed everyone what the internet really is – borderless. It ran for 2½ years, until October 2013, when the FBI shut it down.

Why? A misconfigured server leaked IP addresses. That’s it. One technical mistake.

Despite media characterisations of the Silk Road as a lawless digital bazaar, the marketplace maintained its own internal governance structure and rules. According to court documents from United States v. Ross William Ulbricht, prohibited items included:

  • No child pornography or child exploitation material
  • No stolen goods, credit cards, or personal information
  • No counterfeit currency or documents
  • No fraudulent services (including hitmen-for-hire)
  • No weapons or ammunition (this was added to the prohibited list later)

The Silk Road operated with two key technological protections:

  • Accessibility exclusively through the Tor network, which routes internet traffic through multiple servers to obscure its origin.
  • Transactions conducted solely using Bitcoin, providing a pseudonymous payment system.

Fast forward to 2025 and the OSA. Politicians still don’t get it.

The technical vulnerabilities that killed Silk Road have been fixed by newer platforms. Developers learned from those mistakes. And under OSA age verification, users are put at risk – just look at the Tea App data breach.

Here’s the reality: You can write all the laws you want, but the internet finds workarounds. Always has, always will. The Silk Road showed us this 14 years ago.

Politicians keep thinking they can control the digital world with legislation. They can’t. The cat-and-mouse game continues, and honestly, the cat isn’t winning.

The people who made this post possible

Thanks for reading our handy guide to The Online Safety Act 2023.

Big thanks to:

for their help in planting seeds, providing inspiration &, in John Oliver’s case, explaining moderation and censorship better than I could’ve. ©HBO, used under fair use policy.

If you’d like to work with a sustainable website designer with a no-nonsense approach, feel free to contact QED.

To see the effect of our content creation, see our case study on The SV Group.

We created content over a six-month period, targeting key areas where their business wanted to expand.