OSA: Safeguarding or Censorship?

When Parliament pushed the Online Safety Act through, the headlines promised protection. Children would be safer, parents reassured, tech giants brought to heel. Ofcom was even handed the power to fine American companies billions. 1

But beneath the triumphant slogans, another reality lurks. The Act compels intrusive identity checks, threatens private communications, and treats ordinary citizens as if they are all suspects. Free speech, a right explicitly protected under British law, is rebranded as an obstacle. One MP even told me directly: “Protecting free speech should not stop us from tackling the growing epidemic of online harm.” 2

That sentence should chill every reader. Because in law, protecting free speech is not optional. It is the very thing that should stop a government from trampling further. When a minister says otherwise, they are not safeguarding liberty. They are admitting they do not care for it.

What is OSA?

The Online Safety Act (OSA) was sold to the public as a shield. It would protect children from harmful content, force tech companies to take responsibility, and create a “safer” internet. At least, that is the story told from the government benches.

At first glance, OSA does not sound like a bad idea. Platforms such as Meta have failed, repeatedly, to deal with real dangers: extremist propaganda spreading unchecked, scams that drain the savings of the elderly, and online harassment that ruins lives. Their refusal to act against coordinated disinformation campaigns has shaped elections, fuelled divisions, incited violence, and destabilised democracies across the West. 3

That is just what we see as adults. For children, the risks are magnified: exposure to hate, encouragement of self-harm, exploitation, and a constant stream of adult anxieties bleeding into their online spaces. It is natural, then, for people to welcome a government finally stepping in to hold these companies accountable.

In practice, the law does something different. It hands Ofcom sweeping powers over speech, demands intrusive identity checks, and compels companies to scan private communications. The promise of the Online Safety Act is order imposed on chaos. But the mechanism is not targeted enforcement against genuine harms. It is blanket surveillance. Vague standards and mass scanning sweep up lawful speech alongside criminal content. The cure is not proportionate medicine but quarantine for all: a system that punishes everyone in the hope of restraining the few.

And the reach is not confined to Britain. American platforms, from giants like Meta to smaller services with only a handful of UK users, can be fined for failing to comply. It is an extraordinary assertion of jurisdiction, as if Westminster were deputising itself as the world’s online censor. 4

And that is the true danger here: not safety, but censorship.

The Illusion of Safety

The dangers of the Online Safety Act are not hypothetical. They are written into the law itself, in ways that directly conflict with the rights Britain is bound to uphold.

Under Article 8 of the European Convention on Human Rights (ECHR), every citizen has the right to privacy in their communications. 5 The OSA undermines this by compelling intrusive identity verification and enabling the scanning of private messages. What is framed as “safeguarding” is, in reality, surveillance.

Under Article 10 of the ECHR, citizens are guaranteed the right to freedom of expression. 6 Yet the OSA allows for lawful content to be age-gated or removed entirely if deemed “harmful” by vague standards. Political debate, satire, or difficult conversations can all be restricted, not because they are illegal, but because they might upset or disturb.

The contradictions extend beyond human rights law. The Data Protection Act 2018 and UK GDPR require data minimisation: collecting only what is necessary for a defined purpose. 7 Yet the OSA demands expansive data collection, including biometric checks carried out by third-party companies. In other words, it mandates surveillance far beyond what the law itself permits.

Other democracies have already stumbled down this path. Germany’s NetzDG has been condemned by the UN and human rights groups for encouraging over-blocking of lawful speech. 8 France’s SREN law, narrower in scope, has already faced legal challenges for infringing privacy and expression. 9 If these countries have run aground with lighter versions of OSA, why should the UK believe its own heavier hand will pass the tests of proportionality and legality?

The truth is simple: the Online Safety Act is not just poorly designed. It is unlawful by its very nature.

The False Premise of Protection

The Online Safety Act claims to protect children. That phrase is repeated like a mantra: protection, safety, care. And yet, step into any overstretched CAMHS clinic, or speak to parents waiting months for help, and the reality is obvious. The Act cannot bandage wounds, shorten waiting lists, or place a social worker where one has been cut. What it does instead is demand ID checks and scan private messages, as if surveillance could substitute for care.

The tools to limit online risks already exist. Internet service providers offer parental controls and child-specific filters. Parents can restrict access to adult content, manage screen time, and block harmful sites without a single new law. These safeguards work, and they do so without requiring mass identity checks or government surveillance. 10

Meanwhile, the government has chosen to cut funding from the very services that protect children in the real world:

  • More than £600 million has been slashed from early intervention funding since 2013. 11

  • Over 600,000 under-18s are waiting for CAMHS treatment, many for months or years. 12

  • One-third of consultant child psychiatrist posts remain vacant. 13

  • And when the chance came to vote for an inquiry into grooming gangs, not a single Labour MP supported it. 14

If the goal is truly to protect children, why is the government investing in surveillance rather than safeguarding? Why are resources being poured into Ofcom enforcement while child mental health services collapse?

The answer is uncomfortable but clear: the Online Safety Act is not a child-protection bill. It is a censorship bill draped in the language of child safety. The real problems, underfunded care, long waiting lists, and systemic failures in safeguarding, remain untouched.

The Lawful Alternative

There is another path, one that does not require surveillance of every citizen. A proportionate, lawful approach exists, but it has been neglected in favour of political theatre.

The real safeguards are not in scanning private messages or demanding biometric ID. They are in restoring funding to social care and child mental health. They are in expanding early intervention hubs nationwide. They are in training and retaining CAMHS professionals so children are not left on waiting lists that stretch for years. And they are in equipping parents with the tools that already exist: ISP-level parental controls that can filter content without demanding surveillance of everyone else.

These alternatives are effective. More importantly, they are legal. They strengthen protection without undermining the rights enshrined in British law. Yet they have been passed over. Why? Because, in the government’s own words, “protecting free speech should not stop us.” That single admission reveals the truth: the Online Safety Act is not about children at all. It is about bypassing rights.

That is why citizens must push back. Ask your MP the only question that matters: How does the OSA protect children more than fully funding social care and mental health? Demand repeal or reform of the Act’s most intrusive provisions.

If Britain truly wishes to protect its children, it must do so lawfully and by funding the institutions that are designed to protect us all. Not by shredding the freedoms our country has fought for. Not by tarnishing the freedoms our children will inherit.

1 UK Parliament, Online Safety Act 2023, c. 40, legislation.gov.uk.

2 MP correspondence with DDL Smith, response to Online Safety Act inquiry, ddlsmith.com/osa-chat-response, 2025.

3 European Commission, “Disinformation and Democracy,” European Democracy Action Plan, 2021.

4 Ofcom, “Online Safety Act: Overview of Ofcom’s role,” Ofcom, 2023.

5 Council of Europe, European Convention on Human Rights, Article 8.

6 Council of Europe, European Convention on Human Rights, Article 10.

7 UK Information Commissioner’s Office, “Principles of the UK GDPR,” ICO.org.uk, 2023.

8 United Nations Human Rights Council, “Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression,” UN Doc. A/HRC/38/35, 2018.

9 French Data Protection Authority (CNIL), “Opinion on the Age Verification Provisions of the SREN Law,” 2022.

10 BT, Sky, TalkTalk, Virgin Media, “Parental Controls and Online Safety Tools,” ISP industry guide, 2023.

11 Action for Children, “Children and Young People’s Services: Funding and Spending,” 2023.

12 NHS England, “Children and Young People with an Open Referral to Mental Health Services,” NHS Digital, 2023.

13 Royal College of Psychiatrists, “Workforce Census: Child and Adolescent Psychiatry,” 2022.

14 House of Commons Hansard, “Child Sexual Exploitation: Proposed Inquiry,” Division Votes, 2023.

DDL Smith

DDL Smith is an author from Dartford, Kent, in the UK. Having spent most of his youth scriptwriting and creating short films for online media, he is passionate about crafting deeper stories, a passion that shows through his novels.

http://www.danieldlsmith.com