Keir Starmer keeps saying the right things, but he isn’t doing them. Last week, the Prime Minister stood at a lectern and promised a "national emergency" response to online abuse, specifically the scourge of non-consensual deepfakes and misogyny. It sounded great. It looked decisive. Yet, for those of us watching the data, the optics don’t match the diary.
While Starmer talks about "putting tech firms on notice," his ministers are busy holding the door open for them. Recent analysis of government records reveals a staggering reality. In the two years leading up to late 2025, tech giants like Google and Meta secured over 100 ministerial meetings. During that same period, the groups actually fighting to keep your kids safe—campaigners, bereaved families, and child safety experts—were lucky to get a fraction of that time.
This isn't just a scheduling conflict. It’s a power imbalance that explains why our online safety laws feel like they’re being written with a "don’t hurt the profits" disclaimer attached.
The Appeasement Problem
Baroness Beeban Kidron, a leading voice in the fight for digital rights, didn't mince words when she accused the government of "appeasing" big tech. She’s right. When you meet with a $4 trillion company ten times more often than you meet with a charity speaking for the families of dead children, you’ve made a choice.
You’ve chosen the economy over the user.
We're seeing a pattern where the government waits for a scandal—like the recent uproar over Elon Musk’s Grok AI—before it suddenly finds the "urgency" to act. Starmer’s latest push to bring AI chatbots under the Online Safety Act is a perfect example. We’ve known about the chatbot loophole for two years. Why did it take a "scandal" to trigger an amendment?
The Sticking Plaster Strategy
Ian Russell, whose daughter Molly died after being fed a diet of self-harm content by algorithms, has been the most vocal critic of this "dither and delay" approach. He’s warned that the government is essentially using sticking plasters to treat a gunshot wound.
The Online Safety Act was supposed to be the "world-leading" solution. Instead, it’s become a bureaucratic maze.
- Ofcom’s Slow Walk: The regulator has been accused of lacking the "urgency and scale" needed. It’s taken years to even get the first codes of practice in place.
- The Loophole Game: Tech firms are masters at finding the gaps. When one site gets blocked, a "mirror site" pops up. When social media is regulated, they move the harmful features into "standalone" AI tools.
- The Engagement Trap: The core problem isn't just the content; it’s the design. Features like infinite scroll and autoplay are designed to keep users hooked, yet the government is only now "consulting" on whether to restrict them.
If you’re a parent, you don’t want a consultation. You want a kill switch for the features that are making your child miserable.
A Fight With Big Tech?
Starmer likes to use "tough guy" rhetoric. He’s told the media, "If that means a fight with the big social media companies, then bring it on."
But where is the fight?
So far, the "fight" looks like a lot of polite meetings in Westminster. We’re told that tech firms must remove "revenge porn" within 48 hours or face being blocked. That sounds tough until you realize that blocking a global platform like X or Meta in the UK is a "nuclear option" that no government has actually had the guts to trigger.
The reality is that these platforms derive only a tiny fraction of their global revenue from the UK. To them, a fine from Ofcom—even a 10% global revenue fine—is just the cost of doing business. If Starmer wants to lead, he needs to stop asking for permission and start dictating the terms of entry into the UK market.
What Needs to Change Right Now
If the UK wants to be a "leader not a follower" in online safety, the government has to shift its focus from policing individual pieces of content to changing corporate conduct.
- Direct Liability: We need to stop fining the company and start prosecuting the executives. If a CEO knows their algorithm is serving suicide content to 13-year-olds and does nothing, they should face a jail cell, not just a line item on a balance sheet.
- Safety by Design: Regulation should focus on the mechanics. Ban "infinite scroll" for under-18s. Force "high-friction" age verification. If a platform can't prove it's safe for a child, it shouldn't be allowed to have children on it.
- Transparency Over Access: Cut the ministerial meetings. If a tech firm wants access to the heart of government, it should be contingent on full transparency regarding its internal safety data. No more secret algorithms.
The Online Safety Act is currently a "laissez-faire" model masquerading as a shield. It’s time for Starmer to decide if he’s the Prime Minister or a lobbyist’s best friend.
Don't wait for the government to fix this for you. Check your child's privacy settings today. Use third-party monitoring tools that actually work. And most importantly, keep the pressure on your MP. The only reason we have any laws at all is because families like the Russells refused to be ignored.
Stop believing the press releases. Watch the meeting logs. That’s where the real policy is made.
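For readers who want to act on that advice, departments publish quarterly "ministerial meetings" transparency releases on GOV.UK as spreadsheets, so the tallying can be automated. Here is a minimal sketch in Python (the `organisation` field name and the sample rows are illustrative assumptions; real release headers vary by department and quarter, and `csv.DictReader` over a downloaded file would produce rows of this shape):

```python
from collections import Counter

def count_meetings(rows, orgs):
    """Tally how many meeting rows mention each target organisation.

    rows: dicts with an 'organisation' field (assumed column name).
    orgs: organisation names to look for (matched case-insensitively).
    """
    tally = Counter({name: 0 for name in orgs})
    for row in rows:
        org_field = row.get("organisation", "").lower()
        for name in orgs:
            if name.lower() in org_field:
                tally[name] += 1
    return tally

# Illustrative sample rows, not real transparency data.
sample = [
    {"minister": "Secretary of State", "organisation": "Google UK"},
    {"minister": "Minister for Tech", "organisation": "Meta Platforms"},
    {"minister": "Secretary of State", "organisation": "Molly Rose Foundation"},
    {"minister": "Minister for Tech", "organisation": "Google DeepMind"},
]

print(count_meetings(sample, ["Google", "Meta", "Molly Rose"]))
# Counter({'Google': 2, 'Meta': 1, 'Molly Rose': 1})
```

Run against the real quarterly files, a script like this is what turns a press release into a checkable claim: the meeting counts are public, they are just rarely counted.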