Smartphone “Block”: New NSPCC Warning to Parents Over Rising Online Threats

March 17, 2026

A major intervention from the NSPCC this morning, Tuesday 17th March 2026, has sent shockwaves through the UK’s parenting community. Following a series of landmark remarks by Ofcom Chief Executive Melanie Dawes at an NSPCC event, the charity is demanding a “culture shift” in how tech giants design smartphones and apps for children.

The warning comes as new data highlights a staggering reality: despite minimum age requirements, 72% of children aged 8–12 are active on social media platforms not designed for them.

The 48-Hour “Intimate Image” Deadline

One of the most significant updates today is the government’s move to legally compel platforms to remove non-consensual intimate images within 48 hours. This follows a surge in “sextortion” cases—where criminal gangs trick teenagers into sending photos and then blackmail them—a trend that has tragically led to several teen suicides in the UK over the past year.

NSPCC CEO Chris Sherwood has blasted tech companies for “knowingly putting children in harm’s way,” arguing that if self-regulation doesn’t improve immediately, a total social media ban for under-16s may be the only solution.

The New Demands for Tech Giants

Ofcom has today issued a formal ultimatum to major platforms like TikTok, Instagram, and Snapchat, giving them until 30 April to prove they are taking action in four key areas:

  1. Highly Effective Age Checks: Moving beyond simple “self-declaration” of age, which children easily bypass.
  2. Failsafe Grooming Protections: Implementing strict defaults that prevent strangers from contacting children.
  3. Algorithmic Safety: Reining in “personalised feeds” that are currently the main pathway for children to encounter harmful, addictive, or violent content.
  4. No More “Product Testing” on Kids: Ensuring new AI tools and features are risk-assessed before they are deployed to younger users.

The “Nudification” App Crackdown

In a direct response to the rise of AI-generated deepfakes in UK schools, the government has also confirmed a new offence in the Crime and Policing Bill, making it illegal for companies to supply “nudification apps”—AI tools specifically designed to create non-consensual sexualised images.

What UK Parents Can Do Today

While the law is catching up, the NSPCC is urging parents to take three immediate steps:

  • Enable Parental Controls: Don’t rely on apps’ default settings. Use built-in OS tools (like Apple’s Screen Time or Google Family Link) to restrict age-inappropriate content.
  • The VPN Warning: Be aware that children increasingly use VPNs to bypass home Wi-Fi filters and age gates. Check your child’s device for unfamiliar apps that could be masking their browsing activity.
  • Open the Conversation: 1 in 10 UK parents say their child has already been blackmailed online. The NSPCC advises having regular “safety chats” so children feel they can speak up without the fear of having their phone confiscated.

Safety and Legal Disclaimer

This article is for informational purposes only. Online safety laws and platform terms of service are subject to frequent change. If you are concerned about a child’s safety online, you can contact the NSPCC Helpline on 0808 800 5000 or Childline on 0800 1111. This text does not replace professional legal or safeguarding advice.