A groundbreaking report by the NSPCC, in collaboration with PA Consulting, has found that popular social media, messaging, and gaming platforms are systematically failing to protect girls from online harm. From profile creation to day‑to‑day engagement, girls under 18 are “vulnerable to grooming, abuse and harassment” at every turn.

Using fake profiles of teenage girls across ten leading platforms, researchers discovered that site design features—like prompts to expand social networks—actively encourage risky interactions. These features make it easy for adult strangers to identify young girls and initiate unsolicited contact.

Key Statistics

The charity surveyed 3,593 adults via YouGov, revealing that 86% believe tech companies are not doing enough to protect under‑18s online. Parents of girls aged 4–17 ranked their top concerns as:

  • contact from strangers (41%)
  • online grooming (40%)
  • bullying from peers (37%)
  • sexual harassment (36%)

Over half (52%) admitted being significantly worried about their daughter’s digital experiences.

Real Life Stories

One 15‑year‑old who reached out to Childline described receiving unsolicited naked images from multiple strangers. She said:

“I’ve been sent lots of inappropriate images online recently, like pictures of naked people that I don’t want to see. At first I thought they were coming from just one person, so I blocked them. But then I realised the stuff was coming from loads of random people I don’t know. I’m going to try and disable ways people can add me, so hopefully I’ll stop getting this stuff.”

Rani Govender, NSPCC Policy Manager for Child Safety Online, said:

“Parents are absolutely right to be concerned about the risks their daughters are being exposed to online, with this research making it crystal clear that tech companies are not doing nearly enough to create age-appropriate experiences for girls.

“We know both on and offline girls face disproportionate risks of harassment, sexual abuse, and exploitation. That’s why it’s so worrying that these platforms are fundamentally unsafe by design – employing features and dark patterns that are putting girls in potentially dangerous situations.

“There needs to be a complete overhaul of how these platforms are built. This requires tech companies and Ofcom to step up and address how poor design can lead to unsafe spaces for girls.

“At the same time Government must lay out in their upcoming Violence against Women and Girls (VAWG) Strategy steps to help prevent child sexual offences and tackle the design failures of social media companies that put girls in harm’s way.”

What Can Be Done?

In response, the NSPCC is urging both tech firms and the UK regulator Ofcom to implement a range of solutions:

  • Abusability studies: Evaluate new features for gendered risks before rollout
  • Screenshot reporting: Make it easier to report inappropriate messages and detect identifiable info
  • “Cooling‑off” periods: Enforce limited communications when new connections are made
  • Restricting adult contact: Implement stricter controls on video calls from non‑trusted adults

The NSPCC also wants Ofcom to fill gaps in its Illegal Harms Codes by developing best‑practice safety guidance tailored by age and ensuring enforcement under the Online Safety Act. With these changes, tech platforms could better safeguard girls from persistent, gender‑based online threats. Without urgent action, the report warns, the digital world remains a perilous space for young girls.

Read more about this on the NSPCC website, and check out our Safeguarding blog for more articles like this.