Nine out of 10 children think tech firms should be responsible for protecting them from violent, sexual and inappropriate online content, according to an NSPCC survey.
The charity’s research, which polled more than 2,000 children aged 11 to 16, also suggests youngsters overwhelmingly think social media platforms should have tools to help them remove posts from the internet.
The NSPCC published the survey's findings ahead of presenting a petition to the Government on Monday calling for a statutory regulator to force tech companies to improve child protection on their platforms.
The petition has almost 46,000 signatures.
Peter Wanless, the charity’s chief executive, has called on the next prime minister to prioritise improving online safety for children.
He said: “Children themselves want to go online without the fear of seeing graphic and disturbing material and being vulnerable to abuse.”
More than 90% of the survey’s respondents said social media platforms should protect them from bullying and content about self-harm and suicide.
Ruth Moss, whose daughter Sophie took her own life at the age of 13 after looking at self-harm and suicide content on social media, said: “Children are protected by legislation in so many aspects of life, including traditional media.
“We would be horrified if our children were exposed to abuse or damaging imagery in films, television or the press, so why should the internet and social media be any different?”
More than eight in 10 children think social media platforms should make it easier for them to take down posts.
A similar number also want platforms to prioritise requests from children to remove content and make it harder to share screenshots.
The survey, which was carried out by the market research company ComRes on behalf of the charity, found 90% of respondents had a social media account, with more than half of those using Facebook, Instagram, WhatsApp, YouTube and Snapchat.