Pedophiles start using virtual reality headsets to view child abuse images, data shows

Pedophiles have started using virtual reality headsets to view images of child abuse, according to police figures.

Use of the technology was recorded in eight cases in 2021/22 – the first time it has been specifically mentioned in a crime report.

During this period, police recorded 30,925 offenses involving pornographic images of children – the highest annual total on record for England and Wales.

Of these, 9,888 were recorded on social media or gaming sites — including 4,293 on Snapchat, 1,361 on Facebook, 1,363 on Instagram and 547 on WhatsApp.

The NSPCC, which collated the data, has called for a number of amendments to the Online Safety Act to prevent more children from being abused.

Sir Peter Wanless, chief executive of the NSPCC, said: “These new figures are incredibly shocking but only reflect the tip of the iceberg of what children experience online.

“We hear from young people who feel powerless and let down as online sexual abuse threatens to become the norm for a generation of children.

“By creating a child safety advocate who stands up for children and families, the government can ensure that the Online Safety Act systematically prevents abuse.”

Read more:
NSPCC’s Childline reports 45% increase in boys experiencing online sexual abuse
Child abuse investigation: Turning a blind eye should be illegal
Mother reveals trauma of court delays after ex-husband sexually abused daughter

Image: Snapchat is popular with young people

The NSPCC also wants the law changed so that senior executives of social media sites can be held criminally liable if children are abused on their platforms.

Sir Peter said: “It would be inexcusable if, five years from now, we were still playing catch-up with the widespread abuse that is rampant on social media.”

A government spokesman said: “Protecting children is at the heart of the Online Safety Act and we have taken strong, world-leading measures to achieve this, while ensuring the interests of children and families are represented through the Children’s Commissioner.

“Virtual reality platforms are within scope and will be forced to protect children from exploitation and remove vile child abuse content.

“If companies fail to handle this material effectively, they face substantial fines and possible criminal sanctions against their senior managers.”

A spokesman for Meta, which owns Facebook, Instagram and WhatsApp, said: “This horrific content is banned from our apps and we report incidents of child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC).

“We are an industry leader in developing and using technology to prevent and remove this type of content, and we work with police, child safety experts and industry partners to address this societal issue.

“Our work in this area is never done, and we will continue to do everything we can to keep this content off our apps.”

A Snapchat spokesperson said: “Any sexual abuse of children is abhorrent and illegal. Snap has dedicated teams around the world working closely with police, experts and industry partners to combat this practice.

“If we proactively detect or learn of any pornographic content that exploits minors, we immediately remove it, delete accounts, and report offenders to the authorities. Snapchat has extra protections in place that make it harder for young users to be discovered and contacted by strangers.”

Image: Roxy Longworth

“I couldn’t control it”

When Roxy Longworth was 13, a 17-year-old boy she didn’t know contacted her on Facebook and forced her to send pictures via Snapchat.

She said it left her feeling alone and filled with guilt, and soon one of his friends started using the pictures to pressure her for more explicit images.

“My whole life became about doing what he told me and hiding it from everyone,” Roxy said. “It became clear that the more photos he had, the more he could blackmail me, until eventually he asked me to send a video. By that point he and his friends had me completely; I couldn’t control it.”

This had a devastating effect on her mental health.

“The stigma of it buried me,” she said. “I became very unwell. I self-harmed a lot, I stopped sleeping, and I was eventually hospitalized with a psychotic episode. I was on suicide watch for about a year.”

She wrote a book called When You Lose It as a way of coming to terms with what happened, but she says knowing the pictures still exist continues to haunt her.

Roxy added: “It’s like a creepy feeling that you’re trying to forget, and then you realize those pictures are still there.

“They’re in group chats with hundreds of people, they’re everywhere.

“The thing is – those pictures are of a 13-year-old girl. It’s messy. It’s disgusting.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK.
