British police are reportedly investigating the sexual abuse of a child’s avatar in the metaverse – prompting the NSPCC to warn that tech firms must do more to protect young users.
Online abuse is linked with physical abuse in the real world and can have a devastating impact on victims, the charity’s campaigners said.
The comments were made in response to a report published by Mail Online that officers are investigating a case in which a young girl’s digital persona was sexually attacked by a gang of adult men in an immersive video game.
It is thought to be the first investigation of a sexual offence in virtual reality by a UK police force.
The report said the victim, a girl under the age of 16, was traumatised by the experience, which took place while she was wearing a virtual reality headset.
The metaverse is a 3D model of the internet where users exist and interact as avatars – digital versions of themselves that they create and control.
About 21% of children aged between five and 10 had a virtual reality (VR) headset of their own in 2022 – and 6% regularly engaged in virtual reality, according to the latest figures published by the Institution of Engineering and Technology.
Richard Collard, associate head of child safety online policy at the NSPCC, said: “Online sexual abuse has a devastating impact on children – and in immersive environments where senses are intensified, harm can be experienced in very similar ways to the ‘real world’.”
He added that tech companies are rolling out products at pace without prioritising the safety of children on their platforms.
“Companies must act now and step up their efforts to protect children from abuse in virtual reality spaces,” Mr Collard said.
“It is crucial that tech firms can see and understand the harm taking place on their services and law enforcement have access to all the evidence and resources required to safeguard children.”
In a report published in September, the NSPCC urged the government to provide guidance and funding for officers dealing with offences that occur in virtual reality.
The charity also called for the Online Safety Act to be regularly reviewed to make sure emerging harms are covered under the law.
Ian Critchley, who leads on child protection and abuse for the National Police Chiefs’ Council, said that the grooming tactics used by offenders are always evolving.
He added: “This is why our collective fight against predators, such as those in this case, is essential to ensuring young people are protected online and can use technology safely without threat or fear.
“The passing of the Online Safety Act is instrumental to this, and we must see much more action from tech companies to make their platforms safe places.”
The act, which passed through parliament last year, will give regulators the power to sanction social media companies over content published on their platforms, but its provisions have not yet come into force.
Ofcom, the communications regulator, is still drawing up its guidelines on how the rules will work in practice.
A spokesperson for Meta, which owns Facebook and Instagram and operates a metaverse platform, said: “The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you.
“Though we weren’t given any details about what happened ahead of this story publishing, we will look into it as details become available to us.”