The Apple Vision Pro is the latest tool for Australians to access three-dimensional, immersive online environments, also known as the metaverse. Released in Australia earlier this month, it allows users to take a (virtual) walk in the savanna from their living room, watch sports matches in an immersive environment, or even buy a house after completing a virtual inspection.
But these new online environments also have the potential to enable new crimes.
Metacrimes are those crimes occurring in the metaverse. They challenge our definitions of crime in the digital realm, because they do not fit into existing frameworks for reporting and investigating crime.
Our research tackles this problem by shedding light on the key characteristics of metacrimes. By understanding these crimes better, we will be better able to fight them.
Metaverse, metacrime and cybercrime
The metaverse is a loose term describing a kind of three-dimensional, virtual world that users access via a virtual reality headset.
The 2018 movie Ready Player One provides a good visualisation of what the metaverse might look like. In the movie, people put on special goggles and pick their avatar to enter a massive, interactive digital universe where they can do almost anything.
Our research found crimes committed in the metaverse have commonalities with conventional cybercrime. For example, both involve different kinds of illegal activities happening online or in virtual spaces. As technology improves, these crimes are also becoming more global and anonymous. This makes it nearly impossible to catch the perpetrators.
But we also found a number of metacrime features that do not overlap with conventional cybercrime.
The unique features of metacrimes
One such feature is immersive VR attacks, which are made to feel real through immersion.
Immersion is created through a number of sensory techniques in the headset, including visual, sound and haptic (touch). This creates a feeling of spatial presence that allows the user to perceive and experience the virtual space as real. This means negative experiences such as sexual violence and harassment also feel real.
Unless you are constantly recording your interactions in the metaverse via your headset, crucial evidence of that unpleasant interaction would not be captured. Some companies have created user controls, such as a personal boundary that can be activated around your avatar. However, we do not yet have sufficient research to know whether these are effective.
Our study argues the impact of metacrimes will also be exacerbated for vulnerable populations, especially children, who make up a large proportion of active metaverse users. Difficulties in verifying children’s age online add extra concerns about grooming and the abuse of minors.
These risks are not hypothetical.
In 2022, researchers from the Center for Countering Digital Hate recorded 11 hours and 30 minutes of user interactions on Meta’s Oculus headset in the popular app VRChat. They found that users, including children, encountered abusive behaviour approximately every seven minutes.
Bullying and sexual harassment were also rife, and minors were often manipulated into using racist slurs and promoting extremist ideas.
In January 2024, police in the United Kingdom launched the first case of rape in the metaverse after a 16-year-old girl’s avatar was attacked. Police said the victim suffered psychological and emotional trauma similar to an attack in the physical world.
The outcomes of the case are currently pending and are likely to set a legal precedent for the protection of minors in the metaverse. At the moment, metacrime presents new challenges in defining, measuring and pursuing avatars’ liability that conventional cybercrime does not usually confront.
We also found other risks, including hacking and the recording of a person’s environment. Manipulation of VR technologies, such as haptic suits that enable users to physically engage with virtual spaces, can also enable perpetrators to inflict direct physical harm on users.
Where to from here?
Major tech companies such as Apple, Meta and Microsoft are investing heavily in the metaverse, developing both hardware and software to enhance their platforms. Research firm Gartner predicts that by 2026, 25% of people will spend at least an hour each day in the metaverse for work, shopping, education, social media and entertainment.
This prediction may not be too far from reality. In Australia’s eSafety Commissioner’s national online safety survey conducted in 2022, 49% of metaverse users said they had entered the metaverse at least once a month in the past year.
It is therefore urgent that governments and tech companies develop metaverse-specific legal and regulatory frameworks to safeguard immersive virtual environments. ³Ô¹ÏÍøÕ¾ and international legal frameworks will need to account for the new characteristics of metacrime we have identified. Law enforcement will need to build capacity in metacrime reporting and investigations.
In the past, companies have talked about using new technologies responsibly, but have fallen silent when their platforms were used for crimes and harms. Instead, tech leaders deploy what researchers describe as hollow apologies (for example, “I’m sorry you experienced this on our platform”).
But this does nothing tangible to tackle the problem. Metaverse companies should establish clear regulatory frameworks for their virtual environments to make them safe for everyone to inhabit.
If this article has raised issues for you, or if you’re concerned about someone you know, call Lifeline on 13 11 14.
The ³Ô¹ÏÍøÕ¾ Sexual Assault, Family and Domestic Violence Counselling Line – 1800 RESPECT (1800 737 732) – is available 24 hours a day, seven days a week for any Australian who has experienced, or is at risk of, family and domestic violence and/or sexual assault.