The metaverse could be a problem for children’s mental health
If tech companies, retailers, content creators, and investors can all agree on one thing, it’s this: there’s a lot of money to be made in the metaverse.
But as CEOs scramble past their competitors to gain a foothold in the fledgling digital space, some psychologists and mental health experts say the race to profit is distracting attention from a crucial question: will the metaverse be a safe place, especially for kids and teens?
The answer is not encouraging. Recent research has shown myriad negative effects of social media on the psyche of children and young people, from the prevalence of bullying and harassment to self-esteem and body image issues. The same pitfalls might be just as prevalent – if not worse – in the wide-open metaverse, with its array of vast virtual worlds designed for both work and play.
But if tech companies take these concerns seriously from the start and build solutions into their metaverse products, they could actually benefit children’s mental health, some experts say.
“All these new tools and all these new opportunities could be used for good or ill,” Mitch Prinstein, a clinical psychologist who serves as the chief science officer for the American Psychological Association, tells CNBC Make It.
Potentially worse than social media
Today’s social media platforms are already dangerous for some kids and teens. The level of virtual reality immersion could make these problems worse, says Albert “Skip” Rizzo, a psychologist who serves as director of medical virtual reality at the USC Institute for Creative Technologies.
“Being immersed in a world is different than observing and interacting … on a flat screen,” says Rizzo. “Once you’re actually embodied in a space, even if you can’t be physically touched, you can be exposed to things that take on a level of realism that could be psychologically offensive.”
Another problem with using 3D digital avatars in the metaverse is that the ability to modify your likeness to project a version of yourself that’s different from real life “can be pretty dangerous for teenagers,” says Prinstein.
“You are what other people think of you when you’re young,” he says. “And the idea of fictionalizing their identity and getting very different feedback can really mess up a teenager’s identity.”
Prinstein worries that tech companies are targeting their social media and metaverse platforms at this highly impressionable demographic – during a key period of mental and emotional brain development – with potentially dire consequences.
“This is just an aggravation of the problems we’re already seeing with the impact of social media,” he says. “It creates more loneliness. It creates far more body image concerns [and] exposure to dangerous content associated with suicidal tendencies.”
Some problems are already here
In December, Meta launched a social virtual reality platform, Horizon Worlds. Last March, Microsoft introduced a cloud collaboration service for virtual 3D business meetings. Other companies like Roblox and Epic Games are gaining a foothold in the metaverse through popular online games.
One such game, VRChat, is already showing signs of danger for young users. In December, an investigation by the nonprofit Center for Countering Digital Hate (CCDH) found that minors were regularly subjected to sexual content, racist and violent language, bullying, and other forms of harassment on VRChat’s platform, which is typically accessed through Meta’s Oculus headsets.
Meta and Oculus have policies prohibiting this type of negative behavior on their VR platforms. When asked for comment, a Meta spokesperson referred CNBC Make It to the company’s previous statements about trying to “responsibly” build a metaverse and the Oculus platform’s tools for reporting abuse and blocking other users. VRChat did not immediately respond to CNBC Make It’s request for comment.
That’s part of the problem, says CCDH CEO Imran Ahmed: safety policies, no matter how well-intentioned, can be difficult to monitor and enforce in virtual spaces.
“Virtual reality really needs a lot of safety built in from the start, because you can’t scan [the metaverse] for hate or sexual abuse,” he says. “That won’t work. It happens in a moment [and] you can’t do anything about it.”
Ahmed’s prediction: Parents will be wary of their children’s access to the metaverse. “I think parents will be asking, ‘Do I feel safe knowing that Mark Zuckerberg is the guy responsible for deciding who influences my kids, who might bully them, and whether they’re safe in cyberspace or not?’” he says.
“They are motivated to make profits”
The irony is that virtual reality and the metaverse show promise for improving users’ mental health. Rizzo’s research at USC, for example, shows the potential for virtual reality treatments to promote empathy in patients and help with issues like mental trauma and PTSD.
But Rizzo and Prinstein agree that the responsibility lies with tech companies to prioritize the safety of their users over their own incentive to make profits.
Ahmed says tech companies could deploy tools to keep the metaverse safe for young users, including strict age verification to prevent predators from impersonating younger users, as well as plenty of content moderators and quick responses when users report inappropriate behavior.
“There is no reason why there couldn’t be moderators in rooms where children are present [or] virtual chaperones,” he says. “But of course that would cost money.”