Meta says that the Metaverse is where we’ll live in the future, but the graphics still leave much to be desired.
Nearly 20 years have passed since Linden Lab released Second Life, an early attempt at an immersive, multi-user universe where people could live, work, and make a lot of money. Two decades on, the promise first hinted at in Second Life is getting closer to coming true as the persistent digital world of the Metaverse makes its way into the mainstream.
The constant coverage and hype about the Metaverse would make the average person think that they will need to start making plans for a life where they are always wearing a VR headset.
The major problem with the Metaverse
If Mark Zuckerberg gets his way, a billion of us will be in the Metaverse by the decade’s end. Investment bank Citi says the metaverse industry will support an economy worth anywhere from $8 trillion to $13 trillion by the same date. McKinsey says that numbers like these have brought over $177 billion in investments to the Metaverse since the beginning of 2021.
The only problem is that the graphics of the platforms being billed as the future look about the same as, if not worse than, those of Second Life, which is 20 years old.
This week, when Meta announced that its metaverse platform Horizon Worlds would launch in France and Spain, the news was met with ridicule. Most of the criticism targeted CEO Mark Zuckerberg’s “dead-eyed” cartoon avatar, which had no legs. This led to a quick redesign.
Long-established big tech companies aren’t the only targets. Web3 metaverse platforms such as Decentraland have also been criticized for their graphics.
All of this makes me wonder why the graphics in the Metaverse are so bad.
There are several possible explanations, and different platforms offer different reasons depending on their graphical ambitions.
One of the biggest problems with the Metaverse right now is that rendering graphics in real time takes a lot of processing power and very fast internet speeds, which users don’t always have. Limits on graphics cards and broadband connections make it hard for metaverse platforms to show highly detailed graphics, so they often paint with a larger brush.
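The trade-off described above can be sketched as a simple level-of-detail decision: given how much GPU time a client has per frame and how much bandwidth it can count on, it picks a detail tier. This is an illustrative toy, not any real platform’s logic; the function name and the thresholds are assumptions.

```python
# Hypothetical sketch of how a metaverse client might trade visual
# detail for frame rate and bandwidth. Names and thresholds are
# illustrative assumptions, not taken from any real platform.

def pick_level_of_detail(gpu_budget_ms: float, bandwidth_mbps: float) -> str:
    """Choose a mesh/texture detail tier from the per-frame GPU time
    budget (milliseconds) and measured network bandwidth (Mbps)."""
    if gpu_budget_ms >= 16.6 and bandwidth_mbps >= 100:
        return "high"    # detailed meshes, realistic textures
    if gpu_budget_ms >= 8.0 and bandwidth_mbps >= 25:
        return "medium"  # simplified meshes, compressed textures
    return "low"         # the "larger brush": cartoonish avatars

print(pick_level_of_detail(20.0, 200.0))  # high
print(pick_level_of_detail(5.0, 10.0))    # low
```

Because the decision runs every session (or every frame), a platform targeting the widest possible audience ends up designing its art style around the "low" tier, which is one reason cartoonish avatars dominate.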
Metaverse platforms often have worse graphics than MMOs because they are meant to be much more open-ended. The Metaverse doesn’t just let users follow a set of pre-programmed paths the way games do; instead, it offers an effectively infinite number of options that can’t be pre-rendered and called up when needed.
There is also the argument that a metaverse that looks like a cartoon is better than one that looks almost like real life but has a few fatal flaws.
Video games already use the idea of the “uncanny valley,” which is when graphics are almost perfect but have one flaw that makes people feel uneasy. And in an environment where things are rendered in real-time and users can make almost any choice they want, there are just too many things that could go wrong and make people feel like they are in the “uncanny valley.”
The trouble with legs
When it comes to legs, the problem is tough to solve.
In February, Andrew Bosworth, then Meta’s vice president of Reality Labs and now its chief technology officer, told CNN Business that legs are “super hard and basically not workable just from a physics point of view with existing headsets.”
“It’s a hardware problem,” says Gijs Den Butter of SenseGlove, a Dutch company that makes haptic feedback gloves and other devices. “Right now, manufacturers have a headset with controllers or hand tracking, and our computer for the metaverse is just like that,” he says. “In its current state, it doesn’t have legs because the hardware can see your hands and maybe your arms and track those, but when you look forward, you can’t see your legs.”
That’s hard because body tracking algorithms that help figure out where you’re pointing in the Metaverse need information from body parts they can see, and if you stand up straight and look straight ahead, you can’t see your own legs. So, the Metaverse computers trying to make a digital copy of your body don’t have legs.
For the time being, this is less of a problem for crypto-based metaverses like Decentraland and The Sandbox, which mostly use browser- or desktop-based interfaces instead of fully immersive VR.
Weronika Marciniak, a Hong Kong-based metaverse designer at Future Is Meta, says, “It’s really Facebook/Meta and Microsoft—these immersive platforms—that don’t have avatars with legs.” “In most worlds, such as VRChat, Decentraland, Sandbox, and others, avatars have legs, but the sensors don’t always track legs.” These platforms get around the problem by “pretending,” or, as Marciniak corrects herself, “assuming the position of users’ legs.”
Den Butter says that a lack of processing power is not why major mainstream metaverse platforms don’t have legs. “All parts that move are basically made from a kinematic model, and legs are no different,” he says. “The math models for hands are pretty complicated, but for legs, there are only a few points to work with.”
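The “few points” Den Butter describes, combined with Marciniak’s “assuming the position of users’ legs,” can be illustrated with a toy kinematic guess: given only the headset’s tracked height, estimate where the hips and knees probably are. All proportions here are illustrative assumptions, not a real body-tracking algorithm.

```python
# Toy sketch of "assuming the position of users' legs" from the few
# points a headset actually tracks. The body proportions and the
# crouch threshold are made-up illustrative values.

def assume_leg_pose(head_height_m: float, standing_height_m: float) -> dict:
    """Guess hip/knee heights from the headset's current height.
    If the head has dropped well below the calibrated standing
    height, assume the user is crouching and bend the knees."""
    crouch = max(0.0, 1.0 - head_height_m / standing_height_m)
    hip = head_height_m * 0.55             # hips sit roughly mid-body
    knee = hip * (0.5 - 0.3 * crouch)      # knees bend as the user crouches
    return {
        "hip_height": round(hip, 2),
        "knee_height": round(knee, 2),
        "crouching": crouch > 0.15,
    }

print(assume_leg_pose(1.7, 1.7))  # headset at standing height: legs straight
print(assume_leg_pose(1.2, 1.7))  # headset dropped: assume a crouch
```

The point of the sketch is that the math is cheap; the hard part, as the article notes, is that the headset has no sensor data to check the guess against.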
He says that existing low-end hardware like an Azure Kinect or a Wii camera could process the relevant data points. This means sending and processing that data to render in the Metaverse, either locally or in the cloud, isn’t likely to create too much lag.
Instead, he and Marciniak blame the lack of legs on the hardware, specifically the fact that existing devices worn on the head don’t give enough visibility.
But that will probably change soon. In December 2021, Nike bought the virtual sneaker studio RTFKT. Marciniak thinks this could be the first step toward foot-based controllers that work alongside headsets. She thinks they might be working on real shoes or socks with sensors that connect to VR headsets.
Looking at it from another angle
Otherside, made by the same people who made Bored Ape Yacht Club, looks different from all the others. Built on Improbable’s M2 engine, Otherside looks like it is from the year 2022, which the creators say “is no small feat.”
Rob Whitehead, the co-founder of Improbable, says, “We don’t just give our partners a platform and walk away.” Instead, the company talks with partners about what they want from the Metaverse and builds toward it. He says, “There are some great projects, but they look like you took an app and tried to turn it into a metaverse. It looks cool, but what we do best is take games and make them more metaversal.”
Improbable spent a lot of time researching and developing its M2 engine so that it could render thousands of unique characters using machine learning techniques that push processing onto users’ GPUs instead of sending the data through the cloud. Whitehead says, “The problem is that if you double the number of people in a dense space, you have to send four times as much data.”
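Whitehead’s “double the people, four times the data” remark is the familiar quadratic cost of dense spaces: if every avatar’s state updates must reach every other avatar, traffic grows with the square of the crowd. A back-of-envelope sketch (illustrative only, not Improbable’s actual networking model):

```python
# Back-of-envelope illustration of Whitehead's point: in a dense space
# where every avatar's updates must reach every other avatar, traffic
# grows quadratically -- double the crowd, roughly 4x the data.

def messages_per_tick(n_users: int) -> int:
    """Each of n users sends a state update to the other n - 1 users."""
    return n_users * (n_users - 1)

print(messages_per_tick(100))  # 9900
print(messages_per_tick(200))  # 39800 -- about 4x the traffic for 2x the crowd
```

This is why pushing per-character work onto users’ GPUs, rather than routing everything through the cloud, matters most precisely in the crowded scenes a metaverse is supposed to enable.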
Whether other metaverses will change how they handle visuals is a different question, but it will become more pressing if the Metaverse becomes as popular as its supporters hope. That remains to be seen.