9 October 2022

How edge computing will support the metaverse


Andy Wolber

When people point to an example of the metaverse, expect edge computing to be nearby. That single statement captures the relationship between these two sometimes amorphous concepts.

Importantly, the metaverse more closely resembles virtual reality than augmented reality. AR applications add information to your environment: An arrow to indicate direction, text to label or describe, or a button to access additional information. VR systems (Jaron Lanier popularized the term virtual reality in 1987) supplant your surrounding environment with a simulated one. You might think of the metaverse as a comprehensive, coordinated network of various VR environments.

What is the metaverse?

Science fiction stories, simulators and immersive games offer the most vibrant visions of virtual worlds. The holodeck depicted on Star Trek encapsulates the experience well: Choose an environment, open a door and enter a virtual world created and managed by a hidden computer.

The fictional holodeck provides the ship’s crew members with an immersive experience that includes encounters with computer-devised characters in settings as richly developed as the rest of the show. Fans familiar with Avatar, Neuromancer, Ready Player One or the Matrix films may similarly recognize variations on the metaverse theme.

Simulators and games suggest the potential for virtual worlds. Formula 1 drivers have for years used simulations to learn and practice race routes, since each track presents a different sequence of straightaways and turns. Pilots who fly virtual planes in Microsoft Flight Simulator get a bit of added reality, since the system can draw on real-world weather data. And anyone who has played Minecraft or any massively multiplayer online role-playing game has been exposed to the potential of a persistent virtual environment.

The concept of the modern metaverse as presented by most corporations, though, remains limited, likely due to each company’s desire for centralization and control. For example, Meta, the company formerly known as Facebook, envisions the metaverse as the next generation of social networking, with the entire environment managed and maintained by the company.

This sort of centralized environment, controlled by a single company, isn’t conceptually all that different from Linden Lab’s Second Life, which launched back in 2003. Yes, bandwidth, graphics and features differ, but that doesn’t change the fact that the company mediates the experience.

A desirable metaverse will likely be decentralized, community-defined and community-controlled. Much as internet networking standards made it possible to connect previously isolated communities of computers, an open metaverse would allow personas and virtual property to persist across all metaverse platforms.

Adobe, EA, Epic Games, Google, Meta, Microsoft, NVIDIA and many other companies are members of the Metaverse Standards Forum, which seeks to facilitate interoperability among corporate-controlled platforms. However, other initiatives, such as the Open Metaverse Alliance and the Open Metaverse Conference, seek to establish a more open, user-controlled metaverse. The conference is notably embraced by Neal Stephenson, who coined the term metaverse in his 1992 novel, Snow Crash.

How might edge computing affect the metaverse?

The computing power necessary to deliver a virtual world can be significant, since the system needs not only to track various objects, characters and environmental effects but also to adapt the display as any or all of these move through virtual space. The calculations required increase as more people pack into the same virtual space, such as when many people meet in a large conference hall for a lecture or a concert hall for a performance. Image resolution and detail tend to degrade when systems hit peak processing limits.
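
To get a feel for the scaling, consider a naive model in which every participant’s position update is sent to every other participant in the same space. This Python sketch (a worst case, since real systems batch and cull updates) shows how quickly the per-tick message count grows:

    # Naive broadcast model: each participant's state update is sent to
    # every other participant in the same virtual space, so the per-tick
    # message count grows quadratically with attendance.
    for n in (10, 100, 1000):
        print(f"{n:>5} participants -> {n * (n - 1):>9,} messages per tick")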

Edge computing supports the metaverse by minimizing network latency, reducing bandwidth demands and storing significant data locally. Edge computing, in this context, means compute and storage power placed closer to a metaverse participant, rather than in a conventional cloud data center.
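
In practice, “closer” usually means “lowest round-trip time.” A client might probe a few candidate nodes and attach to the fastest one. Here is a minimal sketch, with made-up hostnames standing in for a provider’s real node directory:

    import socket
    import time

    # Made-up candidate edge nodes; a real client would discover these
    # through the platform's directory service.
    CANDIDATES = ["edge-a.example.net", "edge-b.example.net", "edge-c.example.net"]

    def measure_rtt(host: str, port: int = 443, timeout: float = 1.0) -> float:
        """Time one TCP handshake as a rough round-trip estimate."""
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=timeout):
            pass
        return time.monotonic() - start

    def nearest_node(hosts: list[str]) -> str | None:
        """Return the reachable candidate with the lowest measured RTT."""
        timings = {}
        for host in hosts:
            try:
                timings[host] = measure_rtt(host)
            except OSError:
                continue  # unreachable candidate; skip it
        return min(timings, key=timings.get) if timings else None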

Latency increases with distance, at least for any system bound by the speed of light, which includes all current computing and networking technologies. Quantum entanglement experiments are sometimes cited as a way around that limit, but entanglement can’t be used to transmit information faster than light, so distance remains a hard floor on latency.
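
A back-of-the-envelope calculation shows the scale involved. Light in optical fiber travels at roughly two-thirds of its vacuum speed, about 200 kilometers per millisecond, so distance alone sets a floor on round-trip time before any processing happens. The distances below are illustrative:

    # Light in optical fiber covers roughly 200 km per millisecond
    # (about two-thirds of its vacuum speed). Real round trips add
    # routing, queuing and processing delay on top of this floor.
    FIBER_KM_PER_MS = 200.0

    def round_trip_ms(distance_km: float) -> float:
        """Minimum round-trip propagation time over fiber, in milliseconds."""
        return 2 * distance_km / FIBER_KM_PER_MS

    print(round_trip_ms(2000))  # distant cloud region: 20.0 ms minimum
    print(round_trip_ms(20))    # nearby edge node:      0.2 ms minimum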

In a virtual world, you experience latency as lag: A character might appear to hesitate a bit as it moves. Inconsistent latency, known as jitter, produces movement that appears jerky or communication that varies in speed. Lower latency, in general, means smoother movement.

Edge computing can also help reduce bandwidth demands, since calculations get handled by an on-site system or one nearby, rather than at a remote location. Much as a graphics card works in tandem with a CPU to handle calculations and render images with less stress on the CPU, an edge computing architecture moves calculations closer to the metaverse participant. Edge computing systems generally work in conjunction with a cloud system when network connections are available.
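
A common arrangement is edge-first with cloud fallback: send work to the nearby node, and only reach out to the distant data center when the edge tier can’t respond. A minimal sketch, with hypothetical endpoints (neither URL is real):

    import urllib.request

    # Hypothetical compute tiers, nearest first; these URLs stand in
    # for whatever a real metaverse platform actually exposes.
    TIERS = [
        ("http://edge.local/render", 0.05),         # nearby node, tight timeout
        ("https://cloud.example.com/render", 2.0),  # distant region, looser timeout
    ]

    def render_scene(payload: bytes) -> bytes:
        """Send work to the closest responsive tier, edge first."""
        for url, timeout in TIERS:
            try:
                with urllib.request.urlopen(url, data=payload, timeout=timeout) as resp:
                    return resp.read()
            except OSError:
                continue  # this tier is unreachable; fall back to the next
        raise RuntimeError("no compute tier reachable")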

Similarly, edge computing may leverage local storage to enhance metaverse performance. For example, much as a smart mapping system might load information about both your local area and recently referenced sites, an edge computing system might store the most relevant content and leave less-likely-to-be-accessed data elsewhere.
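
In caching terms, that means keeping the hottest assets on the edge node and evicting the rest as space runs out. Here is a minimal least-recently-used cache sketch; fetch_from_cloud is a hypothetical stand-in for a real asset store:

    from collections import OrderedDict

    class AssetCache:
        """Least-recently-used cache for virtual-world assets on an edge node."""

        def __init__(self, capacity: int):
            self.capacity = capacity
            self._items: OrderedDict[str, bytes] = OrderedDict()

        def get(self, asset_id: str, fetch_from_cloud) -> bytes:
            if asset_id in self._items:
                self._items.move_to_end(asset_id)  # mark as recently used
                return self._items[asset_id]
            data = fetch_from_cloud(asset_id)      # slow path: pull from the cloud
            self._items[asset_id] = data
            if len(self._items) > self.capacity:
                self._items.popitem(last=False)    # evict the least recently used asset
            return data
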
What do you think?

How often do you rely on AR or VR systems? Have you spent any significant time in virtual environments? What has your experience been? Do you think any of these systems—AR, VR or the metaverse—are likely to achieve broad usage and adoption soon? Or do you think they’ll remain limited to the domain of a few niche enterprise or entertainment applications? Mention or message me on Twitter (@awolber) to let me know what you think about AR, VR or the metaverse.
