When people point to an example of the metaverse, expect edge computing to be nearby. That single statement captures the relationship between these two sometimes amorphous concepts.
Importantly, the metaverse more closely resembles virtual reality than augmented reality. AR applications add information to your environment: An arrow to indicate direction, text to label or describe an object, or a button to access additional information. VR systems, as conceived by Jaron Lanier, who popularized the term in 1987, supplant your surrounding environment with a simulated one. You might think of the metaverse as a comprehensive, coordinated network of various VR environments.
What is the metaverse?
Science fiction stories, simulators and immersive game environments offer the most vibrant visions of virtual environments. A holodeck as depicted on Star Trek encapsulates the experience well: Choose an environment, open a door and enter a virtual world created and managed by a hidden computer.
The fictional holodeck provides the ship’s crew members with an immersive experience that includes encounters with computer-devised characters in settings as richly developed as the rest of the show. Fans familiar with Avatar, Neuromancer, Ready Player One or the Matrix films will recognize variations on the metaverse theme.
SEE: Don’t curb your enthusiasm: Trends and challenges in edge computing (TechRepublic)
Simulators and games suggest the potential for virtual worlds. Formula 1 drivers have for years used simulations to learn and practice race routes, since each track presents a different sequence of straightaways and turns. Pilots who fly virtual planes in Microsoft Flight Simulator get a bit of added reality, since the system can draw on real-world weather data. And anyone who has played Minecraft or any massively multiplayer online role-playing game has been exposed to the potential of a persistent virtual environment.