Building The Metaverse With Open Source

Ensuring that virtual worlds are open, safe, and accessible to all will be paramount to a successful metaverse.

The term “metaverse” has been thrown around a lot lately. Whether you believe it is a reality or not, the adoption of the term signals a significant shift in how people think about the future of our online interactions. With today’s technological advancements and the increase in geographically distributed social circles, the concept of seamlessly connected virtual worlds being part of a global metaverse has never felt more appealing.

Virtual worlds enable a range of different scenarios and bring to life an array of rich and vibrant experiences. Students can explore history by stepping into another time period, embodying historical figures, and interacting with structures built centuries ago. Coworkers can meet virtually for coffee chats, no matter where in the world they are working. Musicians and artists can interact with their fans remotely in small or large digital venues. Conferences can reach new audiences, and friends and family can connect to explore interactive spaces.

Traditional virtual world platforms (the predecessors of today’s metaverse applications) were built with limited access to powerful graphics hardware, high-bandwidth network infrastructure, and scalable servers. Recent advancements in hardware optimization and cloud computing, however, have allowed virtual worlds to reach new audiences, and the complexity of what we are able to simulate has increased significantly.

Today, there are many companies investing in online virtual worlds and technologies. This indicates a fundamental shift in how people interact with each other, create, and consume content online.

Some tenets of the metaverse are familiar from traditional web2 platforms, including identity systems, social networks, communication protocols, and online economies, while other elements are newer. The metaverse is starting to see a proliferation of 3D environments (many of which are created and shared by users), the use of avatars, and the incorporation of virtual reality (VR) and augmented reality (AR).

Building Virtual Worlds The Open Source Way

This shift in computing paradigms brings opportunities to drive forward open standards and projects that encourage the creation of decentralized, distributed, and interoperable virtual worlds. This can start at the hardware level, with projects such as Razer’s open-source virtual reality (OSVR) schematics allowing for experimentation in headset development, and continue all the way up the stack. At the device layer, the Khronos Group’s OpenXR standard has already been widely adopted by headset manufacturers. It allows applications and engines to target a single API, with device-specific capabilities supported through extensions.

These tools allow creators and developers of virtual worlds to concentrate on mechanics and content. Although the techniques used to build 3D experiences are not new, the increased interest in the metaverse has resulted in the production of new tools and engines for creating immersive experiences. Many libraries and engines have variations in how they run their virtual worlds, but most of them will share the same underlying development concepts.

At the core of every virtual world is a 3D graphics and simulation engine (such as Babylon.js, which renders through the browser’s WebGL API). This code is responsible for managing the state of the world (so that interactions that manipulate that state are shared between visitors) and for drawing updates to the environment on screen. Simulation state can include object and avatar movement, so that when a user moves through a space, others can see it happening in real time. The rendering engine uses the perspective of a virtual camera to draw 2D images on the screen, mapped to what a user is looking at in digital space.
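As a rough sketch of what this looks like in practice, the following TypeScript snippet uses Babylon.js to create an engine, a scene, a camera, and a render loop. It assumes the page provides a canvas element with the id "renderCanvas", and the spinning box is purely illustrative state:

```typescript
import { Engine, Scene, FreeCamera, HemisphericLight, MeshBuilder, Vector3 } from "@babylonjs/core";

// The canvas the page provides (assumed id: "renderCanvas").
const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;

// The engine wraps WebGL; the scene holds the world's simulation state.
const engine = new Engine(canvas, true);
const scene = new Scene(engine);

// A virtual camera defines the viewpoint that gets rendered each frame.
const camera = new FreeCamera("camera", new Vector3(0, 1.6, -5), scene);
camera.setTarget(Vector3.Zero());
camera.attachControl(canvas, true);

// A light and a simple object give the renderer something to draw.
new HemisphericLight("light", new Vector3(0, 1, 0), scene);
const box = MeshBuilder.CreateBox("box", { size: 1 }, scene);

// The render loop redraws the scene many times per second, reflecting
// any changes to the world's state (movement, interactions) on screen.
engine.runRenderLoop(() => {
  box.rotation.y += 0.01; // a small piece of state changing over time
  scene.render();
});
```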

A virtual world consists of 2D and 3D objects that represent a digital location. Experiences can vary, ranging from simple rooms to entire planets, limited only by the creators’ imaginations. Inside virtual worlds, objects have transforms that place them at a particular spot in the world’s 3D coordinate system. The transform represents an object’s position, rotation, and scale within the digital environment. These objects can have mesh geometry created with a 3D modeling program, along with assigned textures and materials, and they can trigger events in the world, play sounds, or interact with users.
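A minimal sketch of setting an object’s transform, again assuming a Babylon.js scene like the one above (the object, values, and material here are purely illustrative):

```typescript
import { Scene, MeshBuilder, StandardMaterial, Color3, Vector3 } from "@babylonjs/core";

// Places an illustrative object in an existing Babylon.js scene.
// In a real world, the mesh would usually be loaded from a 3D model.
function placeChair(scene: Scene): void {
  const chair = MeshBuilder.CreateBox("chair", { width: 0.5, height: 1, depth: 0.5 }, scene);

  // The transform: position, rotation, and scale in the world's 3D coordinate system.
  chair.position = new Vector3(2, 0.5, -1);        // where the object sits
  chair.rotation = new Vector3(0, Math.PI / 4, 0); // rotated 45° around the vertical axis
  chair.scaling = new Vector3(1, 1, 1);            // default scale

  // A material controls how the surface is shaded when rendered.
  const wood = new StandardMaterial("wood", scene);
  wood.diffuseColor = new Color3(0.55, 0.35, 0.2);
  chair.material = wood;
}
```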

Once a virtual world is created, the application renders content to the screen through a virtual camera. Similar to a camera in the real world, the camera inside a game engine has a viewport and various settings that change the way a frame is captured. For immersive experiences, the virtual camera draws many updates every second (around 120 frames per second for high-end VR headsets) to reflect the way you are moving within the space. Virtual reality experiences specifically require that the virtual camera draw twice per frame: once for each of the user’s eyes, slightly offset by their interpupillary distance (the distance between the centers of the pupils).
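In Babylon.js, for example, a conventional camera exposes viewport-style settings such as field of view and clipping planes, and entering a WebXR session hands the camera over to the headset so the engine produces the two per-eye views automatically. A hedged sketch, assuming the scene and canvas from the earlier examples:

```typescript
import { Scene, FreeCamera, Vector3 } from "@babylonjs/core";

// Assumes an existing Babylon.js scene and canvas, as in the earlier sketches.
async function setUpCameras(scene: Scene, canvas: HTMLCanvasElement): Promise<void> {
  // A standard virtual camera: its settings shape how each 2D frame is captured.
  const camera = new FreeCamera("mainCamera", new Vector3(0, 1.6, -3), scene);
  camera.fov = 0.9;   // field of view, in radians
  camera.minZ = 0.1;  // near clipping plane
  camera.attachControl(canvas, true);

  // Entering WebXR lets the headset drive the camera; the engine then renders
  // two slightly offset views per frame, one for each eye.
  await scene.createDefaultXRExperienceAsync();
}
```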

Some other key characteristics that make up the metaverse include participants taking on digital bodies (known as avatars), user-generated content that is created and shared by the users of the various platforms, voice and text chat in real time, and the ability to navigate through differently themed worlds and buildings.

The Different Approaches To Building The Metaverse

Prior to choosing a development environment for building the metaverse, one should consider which tenets are most critical for the kinds of experiences users will have in that virtual world. Many of the libraries and frameworks for authoring immersive content provide a wide range of core graphics capabilities, so developers can focus on the content and interactivity. The first choice you face is whether to target a native experience or the browser; each carries different considerations for how a virtual world unfolds. A proprietary metaverse necessarily offers limited connections across virtual worlds, whereas open source and browser-based platforms, built on existing web standards developed through the Khronos Group and the W3C, help ensure interoperability and content portability.

Web applications like Mozilla Hubs and Element’s Third Room build on current web protocols to create open-source solutions for browser-based virtual world applications. These tools connect 3D spaces embedded in web pages, using open-source technologies such as three.js, Babylon.js, and A-Frame for content authoring, along with open-source real-time communication protocols for voice chat and synchronized avatar movement.
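The exact networking layers differ from platform to platform, but the underlying idea of synchronized avatar movement can be sketched with a plain WebSocket. The endpoint and message shape below are hypothetical, not the protocol used by Hubs or Third Room:

```typescript
// Illustrative message shape for sharing an avatar's transform.
interface AvatarUpdate {
  userId: string;
  position: [number, number, number];
  rotationY: number;
}

// Hypothetical endpoint for a shared world.
const socket = new WebSocket("wss://example.com/world");

// Broadcast the local avatar's pose a few times per second.
function sendLocalPose(update: AvatarUpdate): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(update));
  }
}

// Apply remote updates so other visitors' avatars move in real time.
socket.onmessage = (event: MessageEvent) => {
  const update: AvatarUpdate = JSON.parse(event.data);
  // In a real client, look up the avatar mesh for update.userId and
  // set its position and rotation from the received values.
  console.log(`avatar ${update.userId} moved to`, update.position);
};
```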

Open-source game engines like Open 3D Engine (O3DE) and Godot Engine offer native development capabilities and full feature sets. Because their source is open, developers have the flexibility to extend or change core systems, allowing for more control over the end experience.

Open Access

As with any emerging technology, it is critical to consider the use cases and the impact on the people who use it. Immersive VR and AR devices have unprecedented capabilities to capture, store, process, and utilize data about individuals, including their cognitive state, attention, and physical movement patterns. Additionally, virtual worlds amplify both the benefits and the problems we see with social media today and will require careful implementation of moderation techniques, appropriate access permissions, and trust and safety systems so that users have a positive experience when venturing into these spaces.

As the web evolves to encompass immersive content and spatial computing devices, it is important to think carefully about the type of experiences being created and interoperability across different applications. Ensuring that these virtual worlds are open, safe, and accessible is paramount. The metaverse is an exciting prospect, and it can only be realized through collaborative open-source software movements.