Delivering the Entire Metaverse

Mark Billinghurst
8 min read · Mar 26, 2022

Key Takeaways

  • The Metaverse was imagined 30 years ago, and a broad Metaverse taxonomy with four key elements was created 15 years ago
  • Most current Metaverse applications focus on shared VR or AR experiences, ignoring key components of the Metaverse
  • Interesting opportunities exist when you combine two or more Metaverse elements, going far beyond what is currently available
  • Empathic Computing is one example of a research field that draws from all elements of the Metaverse taxonomy to create experiences that could transform shared understanding.

The Metaverse

Since late 2021, the Metaverse has been everywhere, at least in the AR/VR/XR communities. Looking at Google Trends, in October 2021 ten times as many people searched for the term “Metaverse” as for “Virtual Reality”, while six months earlier it was the opposite. Companies are renaming themselves, Metaverse products are proliferating, and almost every tech conference seems to have Metaverse in its name, or at least in the titles of its presentations.

Google Trends search terms, blue is search volume for Metaverse, red for Virtual Reality

However, the concept of the Metaverse goes back decades earlier than the current hype. The term was coined by Neal Stephenson in his brilliant book Snow Crash thirty years ago. Earlier science fiction books, like True Names and Johnny Mnemonic from the 1980s and The Veldt from the 1950s, among others, also explored similar ideas.

In Snow Crash, Stephenson imagined the Metaverse as the convergence of a virtually enhanced physical reality and a physically persistent virtual space. The Metaverse is always on and always accessible. The main character, Hiro Protagonist, is a pizza driver in the real world, but whenever he can, he escapes into the Metaverse, where he is much more than that. Talking about his job, Hiro says:

“It won’t pay the rent, but that’s okay — when you live in a shithole, there’s always the Metaverse, and in the Metaverse, Hiro Protagonist is a warrior prince.”

A key concept is that actions in the real world affect people’s virtual characters in the Metaverse, and similarly actions in the Metaverse affect the real world. A simple current example of this vision is the Metaverse Park developed by Delta Reality using the Niantic Lightship SDK, which seamlessly blends actions in the real world with those in VR, and vice versa.

Delta Reality’s Metaverse Park

In the Metaverse Roadmap from 2007, the concept is defined further; in particular, the authors point out that there are four types of technology needed to enable the Metaverse:

  • Augmentation: Technologies that layer information onto our perception of the physical environment.
  • Simulation: Technologies that model reality.
  • Intimate: Technologies that are focused inwardly, on the identity and actions of the individual or object.
  • External: Technologies that are focused outwardly, towards the world at large.

These terms can be used as the endpoints of two continua, from Intimate to External and from Augmentation to Simulation, which together provide a taxonomy for classifying Metaverse technologies. Into this taxonomy we can place the four key components: Augmented Reality, Virtual Worlds, Lifelogging and Mirror Worlds.

The Metaverse taxonomy from The Metaverse Roadmap

While AR and VR are well known, the terms Lifelogging and Mirror Worlds may be unfamiliar. Lifelogging refers to technologies that allow a user to record or monitor their internal states and so augment their lives. A simple example of this would be a smart watch that provides a daily log of heart rate or physical activity. Mirror Worlds are those technologies that try to capture and create a virtual simulation of a person’s external reality. A simple example of this would be Google Street View, which provides immersive snapshots of the real world at different times in the past.
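
To make the taxonomy concrete, here is a minimal sketch of how a technology could be placed on the two continua and mapped to a quadrant. The example technologies and their positions are illustrative assumptions, not values from the Metaverse Roadmap.

```python
from dataclasses import dataclass

# Illustrative sketch of the Metaverse Roadmap taxonomy: each technology is
# placed on two continua (Intimate <-> External, Augmentation <-> Simulation)
# and falls into one of the four quadrants. The example scores below are
# assumptions for illustration, not measurements.

@dataclass
class Technology:
    name: str
    external: float    # 0.0 = Intimate (inward-focused), 1.0 = External (world-focused)
    simulation: float  # 0.0 = Augmentation (layers on reality), 1.0 = Simulation (models reality)

def quadrant(tech: Technology) -> str:
    """Map a technology's position on the two continua to a taxonomy quadrant."""
    if tech.external >= 0.5:
        return "Mirror Worlds" if tech.simulation >= 0.5 else "Augmented Reality"
    return "Virtual Worlds" if tech.simulation >= 0.5 else "Lifelogging"

examples = [
    Technology("AR head-up navigation", external=0.8, simulation=0.2),
    Technology("Social VR world", external=0.3, simulation=0.9),
    Technology("Smartwatch heart-rate log", external=0.1, simulation=0.1),
    Technology("Street-level 3D city capture", external=0.9, simulation=0.8),
]

for tech in examples:
    print(f"{tech.name}: {quadrant(tech)}")
```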

Delivering the Entire Metaverse

Considering the Metaverse taxonomy and the four technology components, key actions in the Metaverse include Augmenting, Immersing, Sensing and Capturing.

Key technology actions needed for the Metaverse

However, many of the current Metaverse applications are focused only on AR/VR and the Augmenting/Immersing aspects of the Metaverse, ignoring the other elements. For example, Meta’s flagship Horizon Workrooms is designed as a VR space for collaborative work, while its Horizon Worlds application is a large-scale social VR space. Similarly, Spatial.io started by developing a multiplatform AR/VR application for meetings, and has recently pivoted to providing the Metaverse for Events, Culture, Exhibitions and Experiences. Neither of these provides any support for Lifelogging/Sensing or Mirror Worlds/Capturing. There are many other examples of applications with Metaverse in their name that are at best social VR spaces and, at worst, places to buy virtual land and sell NFTs.

So, there is potential to consider the entire Metaverse taxonomy and develop experiences that go far beyond multi-user VR spaces or AR chat environments. At a minimum, applications could combine elements of two or more of the Metaverse quadrants. For example, the upcoming Microsoft Mesh allows people in AR and VR to collaborate together, and the Mesh SDK should make it easy for developers to create their own hybrid AR/VR shared experiences.

Microsoft Mesh enabling people in AR/VR to collaborate together

More exciting is to consider combining elements of three quadrants, for example connecting spaces and developing experiences that mix elements of AR, VR and Mirror Worlds together. You could imagine a collaborative application that allows an AR user in the real world to capture their real environment and live-stream it as a 3D model to a remote collaborator in VR, who also appears as a virtual avatar in the AR user’s real world. In my own lab, we have been exploring this area for a long time and have developed connected space systems that use live-streamed 360 video or 3D geometry to capture the real world and enable AR/VR users to collaborate together [1].

Research that connects spaces and combines elements of AR, VR and Mirror Worlds
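
As a rough sketch of what such a three-quadrant system exchanges, the hypothetical message types below show captured geometry flowing from the AR user’s space to the VR user, and the VR collaborator’s avatar pose flowing back. The type and field names are assumptions for illustration, not the protocol used in the cited systems.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical message types for a connected-space session that mixes AR, VR
# and Mirror Worlds: the AR side streams captured geometry of the real room,
# the VR side streams its avatar pose back. A sketch of the data flow only.

Vec3 = Tuple[float, float, float]

@dataclass
class MeshUpdate:
    """Live-captured 3D geometry sent from the AR user's space to the VR user."""
    timestamp_ms: int
    vertices: List[Vec3]
    triangles: List[Tuple[int, int, int]]

@dataclass
class AvatarPose:
    """The VR collaborator's pose, rendered as a virtual avatar in the AR user's room."""
    timestamp_ms: int
    head_position: Vec3
    head_rotation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)

def on_frame(capture_mesh, send_to_vr, receive_from_vr, render_avatar_in_ar):
    """One iteration of the bidirectional loop (the callbacks are placeholders)."""
    send_to_vr(capture_mesh())          # Mirror Worlds: stream the real space to VR
    pose = receive_from_vr()            # AR: receive the remote collaborator's pose
    if pose is not None:
        render_avatar_in_ar(pose)       # overlay the VR user's avatar on the real world
```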

One example of a connected space experience is our Wearable RemoteFusion system [2], which allows a user in VR to collaborate with an AR user in the real world. In this application, depth-sensing cameras are used to create a real-time 3D mesh model of the AR user’s real environment, which the VR user is immersed in. Whenever the AR user moves an object, the VR user can see the 3D environment update. The VR user’s gaze and gesture cues are transmitted back to the AR user, who can see them overlaid on the real world, enabling the two to collaborate easily. So, this application combines real-world capture with AR and VR views. Of course, it isn’t a persistent experience, so it is not a true Metaverse application, but it gives an example of the type of interface that such an application could have.

The Wearable RemoteFusion system. A user in AR collaborates with a remote user in VR, who is immersed in a real-time 3D view of the AR user’s environment. They can use gaze and gesture cues to work together.
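
The cue-sharing direction of a RemoteFusion-style system might be sketched as follows; the data structure and the raycast and drawing callbacks are hypothetical placeholders, not the published implementation in [2].

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

# Hypothetical sketch of the cue-sharing direction: the VR user's gaze ray and
# hand gesture are sent to the AR side, which draws them over the real scene.

@dataclass
class RemoteCues:
    gaze_origin: Vec3
    gaze_direction: Vec3          # unit vector of where the VR user is looking
    gesture: Optional[str]        # e.g. "point", "thumbs_up", or None

def overlay_cues(cues: RemoteCues, raycast_scene_mesh, draw_marker, draw_hand):
    """Project the remote user's cues onto the AR user's view each frame."""
    hit: Optional[Vec3] = raycast_scene_mesh(cues.gaze_origin, cues.gaze_direction)
    if hit is not None:
        draw_marker(hit)          # show where the VR collaborator is looking
    if cues.gesture is not None:
        draw_hand(cues.gesture)   # show the collaborator's hand gesture in AR
```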

Combining AR, VR and Lifelogging means exploring how physiological sensing could be used to enhance collaborative AR/VR experiences. Physiological cues could be used to capture people’s emotional state, stress level or cognitive load, which could then be shared between AR/VR collaborators to improve understanding and emotional connection. Recording this over long periods of time would help with tracking health, learning, work efficiency and other factors. EEG could be used to record brain activity, and particularly brain synchronisation between people collaborating together in AR/VR [3]. VR headsets with integrated physiological sensors are starting to appear (e.g. the HP Omnicept with gaze and heart-rate sensing, and Project Galea with EEG and many other sensors), and AR displays will soon follow, so it will become easy to explore this application space.

Combining Lifelogging with AR and VR
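
As one hedged example of what sharing a physiological cue could look like, the sketch below maps a heart-rate sample to a coarse arousal level that a collaborator’s avatar could display. The sensor callback and the thresholds are assumptions for illustration, not any particular headset’s SDK.

```python
import time

# Minimal sketch of sharing a physiological cue between AR/VR collaborators:
# a heart-rate reading is mapped to a coarse arousal level that could, for
# example, tint a collaborator's avatar. The read_heart_rate() callback and
# the thresholds are assumptions, not a real headset SDK.

def arousal_level(heart_rate_bpm: float, resting_bpm: float = 65.0) -> str:
    """Classify a heart-rate sample relative to an assumed resting rate."""
    if heart_rate_bpm < resting_bpm * 1.15:
        return "calm"
    if heart_rate_bpm < resting_bpm * 1.4:
        return "engaged"
    return "stressed"

def share_loop(read_heart_rate, send_to_collaborator, interval_s: float = 1.0):
    """Periodically sample the local sensor and share the derived cue."""
    while True:
        bpm = read_heart_rate()  # placeholder sensor callback
        send_to_collaborator({"cue": "arousal", "level": arousal_level(bpm)})
        time.sleep(interval_s)
```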

The ultimate goal should be to develop experiences that draw from all elements of the Metaverse: Augmenting, Immersing, Sensing and Capturing. This is what is being done in the emerging field of Empathic Computing [4], where technology is used to enable people to share what they are seeing, hearing and feeling with others, and so create deeper understanding. Empathic Computing goes beyond current Metaverse applications because it is not only focused on collaborative AR/VR, but also on how technology can be used to capture and share people’s real surroundings with collaborators, and how physiological sensors can be used to recognise and share people’s emotional or cognitive state.

Combining all the elements of the Metaverse

Empathic Computing is in its infancy, but some early projects have shown its potential and the possible benefits it could bring to the Metaverse. For example, in [5] we showed that placing two people in the same virtual body and sharing heart rate from one person to the other could make them feel more connected and affect the emotional state of the receiver. In [6] we measured the brain activity of two people drumming in a shared AR/VR application, changing the visual experience based on the amount of brain synchronisation occurring. People again felt more connected, with one person saying, “It’s quite interesting, I actually felt like my body was exchanged with my partner.” These and other similar applications show the types of experiences that could be provided in future Metaverse applications.

The NeuralDrum interface
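
To illustrate the general idea behind this kind of feedback, the sketch below estimates how similar two players’ signal windows are and maps that to a visual blend factor. The Pearson correlation used here is a simple stand-in for illustration, not the synchronisation metric used in [6].

```python
from math import sqrt
from typing import Sequence

# Sketch of the general idea: estimate how "in sync" two players' signals are
# over a short window and map that to a visual parameter (e.g. how strongly
# the two players' views blend together). Stand-in measure, not the paper's.

def correlation(a: Sequence[float], b: Sequence[float]) -> float:
    """Pearson correlation of two equal-length signal windows."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    if var_a == 0 or var_b == 0:
        return 0.0
    return cov / sqrt(var_a * var_b)

def blend_amount(window_a: Sequence[float], window_b: Sequence[float]) -> float:
    """Map synchrony in [-1, 1] to a visual blend factor in [0, 1]."""
    return (correlation(window_a, window_b) + 1.0) / 2.0
```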

Conclusion

Although the Metaverse concept has been around for at least thirty years, current applications don’t capture its full potential. Researchers and developers should consider moving beyond shared AR and VR experiences and integrate elements from the entire Metaverse taxonomy into their work. This could create a new generation of applications that connect people in new and exciting ways. People should be able to immerse themselves in virtual copies of their real friends’ surroundings, have virtual avatars that reflect their emotions, interact with intelligent AR training agents that respond to their learning over time, and more. In this way the Metaverse could move from its science fiction past to a tool that could dramatically improve our future.

Evolution of Metaverse development: from social VR applications (technology from one quadrant), to hybrid AR/VR applications, to experiences using elements from three quadrants, to full Metaverse experiences.

References

[1] Teo, T., Lawrence, L., Lee, G. A., Billinghurst, M., & Adcock, M. (2019, May). Mixed reality remote collaboration combining 360 video and 3D reconstruction. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–14).

[2] Sasikumar, P., Gao, L., Bai, H., & Billinghurst, M. (2019, October). Wearable remotefusion: A mixed reality remote collaboration system with local eye gaze and remote hand gesture sharing. In 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 393–394). IEEE.

[3] Barde, A., Gumilar, I., Hayati, A. F., Dey, A., Lee, G., & Billinghurst, M. (2020, December). A review of hyperscanning and its use in virtual environments. In Informatics (Vol. 7, No. 4, p. 55). Multidisciplinary Digital Publishing Institute.

[4] Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017, June). Empathic mixed reality: Sharing what you feel and interacting with what you see. In 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR) (pp. 38–41). IEEE.

[5] Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017, May). Effects of sharing physiological states of players in a collaborative virtual reality gameplay. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 4045–4056).

[6] Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving brain synchronicity in XR drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1–4).


Mark Billinghurst

Augmented Reality Expert, Professor of Human Computer Interaction, Interface Researcher, Entrepreneur