
Why the metaverse isn’t ready to be the future of work just yet


Authors

  • Victoria (Vicky) McArthur

    Associate Professor, School of Journalism and Communication, Carleton University

  • Robert J. Teather

    Associate Professor and Director, School of Information Technology, Carleton University

While telework allows employees to save time and money on commuting and can offer a better work/life balance, researchers have illustrated how collaborative work can potentially suffer in remote contexts.

Proponents of the metaverse suggest it could be the future of remote work since virtual worlds and virtual reality (VR) afford us the ability to collaborate in virtual spaces. But some companies argue that remote work eliminates the impromptu encounters between employees that can spark innovation.

The compounding effects of the COVID-19 pandemic and the widespread shift to telework are partly behind recent interest in the metaverse as the future of remote work.

The term “metaverse” was coined by novelist Neal Stephenson more than three decades ago in his 1992 science-fiction novel Snow Crash.

Today, the metaverse is commonly described as “the convergence of physical and virtual space accessed through computers and enabled by immersive technologies such as virtual reality, augmented reality and mixed reality.” These technologies are collectively referred to as extended reality.

User interaction issues

This isn’t the first time companies have explored the use of virtual environments to support remote work. In the early 2000s, many companies established a presence in the virtual world Second Life, where they hosted corporate training and recruitment events. But where Second Life was a platform for desktop computers, the metaverse is primarily meant to be used with VR head-mounted displays.

While metaverse supporters claim VR environments can virtually recreate in-person collaborative experiences or “water cooler moments,” the ways users interact with VR systems can introduce some usability issues.

In order to interact with 3D content in virtual environments, modern VR systems almost universally use 3D input devices such as handheld controllers or wands, tracked in 3D, typically by outward-facing cameras on the headset.

However, these 3D controllers have been shown to lack the finesse of conventional computer input devices, resulting in worse performance on common tasks such as acquiring targets (for example, clicking on icons).

Eye tracking is becoming increasingly prevalent as well, with recent entries such as the Apple Vision Pro relying almost entirely on eye-based interaction. For repeated target acquisition tasks, eye tracking can cause eye fatigue, and its accuracy is limited compared with conventional pointing devices.

Ergonomics and physical strain

Ergonomic issues also persist. While VR head-mounted displays are becoming increasingly affordable and portable compared to devices from the late 1990s, they can still be heavy and uncomfortable to wear for extended periods.

Another common issue is that VR controllers are known to cause arm and shoulder fatigue in users, a phenomenon often dubbed “gorilla arm.”

There are methods to alleviate this fatigue. However, most commercial device manufacturers have not yet adopted such techniques, instead favouring 1:1 scale interaction. In other words, any movement you make in real life is reproduced exactly in VR.

Notably, such approaches date back to the earliest examples of 3D interaction (not necessarily in VR) and have persisted despite extensive academic research aimed at improving the situation since then.

To its credit, Apple has shown an awareness of this issue with the Vision Pro, using a small pinch gesture performed with the arm in a comfortable pose to act as a “click” action. This technique, likely adapted from earlier research, uses a combination of eye gaze to select targets and hand gestures to manipulate them.

Cybersickness

A final usability hurdle is cybersickness – visually induced motion sickness commonly observed with VR use.

For years, cybersickness was incorrectly attributed to technical issues, such as display refresh latency. In reality, the phenomenon is more complex and is driven largely by human factors.

The primary cause of cybersickness is believed to be visual-vestibular mismatch. This occurs when your eyes tell you you’re moving while your inner ear’s sense of motion tells you that you are not. VR systems that move the user through the environment with a controller joystick are therefore far more likely to induce cybersickness than those that rely on natural walking.

Postural stability – our ability to stay upright despite conflicting visual information – also plays a role, and sudden motions in VR can cause users to lose their balance. Both issues can be mitigated through better system design, such as minimizing or eliminating virtual movement, or using movement techniques like teleportation that avoid continuous visual motion.

Since we often want to move around a virtual environment larger than the physical space available, other techniques intended for joystick or steering-based virtual movement have also been shown to be effective in reducing cybersickness. One example is temporarily narrowing the user’s field of view while moving.

The future of work

As for the future of work in the metaverse, while some may be eager to see the shift happen soon, there are still too many issues that make working in VR impractical. These issues, unfortunately, will not be solved simply by releasing new hardware or software.

Until these issues of user interaction, ergonomics and cybersickness are resolved, the metaverse will not be ready to fully replace traditional office environments or provide a completely effective alternative for remote work.

For now, remote work in the metaverse may appeal to early adopters or companies looking to experiment with virtual spaces, but it is unlikely to become a mainstream solution in the immediate future.

The Conversation

Courtesy of The Conversation.