Combating VR Sickness with User Experience Design

Updated on 08-May-2017

Introduction

Virtual reality (VR) gaming has become the new frontier of game development, exciting every creative developer. It is a space where the possibilities for unique gaming mechanics seem endless. However, as more game developers transition from traditional game development to VR, one issue is limiting that creative freedom. Nausea, dubbed “VR sickness” (VRS) in the community, is widely considered the biggest hurdle for VR game developers to date, and for good reason: it is one of the largest barriers to entry for new adopters of the technology.

As indie developers, we at Well Told Entertainment* have identified user experience design as the number one priority in creating an enjoyable experience in VR. Whether the task is holding a steady frame rate or easing a user into a world, our mission is to make sure that everyone—from the VR novice to the seasoned veteran—has a great time. However, there are no experts in this space. Having spent a year as a VR game design company, we have come to know a little more about what helps users of all kinds build literacy in VR, and we'd like to share some code, shader techniques, and design principles with the community as a whole to get more people strapped in and jamming on well-made VR experiences.

Well Told Entertainment* and Learning VR

Our first steps into VR started with curiosity about movement schemes other than teleportation. Outside of strict room-scale experiences, VR users were, and still are, looking for unique and clever ways to escape room scale without falling over or feeling like they’ve been at sea. In early 2016 we were excited for the new wave of incoming VR games, but like most early adopters of VR we were skeptical after hearing the reports of VRS. After reading on the forums that people were tackling this issue through teleportation, we were eager to try out games like The Lab*, Rec Room*, and Budget Cuts*. Though the games themselves are fantastic VR experiences, and teleportation is effective against VRS, we still felt that something was off. What it boiled down to was that the act of teleportation in VR is immersion breaking, especially when unrelated to the story. Knowing that true immersion is one of VR’s greatest qualities, we were inspired to find out what a dynamic solution to movement would take.

Like any curious individuals, we started our search by perusing the Reddit* forums. (r/vive, r/virtualreality, and r/oculus have plenty of inspired devs sharing their progress on indie development. In our opinion these are the best places to survey and look for creative solutions in the medium.) With movement schemes being the hottest topic in summer 2016, we noticed that a number of devs were sharing their opinions and demos online. I read about techniques such as climbing schemes that involved using your controllers to throw your character, arm pumping to propel your character forward, and even putting one of the controllers in your belt loop to act as a pedometer that moves your character forward as you bounce. From our own research we were able to draw conclusions on unique ways to approach the problem and begin forming our own hypotheses. After a month of random testing by one of our trusted developers, Vincent Wing, we began to make progress. Based on our research, below are some of the best practices we found to combat VRS without compromising dynamic locomotion and interactive gameplay.

Considering User Experience in Virtual Reality

In late June my cofounder, Sam Warner, and I attended a VR event in Playa Vista hosted by the Interaction Design Association. Though the event was dedicated to VR, we seemed to be the only people in attendance who weren’t UX designers by trade, which made networking a little awkward. However, we were happy to find value in the presentation given by Andrew Cochrane, the guest speaker and a director working in new media. Cochrane focused on sharing his insights on VR content creation, insights we used that night to solidify our design approach for VR.

Most developers approaching VR come from either a games or film background, mediums that largely revolve around storytelling.

Cochrane believes that there are no storytellers in VR. Instead, VR is an experiential medium, and everyone in this space is an experience designer. In games or film, the creator can manipulate the story for the user: they follow story-act structures and use editing and effects to direct the user’s attention and elicit an emotional response. In VR, the way to elicit that response is to give the audience a feeling of complete immersion, so it’s important that the user has a strong sense of presence within the virtual space. The more immersive a developer can make the experience feel, the better, which places far more responsibility on the developer than in traditional mediums. In our experience with locomotion, the best practice we recommend for reducing immersion breaks and VR sickness is to always provide the player with a strong visual reference.

Spatial Orientation

To create a great feeling of presence in VR, the first step is to introduce the player to their surroundings. Though locomotion tends to get much of the blame, VRS can kick in as soon as the headset is put on, because in effect the player is putting on a blindfold. Covering one’s eyes has a big impact on balance. Try this: stand up and close your eyes. Though it seems like a simple task, most people find themselves starting to waver after only a short time; some may even start to fall over. The reason is that balance relies heavily on visual reference and depth perception. When the visual sense disappears, our bodies resort to proprioception for balance, which varies in ability from person to person. When someone enters a virtual world, it’s important that they have a good sense of spatial awareness before things can get magical.

In developing Vapor Riders ‘99*, a type of skiing/gliding racing game (as seen below in the link to the video), we learned that giving the user a good reference to the floor beneath them had a huge impact on combating VRS. When a player first enters the game, we made sure to give them a good view of the track beneath their feet, the distance ahead of them, and their surroundings. However, given the racing nature of the game, we had to take ground reference in Vapor Riders ‘99 a bit further. When we first tested the game in public at Virtual Reality Los Angeles 2016, some players, particularly taller ones, expressed discomfort related to their height relative to the track. To solve this issue we added a simple height calibration system before play, which worked well. Below is the script we used with Unity* software that you can drop into your own games.

The two functions we use to calculate and set player height and wingspan are UpdateTPoseInfo() and ScaleAvatar(). The first measures the distance from the headset to the floor and the distance between the two controllers. Before these distances are calculated, the player has to assume a T-pose; otherwise it’s difficult to get accurate information about the player’s size.

ScaleAvatar() sizes the actual player prefab based on the information collected in UpdateTPoseInfo(). We divide the measured player height and wingspan by the avatar’s default height and width, then apply the resulting factors to the prefab’s local scale in x and y. There is no need to scale the player along the z-axis in our game.

This system allows us to determine the difference between the player’s ‘default’ position and their current body position. Vapor Riders relies on head and hand position to determine velocity and movement direction, and this is a necessary step to achieve a consistent game-feel across people with different body sizes and types.

void UpdateTPoseInfo()
{
    // Cache the wrist transforms on the first run.
    if (!leftWrist)
        leftWrist = manager.leftController.transform.Find("LControllerWrist");
    if (!rightWrist)
        rightWrist = manager.rightController.transform.Find("RControllerWrist");

    // Player height: headset position relative to the playspace floor.
    Manager.Instance.playerHeight = eyes.position.y - playspaceFloor.position.y;
    // Wingspan: distance between the two controllers while in a T-pose.
    Manager.Instance.playerWingspan = Vector3.Distance(leftWrist.position, rightWrist.position);
}

void ScaleAvatar()
{
    // Measure the avatar's default height and wingspan for comparison.
    float avatarHeight = eyes.position.y - avatarFoot.position.y;
    float avatarWidth = Vector3.Distance(avatarLeftWrist.position, avatarRightWrist.position);

    // Ratio of the measured player size to the avatar's size.
    Vector2 scaleFactor = new Vector2(Manager.Instance.playerHeight / avatarHeight, Manager.Instance.playerWingspan / avatarWidth);

    // Scale the prefab in x and y only; z keeps its initial scale.
    avatar.localScale = new Vector3(initAvatarScale.x * scaleFactor.x, initAvatarScale.y * scaleFactor.y, initAvatarScale.z);
}
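
For context, here is a minimal sketch of how these two functions might be wired up: wait for the player to hold a T-pose, then calibrate on a button press. The placeholder input check below is purely illustrative; hook it to whatever input system your project uses.

// Hypothetical usage sketch (not the shipped code): calibrate once the player
// confirms they are holding a T-pose. Replace the placeholder key check with
// your own controller input.
void Update()
{
    if (Input.GetKeyDown(KeyCode.Space)) // placeholder input for this sketch
    {
        UpdateTPoseInfo();   // measure real-world height and wingspan
        ScaleAvatar();       // resize the player prefab to match
    }
}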

We don’t fully understand why height in our game made such an impact on player nausea. Ultimately, we believe it came down to a few factors related to vehicle-less locomotion and the lack of a heads-up display (HUD).

In October we began development on a short horror escape room, which became Escape Bloody Mary*. In the game you play a child trying to escape a demon while locked in a bathroom. One of our goals for presence was to make the player feel like a child without running into the same height issues we faced in Vapor Riders ‘99. Rather than shortening the player to make them feel small in the bathroom, we kept their height-to-floor ratio and instead scaled all the assets in the room based on the headset’s distance to the floor. This way, the player always has to look over set pieces, which makes them feel smaller without compromising their reference point to the ground.
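
Here is a minimal sketch of that idea: instead of scaling the player rig, scale the environment root by the ratio of the measured headset height to a target "child" height. The class, field names, and target value below are our own illustration, not the script shipped in Escape Bloody Mary.

using UnityEngine;

// Illustrative sketch of scaling the room instead of the player.
// All names and values here are assumptions for the example.
public class RoomScaler : MonoBehaviour
{
    public Transform environmentRoot;       // parent of all bathroom set pieces
    public Transform hmd;                   // headset transform
    public Transform playspaceFloor;        // floor of the tracked play area
    public float targetChildHeight = 1.2f;  // how tall (in meters) the player should "feel"

    public void ScaleRoomToPlayer()
    {
        float playerHeight = hmd.position.y - playspaceFloor.position.y;

        // A taller player gets a proportionally larger room, so set pieces
        // always sit above them while their floor reference stays intact.
        float factor = playerHeight / targetChildHeight;
        environmentRoot.localScale = Vector3.one * factor;
    }
}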

Level Design

The space a player moves around in is as important as the locomotion scheme itself, especially in VR. When testing Vapor Riders ‘99 we learned how players interact with different parts of our test track. When designing the track we wanted to see what worked best without making it boring. When the player starts, we put them through a series of left and right turns along the x-axis, starting with gradual curves and then transitioning to sharper turns. The track then transitions to mostly up and down movements along the z-axis. User feedback indicated that players were more comfortable with the end of the track than with the beginning. After some troubleshooting we began to figure out why. It wasn’t so much that users preferred up and down movements over left and right (though flying is tons of fun). The cause of the discomfort was our failure to ease users into disruptive elements of the level. To avoid nausea during accelerated movement, the player must always have a good reference point for where they are headed and how quickly they will get there. At the beginning of our test track, we pushed players into increasingly sharp turns too quickly, which turned their attention away from the road ahead of them. Without being able to see what was ahead, they were caught off guard and unable to adapt their sense of balance to the oncoming track—something to be aware of with fast gameplay.

Effective Common Movement Schemes

Aside from teleportation, there are a few movement schemes that address VR locomotion and nausea. One of our favorites is dashing. Dashing is similar to teleportation, except that instead of cutting from one spot to another, the player quickly dashes forward to the designated target. This type of movement is executed well in the game Raw Data*. What’s great about dashing versus teleportation is that the user never loses their reference point, so they don’t feel the need to reorient themselves at their new location, something we really enjoy.
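
As a rough sketch of the technique, a dash can be as simple as moving the whole camera rig toward the aimed target over a very short duration rather than cutting instantly. The component and field names below are illustrative assumptions, not code from Raw Data or our own games.

using System.Collections;
using UnityEngine;

// Illustrative dash-locomotion sketch: glide the rig to the target over a
// short, fixed duration so the player keeps a continuous visual reference.
public class DashMover : MonoBehaviour
{
    public Transform cameraRig;          // root of the VR rig
    public float dashDuration = 0.15f;   // short enough to limit prolonged vection

    public void DashTo(Vector3 target)
    {
        StopAllCoroutines();             // cancel any dash already in progress
        StartCoroutine(Dash(target));
    }

    IEnumerator Dash(Vector3 target)
    {
        Vector3 start = cameraRig.position;
        float t = 0f;
        while (t < 1f)
        {
            t += Time.deltaTime / dashDuration;
            cameraRig.position = Vector3.Lerp(start, target, t);
            yield return null;
        }
        cameraRig.position = target;     // land exactly on the target
    }
}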

For faster movement, such as in Vapor Riders ‘99, vehicles seem to do the trick in some of our favorite games. Our goal for Vapor Riders ‘99 was fast-paced, “vehicle-less” movement; however, adding a vehicle has its benefits for VR game design. First and foremost, it gives the player a visual reference point for balance. Though the world is moving around them, players can always reorient themselves to the vehicle they are in, much like driving a car. Games such as Hover Junkers* pull this off really well, especially since the vehicle in Hover Junkers has a large platform with good visibility.

Optimization

It was established fairly early that a stable 60 fps/120 Hz is the required baseline for non-nauseating VR content. Anything less and the user notices stutter as the rendered frames lag behind their head position. In our quest to complete Escape Bloody Mary, we quickly found that even room-scale games hit a fairly hard cap on certain engine features used throughout the scene.

From the beginning of the project, we knew we wanted to protect a few tech-art-related goals from the violent swings of the feature-creep axe. We wanted a large mirror spanning most of one wall that would reflect all the lights, characters, props, and static objects. We had to be able to use lights whenever and wherever we wanted in order to manage atmosphere and tension. We wanted to give Bloody Mary four pieces of cloth to make her feel submerged and floaty. We also needed the moment when she comes out of the mirror to be as close to one-to-one as possible to preserve immersion, which meant having two copies of her in the scene at all times.

In order to get all of this running on our machines as smoothly as possible we had to set up a few small systems. For light management, we created a rough check-in, check-out system where we could move lights around the scene and change their settings as needed. We found that toggling lights on and off caused larger performance hits because each light was duplicated by our mirror script, so instead we shuffled around half as many lights as would normally be renderable in our bathroom scene.
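
To make the idea concrete, here is a minimal sketch of what a check-in/check-out light pool could look like: a small, fixed set of lights that are repositioned and retuned on demand instead of being enabled and disabled. The class and member names are our assumptions for the example, not the exact system used in Escape Bloody Mary.

using System.Collections.Generic;
using UnityEngine;

// Illustrative check-in/check-out light pool: a fixed budget of lights that
// are moved and retuned as needed rather than toggled on and off.
public class LightPool : MonoBehaviour
{
    public List<Light> pooledLights;                      // pre-placed scene lights
    readonly Queue<Light> available = new Queue<Light>();

    void Awake()
    {
        foreach (Light l in pooledLights)
            available.Enqueue(l);
    }

    // Check a light out of the pool and set it up for the current moment.
    public Light CheckOut(Vector3 position, Color color, float intensity)
    {
        if (available.Count == 0)
            return null;                                  // light budget exhausted

        Light l = available.Dequeue();
        l.transform.position = position;
        l.color = color;
        l.intensity = intensity;
        return l;
    }

    // Return a light to the pool; dim it so it no longer affects the scene.
    public void CheckIn(Light l)
    {
        l.intensity = 0f;
        available.Enqueue(l);
    }
}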

We ended up lowering the resolution of many of the more distant textures as well so that the mirror could more easily handle the entire scene duplication. When we paired this with the light management system we were able to get a stable environment that let us conjure lightning, light four candles, toggle a dynamic flashlight, fire a handgun, and still have a couple of room atmosphere lights to manage the presence of the game.

Between our two Bloody Marys, only four active cloth simulations were ever running. When Mary started to move through the mirror, we began turning off the old Mary's simulations and activating the cloth on the Mary coming out of the mirror. Their animations were synced by finite state machines so that they would always be doing the same movement. These two systems let us teleport and spawn her whenever we wanted with a fairly stable frame rate, no matter where she was.
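
A simplified sketch of that handoff is below: keep both copies of Mary on the same animation state, then disable the cloth components on the copy behind the mirror while enabling them on the copy entering the room. The field names and the single-swap timing are our assumptions for the example; in the game the simulations were switched over gradually as she moved through the mirror.

using UnityEngine;

// Simplified sketch of the cloth handoff between the two Mary copies.
// Names here are illustrative; the real handoff was staged over time.
public class MaryClothHandoff : MonoBehaviour
{
    public Cloth[] mirrorMaryCloth;   // cloth pieces on the Mary behind the mirror
    public Cloth[] roomMaryCloth;     // cloth pieces on the Mary entering the room
    public Animator mirrorMaryAnim;
    public Animator roomMaryAnim;

    // Called when Mary starts to move through the mirror.
    public void HandOffToRoomMary()
    {
        // Keep both copies on the same animation frame so the swap reads as one-to-one.
        AnimatorStateInfo state = mirrorMaryAnim.GetCurrentAnimatorStateInfo(0);
        roomMaryAnim.Play(state.fullPathHash, 0, state.normalizedTime);

        foreach (Cloth c in mirrorMaryCloth) c.enabled = false;  // stop the old sims
        foreach (Cloth c in roomMaryCloth) c.enabled = true;     // start the new ones
    }
}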

Sound Brings Everything Together

The finishing touch to any game is sound, which brings all the elements to life. In film everything on screen that moves needs its own sound effect, no matter how subtle. Games take this a step further in that players have sound effects to accompany their movements, such as footsteps or breathing. But for unknown reasons, this principle is often ignored in VR game development.

Adaptive soundtracks, in which sound effects triggered by user input match the rhythm of the soundtrack, are an effective way to add sound to movement where it is otherwise lacking. Thumper* is a great example. One of our long-term goals is to add an adaptive soundtrack to Vapor Riders ’99, because it will give users an audio reference for how much their controller positions are affecting their direction of movement.
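
One simple way to approach this, sketched below, is to quantize player-triggered sound effects to the soundtrack's beat using Unity's DSP clock. The BPM value, field names, and overall structure are assumptions for illustration, not the system planned for Vapor Riders ’99.

using UnityEngine;

// Illustrative beat-quantized sound effect: schedule the effect on the next
// beat of the soundtrack so player input always lands on the rhythm.
public class QuantizedSfx : MonoBehaviour
{
    public AudioSource sfxSource;     // the effect tied to player movement input
    public double bpm = 120.0;        // assumed soundtrack tempo
    double songStartDspTime;

    void Start()
    {
        // Assume the soundtrack starts playing at this moment.
        songStartDspTime = AudioSettings.dspTime;
    }

    // Call on player input; the sound plays on the next beat instead of immediately.
    public void PlayOnNextBeat()
    {
        double secondsPerBeat = 60.0 / bpm;
        double elapsed = AudioSettings.dspTime - songStartDspTime;
        double nextBeat = System.Math.Ceiling(elapsed / secondsPerBeat) * secondsPerBeat;
        sfxSource.PlayScheduled(songStartDspTime + nextBeat);
    }
}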

Summary Checklist

  • Make decisions based on creating a better immersive experience for the user.
  • Make sure the player starts the experience with a good point of reference to the floor.
  • Ease users into the experience.
  • When designing levels, make sure players know what’s coming if they need to make quick decisions, especially regarding movement.
  • Vehicles can help provide reference points in fast-paced games.
  • Optimize whenever possible to avoid frame lag.
  • Give your movement and interactions sound effects. Consider adaptive soundtracks when applicable.

Now Throw Our Advice Out the Window

Okay, maybe not . . . but we want to encourage game developers to think outside the box and find the locomotion scheme that best fits the style of their game. Vapor Riders ‘99 was born as a byproduct of trial and error. We started testing movement schemes, and when we stumbled on something fun and non-nauseating we created a game around it. That’s the magic of VR: pioneering into the unknown is what excites us to become better developers in this space. More creative solutions and tools will come only by taking risks, and we can’t wait to see what developers come up with in these early years of VR!

For more resources and tools from Intel on game development, please visit the Intel® Game Developer Zone.

Source:https://software.intel.com/en-us/articles/combating-vr-sickness-with-user-experience-design
