Immersion and presence


Now that we've set up a bit of context about what VR is and a few of the many things we can do with it, let's start getting our hands dirty and learn the following:

  • What makes VR work
  • What can break it
  • What we need to do as developers to make sure the VR experiences we build succeed

To that end, let's lay out a few best practices in VR, and then we'll talk about them in depth.

We'll begin by talking about the experience we're trying to create.

Immersion

When VR works, it works through the process we described earlier as immersion: a perceived experience of being physically present in a virtual world. For an experience to be immersive, a few things need to be true.

Using all the senses

First, it has to encompass a wide enough range of the user's senses that competing senses from outside the VR experience don't pull the user back out of the virtual space. In practice, this is why VR headsets are designed to block out all other light, and why they usually include headphones or on-board audio. Anything we see or hear that isn't part of the VR experience risks breaking immersion.

While vision and sound are pretty easily communicated through the eyes and ears, physical sensations are more difficult to produce. In VR, we refer to physical sensations as haptics. Decades of research have gone into figuring out how to recreate physical sensations, but in practice, it's a tough problem to solve. In the current generation of VR hardware, haptics take the form of a rumble pack in the player's controller that vibrates the controller on cue. While it's limited to the hand holding the controller, even this basic haptic feedback is surprisingly effective at creating a sense of physicality in virtual space. A little vibration when the user's virtual hand contacts an object can go a long way toward making the object feel as though it's physically there, letting users sense its boundaries and know when they've made contact with it.

Note

Remember to use all the senses to create an immersive experience, not just the visual. Use sound to involve the ears in the experience and haptic feedback on the controllers to create physical cues.
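
To make this concrete, here's a minimal C++ sketch of a haptic cue on contact, using UE4's APlayerController::PlayHapticEffect(). The AVRHandActor class, its overlap handler, and the GrabHapticEffect asset are assumptions for this example, not code from this book's projects:

    // Minimal sketch: pulse the controller when the user's hand overlaps an object.
    // AVRHandActor and GrabHapticEffect (a UHapticFeedbackEffect_Base* property
    // assigned in the editor) are hypothetical names for this example.
    #include "GameFramework/PlayerController.h"
    #include "Haptics/HapticFeedbackEffect_Base.h"

    void AVRHandActor::OnHandBeginOverlap(UPrimitiveComponent* OverlappedComp,
        AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
        bool bFromSweep, const FHitResult& SweepResult)
    {
        APlayerController* PC = GetWorld()->GetFirstPlayerController();
        if (PC && GrabHapticEffect)
        {
            // Play a short, subtle pulse on the hand that made contact.
            PC->PlayHapticEffect(GrabHapticEffect, EControllerHand::Right, 0.4f);
        }
    }

Keep the pulses short and subtle; the goal is a cue that the hand has touched something, not a constant buzz.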

Make sure sensory inputs match one another and match the user's expectations

Senses need to match the user's expectations, and they need to match one another for an immersive experience to feel real. When the user turns their head, an object they're looking at should move in their view as it would if it were in the physical world around them. This part is pretty well handled for you by Unreal Engine and your VR hardware, but the next bit, sound, is often overlooked by developers.

Objects that produce sound should use spatialized audio to ensure that sounds seem to come from where the objects appear to be. As we mentioned a moment ago, physical objects should produce a tactile response using haptic feedback when the user appears to touch them.

Note

The behavior of visual objects is pretty much taken care of for you by the HMD and Unreal Engine, but make sure you use spatialized audio to localize sounds to their apparent sources, and experiment with haptic feedback to make physical actions feel more real.
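
As an illustration, here's a minimal sketch of playing a spatialized sound at an object's location with UGameplayStatics::SpawnSoundAtLocation(). The AInteractiveProp class and the ImpactSound and ImpactAttenuation assets (a USoundAttenuation with spatialization enabled) are assumptions for this example:

    // Minimal sketch: play a sound that appears to come from the object itself.
    // ImpactSound and ImpactAttenuation are hypothetical assets assigned in the
    // editor; the attenuation asset should have spatialization enabled so the
    // sound is localized to its apparent source.
    #include "Kismet/GameplayStatics.h"
    #include "Sound/SoundBase.h"
    #include "Sound/SoundAttenuation.h"

    void AInteractiveProp::PlayImpactSound()
    {
        if (ImpactSound)
        {
            UGameplayStatics::SpawnSoundAtLocation(
                this,                 // world context
                ImpactSound,          // USoundBase* to play
                GetActorLocation(),   // the sound originates at the prop's position
                FRotator::ZeroRotator,
                1.0f,                 // volume multiplier
                1.0f,                 // pitch multiplier
                0.0f,                 // start time
                ImpactAttenuation);   // spatialization/attenuation settings
        }
    }

Without the attenuation settings, the sound would play flat in both ears and immediately contradict the object's apparent position.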

Keep latency as low as possible

The quality of the visual and audio experience matters greatly to immersion, and the single most important factor driving this quality is the smoothness and responsiveness of the experience. What this means for developers is that frame rate matters above every other consideration in VR. VR developers use the term latency to describe the responsiveness of a VR application: the time between the user performing an action, such as turning their head, and seeing the visual result (in this case, the world appearing to rotate around them). Developers call this motion-to-photon time, and it's important. If the user turns their head and the world lags behind, it won't feel real, and worse, it can make them sick. Current VR headsets do quite a lot in hardware and software to minimize and disguise latency, but as a developer, you have to do a lot as well to keep latency as low as you can possibly get it.

Note

Latency refers to the speed at which a VR application responds visually to the user's actions and is fundamental to immersion in VR. Research suggests that the absolute highest latency you can get away with is 20 milliseconds, but you should be shooting for far less.

In practice, this means that when you have to choose between detail in your scene and frame rate (and as a developer, you'll have to make this choice all the time), choose speed. Users will forgive a lower-resolution texture much more easily than they'll forgive a dropped frame. Much of your work in VR development will focus on getting your scene running at an acceptable frame rate, and we'll talk quite a bit about how to do this in Unreal. For now, remember this: keeping latency low is absolutely fundamental to immersion in VR, and every design and development choice you make for a VR application has to account for it.

Note

When faced with a choice between image quality and framerate, choose framerate every time. Beautiful textures, high-poly models, and dynamic shadows won't create a convincing experience for the user if they're dropping frames. If the experience runs smoothly, users will fill in a remarkable amount of detail in their own minds; if latency gets too high, they won't believe the world at all, or worse, they'll get sick.
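
To make the budget concrete: at the 90 Hz refresh rate common to desktop VR headsets of this generation, you have roughly 11.1 ms (1000 / 90) to produce each frame, comfortably under the 20 ms motion-to-photon ceiling. As a rough illustration (the 90 Hz target and the AMyVRPawn class are assumptions here), you could log frames that blow the budget during development:

    // Minimal sketch: flag dropped frames during development.
    // The 90 Hz target is an assumption; substitute your headset's refresh rate.
    void AMyVRPawn::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        const float FrameBudgetMs = 1000.0f / 90.0f; // ~11.1 ms per frame at 90 Hz
        const float FrameTimeMs = DeltaSeconds * 1000.0f;
        if (FrameTimeMs > FrameBudgetMs)
        {
            UE_LOG(LogTemp, Warning, TEXT("Frame took %.2f ms (budget %.2f ms)"),
                FrameTimeMs, FrameBudgetMs);
        }
    }

In practice, you'll lean on Unreal's built-in profiling tools, such as the stat fps and stat unit console commands; this sketch just illustrates the arithmetic behind the budget.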

Make sure interactions with the world make sense

Interactions with objects should be consistent and they should make sense. With the immersive nature of VR comes an increased expectation that objects will behave as they would in real life. In traditional media, delivered on a flat screen and constrained to a frame, the user's eyes and brain are consistently reminded that they're looking at a flat image that isn't real, and they'll forgive a lot. But in VR, the world already surrounds them and seems to be real, and they'll expect it to behave as though it's real too. 

Things that don't behave or respond in ways they would in the real world can pull the user out of the experience and break immersion. There's a limit in practice: of course, you can't make every object in the world interactive, but to the degree possible, you should pay attention to what your users' expectations are going to be, and try to meet them. If you put an object in the scene that looks as if it can be picked up, expect users to try to pick it up, and understand that you'll be working against immersion if it doesn't behave as they thought it would. Try to make objects in the scene that look interactive be interactive, and if they can't be, consider moving them out of the play area or changing their appearance to manage the user's expectations.

This is another area where judgment comes into play: not everything can be interactive, and you may not always want it to be, depending on the kind of experience you're trying to create. You should be making conscious choices with immersion in mind when deciding how objects in your world should behave, and those choices should feel consistent with one another within the space of the world, not arbitrary.

Note

Users will try to reach out and touch objects that look as if they can be touched and will try to move them. Try to satisfy their expectations where you can, or design your scene in such a way that these interactions aren't expected.

Explore the unique opportunities for interaction that VR, especially 6DoF VR with hand controllers, gives you. In previous media, users mostly interacted using a mouse, buttons, and joysticks, but in VR, the user's hands interact with the world much more directly, and this makes an entirely new range of interactions possible. Where in a traditional game, a watering can might be used by pushing a button, in VR, the user can squeeze the controller grips to pick it up and turn their hand over to use it. Think about what makes sense in your world, and what becomes possible when the user's hands enter the picture, and design to make use of these opportunities. Interfaces don't have to consist of just buttons anymore.
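
As a rough sketch of the grip interaction just described, here's how picking an object up with a motion controller might look in C++. The AVRPawn class, the GrabRight input action, the RightController component, and the FindNearestGrabbable() helper are all assumptions for this example:

    // Minimal sketch: attach a nearby actor to the hand while the grip is squeezed.
    // "GrabRight" is a hypothetical input action bound in Project Settings;
    // FindNearestGrabbable() is a hypothetical helper (for example, an overlap
    // query around the hand); RightController is a UMotionControllerComponent*.
    #include "Components/InputComponent.h"
    #include "MotionControllerComponent.h"

    void AVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindAction("GrabRight", IE_Pressed, this, &AVRPawn::GrabRight);
        PlayerInputComponent->BindAction("GrabRight", IE_Released, this, &AVRPawn::ReleaseRight);
    }

    void AVRPawn::GrabRight()
    {
        HeldActor = FindNearestGrabbable(RightController);
        if (HeldActor)
        {
            // Keep the object where it is, but make it follow the hand from now on.
            HeldActor->AttachToComponent(RightController,
                FAttachmentTransformRules::KeepWorldTransform);
        }
    }

    void AVRPawn::ReleaseRight()
    {
        if (HeldActor)
        {
            HeldActor->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);
            HeldActor = nullptr;
        }
    }

Notice that the grip press maps naturally onto the physical gesture of closing the hand; that directness is exactly the opportunity 6DoF controllers give you.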

The user's expectations for interaction will vary depending on the type of experience you're creating. If you're making a game that simulates the experience of being in another world, immersion matters a lot. If, on the other hand, you're making a movie viewer, the user probably doesn't really care whether a virtual coffee cup on a table nearby can be picked up, because that's not what they're there for. It's up to you to understand what's going to matter to your users and what isn't and to meet those expectations.

The way you represent the user's hands will drive expectations of how they'll behave as well. If they're modeled as hands, the user may naturally expect that they can pick objects up and move them around. If instead of hands, you display models of the controllers, palettes, weapons, or other tools, you're suggesting a different type of interaction. Users will try to do what it looks like they can do.

Build a consistent world

As we mentioned previously in the discussion about interaction, the whole experience should fit together and make sense. Users should be able to construct a model of reality, even if it's an abstract one or a complete fantasy, from what you give them in the world. The place you're building should feel like a place, with its own language and rules.

The amount of detail you put into your world can have an impact here. The more immersive an experience becomes, the more fragile that immersion becomes. Adding details and immersive elements raises the expectation that everything else in the world will live up to that standard too, and if something doesn't behave consistently with the apparent rules of the world, it can pull the user back out of immersion. In many cases, you may want to render your world in a more stylized way to manage the user's expectations. Immersion doesn't require the VR experience to mirror the real world perfectly; it requires the experience to be consistent with itself.

Be careful of contradicting the user's body awareness

Be careful of adding immersive elements that contradict the player's awareness of their body. We all have a natural awareness of where our body is and what it's doing. This is called proprioception, and it's the sense that tells you where your arms and legs are even when you're not looking at them. Representing the user's body in ways that don't match this sense can break immersion.

Rendering the user's hands usually works well, as the motion controllers tell us exactly where they are, but it may not be such a good idea to render the rest of the arm, since we have no information about what that arm is really doing. If you guess and get it wrong, it will feel wrong to the user and can break immersion. It's often better not to guess at all, and simply render the hands up to the wrists, leaving the arms, legs, and body imagined. Interestingly, users seem to prefer this. They tend not to notice that the body is invisible until it's pointed out to them, whereas a body that's rendered but wrong calls attention to itself.

For similar reasons, realistic, fleshy hands can make users feel uncomfortable if they don't match their real-world hands. Hands work much better if they're stylized as translucent, cartoonish, or robotic, so users don't feel as if they're trying to simulate reality and getting it wrong.

Note

Animators commonly refer to a phenomenon called the uncanny valley, which occurs when a simulation gets just close enough to resembling a human that it triggers the viewer's instinctive awareness of everything that's wrong with it. For a simulation to work, it either needs to be stylized enough that viewers don't expect realism, or it needs to get the realism right. Anything in between is creepy. The same principle holds for representations of the user's own body in VR. Don't get it almost right. Get it perfect, or stylize it. 

Decide how immersive you intend your application to be and design accordingly

Finally, not every use of VR needs to be equally immersive. Your choices here really hinge on what your application is intended to do. If it's a tool for visualizing engineering models, you may be most interested in VR's ability to allow the user to manipulate models easily, and it may not matter so much to you whether they really believe they're in another place. If, on the other hand, you're creating an immersive game or cinematic experience, these choices will be critical. It's up to you to figure out which of these rules matters most for your particular application.

 

Presence

Immersion in VR serves a single goal: the creation of an experience of presence in the user. Presence, as we previously defined it, is a sensation of being in a place, and this is, in large part, a phenomenon perceived physically. Users very often respond physically and instinctively to things in the world, such as heights or objects flying toward them. The body largely believes what it perceives in VR and responds accordingly. If you think of presence primarily in physiological terms, you'll have an easier time understanding how users experience it. What does this experience make your user feel?

A key to understanding presence is realizing that VR works not so much by simulating an environment accurately as by triggering and fooling the range of systems we use to perceive the world. This is one of the reasons why we can get away with low detail in our textures if we get the movement of the world right and keep latency low: our perceptual systems are much more aware of motion than detail. VR doesn't have to fool all the senses, just the right ones in the right ways.

Simulator sickness

A major factor you'll be dealing with quite a lot in VR is simulator sickness, a form of visually induced motion sickness that often occurs in VR experiences.

As humans, we spend most of our time walking upright, which takes a tremendous amount of coordination to achieve, and yet we manage to do it without thinking about it. We manage this through a structure in our inner ear called the vestibular system, which we use to coordinate movement and keep our balance. This system is extremely sensitive, and it works in conjunction with our vision and our sense of our body (proprioception) to understand how we're moving.

Note

You'll hear VR developers talk a lot about the vestibular system or the inner ear. Since the vestibular system is located in the inner ear, for our purposes we use the two terms interchangeably. This is one of three systems that tell us whether we're moving and help us keep our balance. The other two are our visual system and our proprioception (our natural sense of our body's position). Problems arise when signals from these three systems don't agree with one another.

This creates a problem when visual information tells the body that it's moving, but it can't feel that movement in the inner ear. (Researchers call this the sensory conflict theory.) Seasickness and carsickness happen for the same reason. When visual movement cues and movement cues coming from the vestibular system in the inner ear don't match, the body can respond by triggering nausea, sweating, and other effects. (Researchers don't yet agree on why this is, but one theory suggests that when the senses don't match, the body may assume that it's been poisoned.)

The challenge with VR is that it does such a good job of simulating movement. The user's mind naturally accepts that the movement they see is really occurring, and runs into problems when the signals from the inner ear don't confirm this. Developers need to be conscious of this challenge and deal with it. We'll talk about ways to do this in a moment. (Be aware that the opposite applies too: if the user turns their head, always show the corresponding movement in the headset.)

Note

Simulator sickness, sometimes shortened to simsickness, is a form of motion sickness that can occur in VR. (You'll also sometimes see it shortened to VIMS, for visually-induced motion sickness.) The most common cause of simulator sickness is poorly designed locomotion. The second most common cause is high latency. How users move through the world, and how smoothly and consistently the world responds to their movements, are critical factors in combating simulator sickness.
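
One widely used locomotion mitigation is teleportation with a brief camera fade instead of smooth artificial motion: the fade hides the instantaneous jump, so the eyes never report sustained movement that the inner ear can contradict. Here's a minimal sketch of the idea (not this book's implementation, which we'll build properly later); the AVRPawn class and its TeleportWithFade method are assumptions for the example:

    // Minimal sketch: fade to black, jump, then fade back in.
    // Fading during the move avoids showing sustained artificial motion,
    // a common trigger for simulator sickness.
    void AVRPawn::TeleportWithFade(const FVector& Destination)
    {
        APlayerController* PC = Cast<APlayerController>(GetController());
        if (!PC || !PC->PlayerCameraManager)
        {
            SetActorLocation(Destination);
            return;
        }

        // Quick fade to black; hold on black until we fade back in.
        const float FadeTime = 0.1f;
        PC->PlayerCameraManager->StartCameraFade(0.0f, 1.0f, FadeTime,
            FLinearColor::Black, false, true);

        // Once the screen is black, perform the jump and fade back in.
        // (Capturing raw pointers in the lambda is fine for a short sketch;
        // production code would guard against the pawn being destroyed.)
        FTimerHandle FadeTimer;
        GetWorldTimerManager().SetTimer(FadeTimer, [this, PC, Destination]()
        {
            SetActorLocation(Destination);
            PC->PlayerCameraManager->StartCameraFade(1.0f, 0.0f, 0.1f,
                FLinearColor::Black, false, false);
        }, FadeTime, false);
    }

Because the user never sees continuous motion they aren't physically making, the vestibular system is never presented with a conflict.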

Safety

Another major consideration is safety. Because VR completely overwhelms the user's senses, it's possible to put users into unsafe situations, and it is up to you as a developer to try to avoid this. If you tilt the horizon, for example, there's a high likelihood that your users are going to lose their balance. If you've designed an experience that involves big physical movements, such as swinging a sword or a baseball bat, be aware that your users can't see what's around them and can easily hit objects in the real world. Be conscious as well of factors that can cause eyestrain, such as forcing users to focus on UI elements that are too close to the camera, and photosensitive seizures that can be induced by flashing lights.

With these factors in mind, let's get specific about laying out a few best practices that can help to keep your users comfortable and safe.

 

 
