
Unreal Engine 4 Virtual Reality Projects

Chapter 1. Thinking in VR

"All reality is virtual. That's a strong statement, and it's not obvious if you haven't thought about it before, so I'll say it again—the reality we experience is a construct in our minds, based on highly incomplete data. It generally matches the real world well, which isn't surprising, evolutionarily speaking, but it's not a literal reflection of reality—it's just an inference of the most probable state of the world, given what we know at any one time."

– Michael Abrash, Chief Scientist at Oculus

"The most important thing about a technology is how it changes people."

– Jaron Lanier, founder of VPL Research, VR pioneer, and interdisciplinary scientist at Microsoft Research

Welcome to the virtual world. (It's bigger on the inside.)

In this book, we're going to explore the process of creating VR applications, games, and experiences using Unreal Engine 4. We'll spend some time looking at what VR is and what we can do to design effectively for the medium, and then, from there, we'll move on to demonstrate these concepts in depth using the Unreal Engine to craft VR projects that illustrate and explore these techniques and ideas.

Every chapter will revolve around a hands-on project, beginning with basics such as setting up your development environment and creating your first test applications in VR, and moving on from there into increasingly in-depth explorations of what you can do in VR and how you can use Unreal Engine 4 to do it. In each project, we'll walk you through the process of building a project that demonstrates a specific topic in VR and explain the methods used and, in some cases, demonstrate a few alternatives. It's important to us, as you build these projects, that you come away not just knowing how to do the things we describe, but also why we do them this way, so you can use what you've learned as a launchpad to plan and execute your own work.

In this first chapter, we'll look at what VR is and a few of the many ways it's currently used in a wide range of fields. We'll talk about the two most important concepts in VR: immersion and presence, and how understanding what these are and how they work will help you to make better experiences for your users. We'll lay out a collection of best practices for developing immersive and engaging VR experiences, and talk about some of the unique challenges posed by VR development. Finally, we'll pull this knowledge together and dig into a method for planning and executing a VR project's design. 

In brief, this chapter is going to take us through the following topics:

  • What is virtual reality?
  • What can we do in VR?
  • Immersion and presence
  • Best practices for VR
  • Planning your VR project

What is virtual reality?


Let's start at the beginning, and talk about virtual reality itself. VR, at its most basic level, is a medium that immerses users into a simulated world, allowing them to see, hear, and interact with an environment and things within this environment that don't actually exist in the physical world around them. Users are fully surrounded by this experience, an effect that VR developers call immersion. Users who are immersed in a space can look around and often move and interact without ever breaking the illusion that they're actually there. Immersion, as we're going to see shortly, is fundamental to the way VR works.

Rob Ruud testing an early build of Ludicrous Speed using an HTC Vive headset

Note

Immersion in VR is a term used to describe a VR system's ability to surround the user with the simulated world. They can look around and, in many cases, move and interact as though they were really there, and because the actual environment is blocked out by the headset, they're given few conflicting cues to remind them that they aren't.

VR hardware

The most common way of immersing a user, and the one we'll be talking about in this book, is through the use of a Head-Mounted Display (HMD), often just referred to as a headset. (There are other ways of doing VR—projecting images on walls, for example—but in this book, we focus on head-mounted VR.) The user's headset displays the virtual world and tracks the movement of their head to rotate and shift the view, creating the illusion that they're actually looking around and moving through physical space. Some headsets, though not all of them, include headphones to add to the illusion by making sounds in the environment seem to come from their sources in the virtual world, through a process called spatialized audio.

Note

You'll see the terms HMD and headset used interchangeably throughout this book and in other writing on VR. Both refer to the same thing.

Some headsets only track the direction the user is looking, while others can track changes to the user's position as well. If you're using a headset that tracks rotation but not position and you lean forward to try to look more closely at an object, your viewpoint won't actually move—the object will seem to drift away from you as you try to lean in toward it. If you do this on a headset that tracks position as well, your virtual head will move closer to the object. We use the term Degrees of Freedom (DoF) to describe the ways objects can move in space. (Yes, it's OK to pronounce it doff. All of the developers do.) Take a look at the following points:

  • 3DoF: A device that tracks rotation but doesn't track position is commonly called a 3DoF device because it only tracks the three degrees of freedom that describe rotation: the degree to which the device is leaning to the side (roll), tilting forward (pitch), or turning sideways (yaw). Until recently, all mobile VR headsets were 3DoF devices, as they used Inertial Measurement Units (IMUs) similar to those found in cellphones to detect rotation, but had no way to know where they were in space. The Oculus Go and Samsung Gear headsets are examples of 3DoF devices.
  • 6DoF: A device that tracks position as well as rotation is a 6DoF device, because it's tracking all six degrees of freedom—roll, pitch, and yaw, but also up and down, side-to-side, and forward or backward movement. Tracking an object's position in space requires a fixed reference point from which to describe its motion, and most first-generation systems needed additional hardware for this: the Lighthouse base stations for the HTC Vive and the Constellation cameras for the Oculus Rift provide this positional tracking on desktop systems. Windows Mixed Reality headsets and standalone headsets such as the Oculus Quest and Vive Focus instead use camera arrays on the headset to track its position in the room (we call this inside-out tracking), so they don't require external cameras or base stations. The HTC Vive, Oculus Rift, HTC Vive Focus, Oculus Quest, and Windows Mixed Reality headsets are all 6DoF devices.

Note

3DoF devices track rotation only, so users can look around or point, but can't move from side-to-side. 6DoF devices track position as well as rotation, so users can not only look around, but can move as well.
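If it helps to see the distinction concretely, here's a minimal sketch in C++ using Unreal's standard math types (the struct names are purely illustrative, not part of any SDK):

    #include "CoreMinimal.h"

    // A 3DoF device can only report which way it's pointing...
    struct FPose3DoF
    {
        FRotator Orientation; // pitch, yaw, and roll
    };

    // ...while a 6DoF device also reports where it is in tracked space.
    struct FPose6DoF
    {
        FRotator Orientation; // pitch, yaw, and roll
        FVector  Position;    // X, Y, and Z, in Unreal units (centimeters)
    };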

Headsets can either be tethered to a computer—as is the case with the Oculus Rift and the HTC Vive, which allows the full computing power of the attached PC to drive the visuals—or they can be self-contained devices such as the Samsung Gear, Oculus Go, Oculus Quest, and HTC Vive Focus. At the time of this writing, wireless connections between PCs and VR headsets are beginning to enter the market.

Most headsets also come paired with input devices that allow users to interact with the world, which can act as pointers or as hands. Handheld devices, as with headsets, can be tracked in three or six degrees of freedom. 3DoF devices such as the Oculus Go's controller are essentially pointers—users can aim them but can't reach out and grab something. 6DoF devices act much more like virtual hands and allow users to interact with the world in a much greater variety of ways.

VR isn't just about hardware though

One of the major mistakes many new developers make when first approaching VR is to apply the traditional designs they're used to creating in 2D space to the VR space, and, for the most part, this doesn't work. VR is its own medium, and it doesn't follow the same rules as the media that came before it. It's worth taking a moment to look at what this means.

When most people first consider VR, they see the headset and assume that it's primarily a visual experience—traditional flat-screen media shown in stereo. It's understandable that it would seem this way, but their perception misses the point. Yes, the VR headset is (depending on whether or not it includes integrated audio) either primarily or entirely a display device, but the experience it creates for the user is very different than the experience created by a traditional flat screen.

Let's imagine for a minute that you're looking at a photo or a 2D video shot looking down over the edge of a tall building. You see the streets far below, but they don't really feel as though they're far below you. They're just small in the image. Take the same image, but now present it in stereo through a VR headset, and you'll probably experience vertigo. Why is this? Take a look at the following screenshot:

Non-immersive media, no matter how large or detailed, still leaves the viewer surrounded by reminders that the scene isn't real. Immersive media, on the other hand, seems to surround the user completely. (Scene: Soul:City Environment Pack, by Epic Games)

First, as we mentioned a moment ago, you're immersed in the experience. There's nothing else in the surrounding world to remind you that it isn't real. Let's jump back to our previous example—the building edge on your television—turn around and look behind you. Oh. You're just in your living room. Even when you look directly at it, the largest television you could possibly buy still leaves you with lots of peripheral vision to remind you that what you're seeing there isn't real. Everything on a flat screen, even a 3D screen, takes place on the other side of a window. You're watching, but you're not really there. In VR, the window is gone. When you look to the right, the world is still there. Look behind you, and you're still in it. Your perception is completely overtaken by an experience that has become an environment, not just a frame you're looking at.

Second, the stereo image creates a sense of real depth. You can see how far down the drop really goes. The cars in the street below aren't just small, they're far away. In a 6DoF headset that allows motion tracking, your movements in the real world are mirrored in the virtual world. You can lean over the edge or step back. This mixture of immersion, real depth perception, and natural response to your movement comes together to convince your body that what you're perceiving is real. We call this phenomenon presence, and it's a sensation that's mostly experienced physically.

Note

Presence in VR refers to the user feeling that they're actually physically in the virtual world, responding to the environment as though they were really there and experiencing these things. Creating an experience of presence is what VR is all about—this is the major thing it can do that other media can't.

The mechanics of immersion and the resulting experience of presence are unique to VR. No other medium does this.

Note

When reading about VR, you'll sometimes see the terms presence and immersion used interchangeably, but it's generally clearer to think of presence as the goal—the sensation you're trying to create in the user—and immersion as the mechanism by which we achieve it.

Presence is tough to achieve

While we're on the topic of presence, it's worth pointing out that it's a fragile phenomenon, and the current state of VR technology still faces a few challenges to creating a sense of presence fully and reliably. Some of these are rooted in hardware and are almost certainly going to go away as the technology advances. Users can feel the headset on their face, for example, and on wired headsets, they can feel the cable running from the headset. The current generation of headsets offers a field of view that's too narrow to provide peripheral vision. (The desktop devices offer a 110° field of view, while your eyes can perceive a field roughly twice as wide.) Display resolutions aren't yet high enough to keep users from being able to see individual pixels (VR users call this the screen door effect), and finicky optics can blur the user's vision if they're not perfectly aligned. This means, in practice, that it's hard to read small text on a VR headset, and that users are sometimes reminded of the hardware when they have to adjust it to get back into the sweet spot for the lenses.

Looking at the state of things, though, it's obvious that these hardware challenges won't last forever. Self-contained and wireless headsets are quickly entering the market, with increasingly reliable tracking that no longer relies on external equipment. Displays are getting wider, resolutions are getting higher, and optical waveguides show great promise for lighter displays with wider in-focus regions. VR works extremely well already, and it's easy to see how it's going to continue to improve.

There are a few other things that can break presence that we can't do as much about—hitting a desk accidentally with a controller, for example, or running into furniture, losing tracking, or hearing sounds from outside the experience. We can manage these when we have control over the user's space, but where we don't, there's not much we can do.

Even given these limitations, though, think about how profoundly the current generation of VR can create a sense of presence in a user, and realize that it only gets better from here. Users believe what they experience in VR to a degree that simply doesn't happen with other media. They explore and learn in ways that aren't possible in any other medium. They empathize and connect with people and places more deeply than they could in any other way short of physically being there. Nothing else goes as deep. And we're only getting started.

What can we do in VR?


So, what can we do with VR? Let's explore this, but before we begin, it's worth pointing out that this medium is still in its infancy. At the time of this writing, we're on the first generation of consumer VR hardware, and the vast majority of the population hasn't even seen a VR headset yet, much less experienced it. Try this: the next time you're in a restaurant or a public space, ask yourself how many of the people around you have likely ever seen a VR headset—a handful at best. Now, how many of them have watched a movie (a century-old medium), watched television (three-quarters of a century), or played a video game (just shy of half a century)? VR is that new. We haven't come close to discovering everything we can do with it.

With that in mind, then, use these ideas as a map of the current state of things and some fodder for ideas, but realize that there's much, much more that we haven't even thought of yet. Why shouldn't you be the one to discover something new?

Games in VR

As we discussed a moment ago, VR at its core creates an experience of presence. If you're developing a game for VR, this means that designs that focus on giving the player an experience of being in a place are good candidates for the medium. Skyrim VR and Fallout 4 VR do a fantastic job of making players feel as though they're really in these expansive worlds. Myst-like games that put the player into a space they can explore and manipulate work well too.

The addition of motion controllers to simulate hands, such as those supplied with the HTC Vive, Oculus Rift, and Oculus Quest, enables developers to create simulations with complex interactions, such as Job Simulator and Vinyl Reality, which wouldn't be possible using traditional game controllers. Tender Claws' Virtual Virtual Reality, meanwhile, provides a great example of achieving 6DoF-like control with the Oculus Go's 3DoF controller.

The immersive aspect of VR means that games that surround you with the experience, such as Space Pirate Trainer, work well because the player can interact with actors all around them and not just what's in front. This need to watch all around you can be a focus of your design.

The sensation of motion VR evokes in players turns fast-moving games such as Thumper and Ludicrous Speed into physically engaging experiences, and games such as Beat Saber capitalize on the player's physical movements to turn the game into a fitness tool as well.

Games in VR present a few challenges too, though. This same experience of presence and physical movement that makes the experience so engaging can mean that not every game design is a great candidate for VR. Simply porting a 2D game into VR isn't likely to work. A Heads-Up Display (commonly abbreviated as HUD) placed over the scene in 2D space won't work in VR, as there's no 2D plane to put it on. Fast movements that could be perfectly fine in 2D may make players motion-sick in VR. The decision to make a game for VR needs to be a conscious choice, and you'll need to design with the medium's strengths and challenges in mind.

Note

When thinking about moving a game or a game design from 2D into VR, there are a few specific areas that need to be considered: will the movement scheme work in VR? How can the UI be designed to fit into the world in VR? Will the game fit within the performance constraints of VR? Does putting this game into VR improve the experience of playing it? We'll address all of these considerations—movement, UI, and performance—in later chapters.

Interactive VR

Interactive VR experiences aren't just limited to games. 3D painting applications such as Tilt Brush allow users to sculpt and paint in room-scale 3D and share their creations with other users. Google Earth VR allows users to explore the earth, much of it in 3D. Interactive storytelling experiences such as Colosse, Allumette, Coco VR, and others immerse users in a story and allow them to interact with the world and characters. Interactive VR applications and experiences can be built for productivity or entertainment and can take almost any form imaginable.

It's worthwhile to keep a few considerations in mind when thinking about creating an interactive VR application. The mouse and keyboard aren't generally available to users in VR—they can't see these devices to use them, so interactions are usually best designed around the controllers provided with the VR system. Text can be difficult to read in VR—display resolutions are improving, but they're still low enough that small text may not be readable. The lack of a 2D HUD means that traditional menus don't work easily—usually, these need to be built into the world or attached to the player's virtual hands (see Tilt Brush for an excellent example of this.)

Note

Input and output are the main considerations for interactive VR—how will the user communicate input to the system, and how do they get information back out of it? In both cases, you have to design around the strengths and weaknesses of the system. You don't have a 2D HUD or a mouse, but you do have objects that can be moved and manipulated in space. VR displays can't yet approach the resolution of a desktop monitor, so reading a lot of text may not work. Successful designs in VR take these factors into account and turn them into deliberate design choices.

Interactive VR offers incredible possibilities for entirely new ways of exploring and interacting, and it's likely that we haven't even begun to see the full range of possibilities yet.

 

VR cinema – movies, documentary, and journalism

The same experience of presence that makes VR so well-suited for certain types of games makes it a powerful medium for documentary and journalism applications. VR is able to immerse users in a circumstance or environment and can evoke empathy by allowing viewers to share an experience deeply. Chris Milk, a pioneering VR filmmaker, has referred to VR as the "ultimate empathy machine," and we think that's a fair description. Alejandro Iñárritu's CARNE y ARENA was awarded a special Oscar by the Academy of Motion Picture Arts and Sciences in 2017 to recognize its powerful use of the medium to tell a story with deep empathy. VR's capacity to create presence through immersion makes things possible that simply can't be done on a flat screen.

A player experiencing Alejandro Iñárritu's CARNE y ARENA at Los Angeles County Museum of Art

Film and video in VR can be presented in several ways, which generally boil down to the shape of the virtual screen on which the images are presented and whether those images are presented in monoscopic 2D or stereo 3D. Flat or curved surfaces are generally used to present media carried over from traditional film or television, while domes, panoramas, or spheres can be used to surround the viewer with a more immersive 2D or 3D experience.

Mono 360° video surrounds the viewer but lacks depth—it's simply mapped onto a sphere surrounding the player. This has the advantage of being easier to produce, requiring far less storage and less expensive equipment, and for many scenes, the difference between this and true stereo may be difficult to detect. Most early VR videos were produced this way. Stereo 360° video is similarly mapped to a sphere around the player (we'll learn how to do this in a later chapter), but displays a different image to each eye for true stereo depth. (We'll learn how to do this too.) New approaches to volumetric video that use light fields, Light Detection And Ranging (LIDAR), and photogrammetry to map real environments into genuine 3D virtual environments are beginning to appear and will likely become more prevalent as the technology matures and processing power increases. As of this writing, they're still fairly new, often expensive, and largely confined to the realms of high-end professionals and academics.

Documentary and journalism pieces are most often filmed as live-action video shot on a 360° camera or rig, in mono or increasingly in stereo, allowing the viewer to look around and become immersed in a seamless sensory environment. 360° cinema is generally intended to be a direct, immersive, and engaging experience, but is usually not interactive. The viewer is generally not able to move freely through the scene except by triggering a cut to a new scene and generally can't affect the events that go on within the scene. 

Note

In planning a cinematic VR experience, two of the primary choices to make are the following: will the experience be presented in mono or stereo, and what's the shape of the virtual screen on which it will be displayed?

Cinematic VR is another area in which simply porting the language of the flat screen isn't enough. There's no concept of a frame in 360° film, and no concept of a shot size such as a close-up or a long shot. VR filmmakers have to be very careful about moving the camera, as it's very easy to make viewers sick with a moving or shaky camera. Film-making in VR is still in its infancy, and we're beginning to learn the ways the grammar of the language differs from traditional film or television, but we still have a long way to go before we'll fully understand the language of the new medium.

This hasn't stopped filmmakers such as Alejandro Iñárritu, Nonny de la Peña, Chris Milk, and Felix & Paul from creating astonishing and powerful cinematic experiences in VR, and this highlights what an exciting time it is to be participating in the creation and discovery of a powerful and entirely new art form.

 

Variants of VR cinema include the following:

  • Narrative stories
  • Documentaries
  • Journalism
  • Concerts and happenings
  • Sports
  • Virtual tourism

Architecture, Engineering, and Construction (AEC) and real estate

VR is ideally suited for Architecture, Engineering, and Construction (AEC) planning, as it allows designers to explore and iterate quickly on designs, and it serves as an excellent communication tool between designers and clients. VR provides an immersive experience that allows the user to explore and review the space in a real-world scale in a way that simply isn't possible through any other medium.

Note

The Architecture, Engineering, and Construction industries are often bundled under the blanket initialism AEC.

For the same reasons that VR is such a useful tool for AEC, it's equally useful for real-estate applications, providing prospective buyers an opportunity to tour a home remotely, or to experience a space before it's been built. No medium represents space and scale better than VR.

Unreal Engine, as we'll see, is particularly well-suited for architectural applications, as its physically-based workflow for materials and lighting makes it possible to create surfaces that look real and respond to light the way their real-world counterparts would.

In addition to providing a realistic lighting and shading model ideal for the realistic representation of spaces, Epic Games (the makers of Unreal Engine), also provides a suite of tools designed for non-game uses such as architectural visualization. The most important of these is a toolkit called Datasmith, which allows high-detail scenes to be imported from architectural Computer-Aided Design (CAD) and 3D packages into Unreal with little or no modification required to reproduce the object placement, lighting, and shading from the original source.

Note

Architectural visualization is often shortened to archvis or archviz.

In terms of practical workflow, engineering and architecture environments for VR usually begin in a CAD or 3D Digital Content Creation (DCC) tool, and are then brought into Unreal either by hand or by using the Datasmith workflow, where they can be turned into environments that can be explored in VR.

For real-estate applications, the environment may be fully modeled in 3D, or may be photographed as a 360° sphere or panorama, which provides less interactivity but is much easier and less expensive to produce. Even though it limits the user's movement, 360° photography can still provide an immersive sense of the space that the user couldn't experience otherwise.

Engineering and design

As with building planning, VR can be an outstandingly effective tool for engineering and other design applications. Designs can be tested in depth and iterated rapidly without requiring physical prototypes to be built and can be placed in virtual environments that allow them to be evaluated in context. Designers can use VR to explore designs and see how parts will fit together and to communicate with stakeholders in an experience that closely replicates the experience of actually handling and interacting with the object.

Education and training

It can be argued that VR began its life in education, in 1929, when Edwin Link created the Link Trainer to train aircraft pilots using an early immersive simulator. The combination of immersion and interaction makes VR a powerful tool for education, learning, and exploration. VR, at its core, is capable of providing a much more concrete and experiential understanding of a subject than other media. Where most other media communicate ideas, VR communicates direct experience. 

Traditional education often focuses on communicating facts to students, but facts in isolation can bore or overwhelm them if they don't yet have sufficient context to know what they need them for in the first place. VR, by contrast, can be used to allow students to discover and learn concepts by working directly with materials and representations of ideas they're exploring and learning, practicing real skills and turning abstract ideas into experience. Context is a natural by-product of immersion, and VR's ability to evoke presence can be instrumental in creating a physical, social, or emotional frame for the subject being learned. This can potentially make it meaningful or understandable to the student in ways that may not otherwise have been possible, and can allow students to explore the ways a complex system's parts fit together. 

VR can also aid concentration, because it isolates the student's senses from distractions that aren't part of the topic being explored, and it can be effective at creating virtual social learning environments, such as virtual classrooms.

Educational VR can (and should) be made easy to use, immersive and engaging, and meaningful to the student and can allow students to learn at their own pace and use its interaction to fuel their own exploration and discovery.

Commerce, advertising, and retail

VR in commerce (sometimes nicknamed v-commerce) offers a range of new ways for customers to experience products and can create opportunities to connect customers with products they may not otherwise have encountered. Car buyers, for instance, can explore color choices and options in a virtual car configurator that allows them to experience what their chosen options would look and feel like around them. This experience can also be instrumental in moving an aspirational purchase out of the imagination and into the realm of something that feels real.

For retailers, VR offers a way to reach customers who are not able to visit shops, increasing accessibility and the likelihood of sales. Customers can see more clearly and in context what a product is, reducing confusion and returns. VR can give a customer a chance to try before they buy, even where the product might be too large, too far away, or too elaborate to demonstrate effectively by other means. Virtual showrooms, for example, can allow customers to place furnishings together into a virtual environment that allows them to see how the pieces would fit together and how they might fit in their own space.

VR can be used as well to facilitate an emotional connection with the brand, placing the customer into a virtual environment or experience that supports the brand's emotional space, such as a mountaintop or a fashion show.

Medicine and mental health

VR offers promising opportunities as well in psychology, medicine, neuroscience, and physical and occupational therapy. VR, for example, can be used in physical therapy by slowing time and allowing patients to perform actions slowly and repeatedly and has been used successfully for pain management. VR is also useful for providing simulated virtual patients for medical and emergency training.

In the fields of mental and behavioral health, VR has powerful applications in assessment, training, and the treatment of stress-related disorders. Patients can be exposed to complex stimuli to help to assess and rehabilitate cognitive functions for stroke, traumatic brain injury, and similar neurological disorders.

So much else

The common thread running through all of the uses of VR described here is that VR works especially well to communicate context and create meaning through presence, and to allow complex physical interactions with objects that just couldn't be done with a flat screen. Without question, there are still more valuable uses of VR that haven't yet been discovered or considered. The only limit is our own imagination.

Immersion and presence


Now that we've set up a bit of context about what VR is and a few of the many things we can do with it, let's start getting our hands dirty and learn the following:

  • What makes VR work
  • What can break it
  • What we need to do as developers to make sure the VR experiences we build succeed

To that end, let's lay out a few best practices in VR, and then we'll talk about them in depth.

We'll begin by talking about the experience we're trying to create.

Immersion

When VR works, as we discussed earlier, it works through a process we call immersion, which we described earlier as the system's ability to surround the user with the simulated world so completely that they feel physically present in it. For an experience to be immersive, a few things need to be true.

Using all the senses

First, it has to encompass a wide enough range of the user's senses that competing senses from outside the VR experience don't pull the user back out of the virtual space. In practice, this is why VR headsets are designed to block out all other light, and why they usually include headphones or on-board audio. Anything we see or hear that isn't part of the VR experience risks breaking immersion.

While vision and sound are pretty easily communicated through the eyes and ears, physical sensations are more difficult to produce. In VR, we refer to physical sensations as haptics. Decades of research have gone into figuring out how to recreate physical sensations, but in practice, it's a tough problem to solve. In the current generation of VR hardware, haptics take the form of a rumble pack in the player's controller that vibrates the controller on cue. While it's limited just to the hand holding the controller, even using such basic haptic feedback as this in your designs is still surprisingly effective for creating a sense of physicality in virtual space. A little vibration when the user's virtual hand contacts an object can go a long way toward making the object feel as though it's physically there and to allow users to sense its boundaries and know when they've made contact with it.

Note

Remember to use all the senses to create an immersive experience, not just the visual. Use sound to involve the ears in the experience and haptic feedback on the controllers to create physical cues.
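To make this concrete, here's a minimal sketch of a haptic cue in UE4 C++. SetHapticsByValue is the engine's simple frequency/amplitude rumble call on APlayerController; the helper function and the specific values are our own illustration:

    #include "CoreMinimal.h"
    #include "GameFramework/PlayerController.h"
    #include "InputCoreTypes.h"

    // Illustrative helper: pulse the controller in the given hand when the
    // user's virtual hand touches something. Frequency and amplitude are
    // both in the 0-1 range; calling again with zero amplitude stops it.
    void PlayTouchHaptic(APlayerController* PC, EControllerHand Hand)
    {
        if (PC)
        {
            PC->SetHapticsByValue(/*Frequency=*/0.5f, /*Amplitude=*/0.3f, Hand);
        }
    }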

Make sure sensory inputs match one another and match the user's expectations

Senses need to match the user's expectations, and they need to match one another, for an immersive experience to feel real. When the user turns their head, an object they're looking at should move in their view as it would if it were in the physical world around them. This part is pretty well handled for you by Unreal Engine and your VR hardware, but the next bit, sound, is often overlooked by developers.

Objects that produce sound should use spatialized audio to ensure that sounds seem to come from where the objects appear to be. As we mentioned a moment ago, physical objects should produce a tactile response using haptic feedback when the user appears to touch them.

Note

The behavior of visual objects is pretty much taken care of for you by the HMD and Unreal Engine, but make sure you use spatialized audio to localize sounds to their apparent sources, and experiment with haptic feedback to make physical actions feel more real.
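As a sketch of what this looks like in UE4 C++ (the helper function is ours; PlaySoundAtLocation and sound attenuation assets are standard engine features):

    #include "CoreMinimal.h"
    #include "Kismet/GameplayStatics.h"
    #include "Sound/SoundBase.h"
    #include "Sound/SoundAttenuation.h"

    // Play a sound at the world location of the object that produced it.
    // Passing an attenuation asset tells the audio engine to spatialize
    // the sound so it seems to come from that point in space.
    void PlaySpatializedSound(UObject* WorldContext, USoundBase* Sound,
                              USoundAttenuation* Attenuation,
                              const FVector& Location)
    {
        if (Sound)
        {
            UGameplayStatics::PlaySoundAtLocation(WorldContext, Sound, Location,
                /*VolumeMultiplier=*/1.0f, /*PitchMultiplier=*/1.0f,
                /*StartTime=*/0.0f, Attenuation);
        }
    }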

Keep latency as low as possible

The quality of the visual and audio experience matters greatly to immersion, and the single most important factor driving this quality is the smoothness and responsiveness of the experience. What this means for developers is that frame rate matters above every other consideration in VR. VR developers use the term latency to describe the responsiveness of a VR application—the time between the user performing an action, such as turning their head, and seeing the visual result (in this case, the world appearing to rotate around them). Developers call this motion-to-photon time, and it's important. If the user turns their head and the world lags behind, it won't feel real and, worse, it can make them sick. Current VR headsets do quite a lot in hardware and software to minimize and disguise latency, but as a developer, you have to do a lot as well to keep latency as low as you can possibly get it.

Note

Latency refers to the speed at which a VR application responds visually to the user's actions and is fundamental to immersion in VR. Research suggests that the absolute highest latency you can get away with is 20 milliseconds, but you should be shooting for far less.

In practice, this means that, when you have to choose between detail in your scene and frame rate (and you'll have to make this choice all the time as a developer), choose speed. Users will forgive a lower-resolution texture much more easily than they'll forgive a dropped frame. Much of your work in VR development will focus on getting your scene running at an acceptable frame rate, and we'll talk quite a bit about how to do this in Unreal. For now, make sure you keep this in mind: keeping latency low is absolutely fundamental to immersion in VR, and the choices you make in designing and developing a VR application have to be made with this in mind.

Note

When faced with a choice between image quality and framerate, choose framerate every time. Beautiful textures, high-poly models, and dynamic shadows won't create a convincing experience for the user if they're dropping frames. Users will fill in a remarkable amount of detail in their own minds if the experience is running smoothly, but they won't believe it at all, or worse, will get sick, if latency gets too high.
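In Unreal, the quickest way to check how you're doing against this budget is with the engine's built-in stat console commands while your project is running:

    stat fps    (displays the current frame rate)
    stat unit   (breaks frame time down into Game, Draw, and GPU milliseconds)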

Make sure interactions with the world make sense

Interactions with objects should be consistent and they should make sense. With the immersive nature of VR comes an increased expectation that objects will behave as they would in real life. In traditional media, delivered on a flat screen and constrained to a frame, the user's eyes and brain are consistently reminded that they're looking at a flat image that isn't real, and they'll forgive a lot. But in VR, the world already surrounds them and seems to be real, and they'll expect it to behave as though it's real too. 

Things that don't behave or respond in ways they would in the real world can pull the user out of the experience and break immersion. There's a limit in practice: of course, you can't make every object in the world interactive, but to the degree possible, you should pay attention to what your users' expectations are going to be, and try to meet them. If you put an object in the scene that looks as if it can be picked up, expect users to try to pick it up, and understand that you'll be working against immersion if it doesn't behave as they thought it would. Try to make objects in the scene that look interactive be interactive, and if they can't be, consider moving them out of the play area or changing their appearance to manage the user's expectation. This is another area where judgment comes into play—not everything can be interactive, and you may not always want it to be, depending on the kind of experience you're trying to create. You should be making conscious choices with immersion in mind when deciding how objects in your world should behave, and those choices should feel consistent with each other within the space of the world and not arbitrary.

Note

Users will try to reach out and touch objects that look as if they can be touched and will try to move them. Try to satisfy their expectations where you can, or design your scene in such a way that these interactions aren't expected.

Explore the unique opportunities for interaction that VR, especially 6DoF VR with hand controllers, gives you. In previous media, users mostly interacted using a mouse, buttons, and joysticks, but in VR, the user's hands interact with the world much more directly, and this makes an entirely new range of interactions possible. Where in a traditional game a watering can might be used by pushing a button, in VR, the user can squeeze the controller grips to pick it up and turn their hand over to use it. Think about what makes sense in your world, and what becomes possible when the user's hands enter the picture, and design to make use of these opportunities. Interfaces don't have to consist of just buttons anymore.

The user's expectations for interaction will vary depending on the type of experience you're creating. If you're making a game that simulates the experience of being in another world, immersion matters a lot. If, on the other hand, you're making a movie viewer, the user probably doesn't really care whether a virtual coffee cup on a table nearby can be picked up, because that's not what they're there for. It's up to you to understand what's going to matter to your users and what isn't and to meet those expectations.

The way you represent the user's hands will drive expectations of how they'll behave as well. If they're modeled as hands, the user may naturally expect that they can pick objects up and move them around. If instead of hands, you display models of the controllers, palettes, weapons, or other tools, you're suggesting a different type of interaction. Users will try to do what it looks like they can do.

Build a consistent world

As we mentioned previously in the discussion about interaction, the whole experience should make sense as it fits together. Users should be able to construct a model of reality, even if it's an abstract or a complete fantasy, from what you give them in the world. The place you're building should feel like a place, with its own language and rules.

The amount of detail you put into your world can have an impact here. The more immersive an experience becomes, the more fragile that immersion becomes. Adding details and immersive elements creates a raised expectation that everything else in the world will live up to that standard too and can pull the user back out of immersion if something doesn't behave consistently with the apparent rules of the world. In many cases, you may want to render your world in a more stylized way to manage the user's expectations. Immersion doesn't require the VR experience to mirror the real world perfectly—it requires the experience to be consistent with itself.

Be careful of contradicting the user's body awareness

Be careful of adding immersive elements that contradict the player's awareness of their body. We all have a natural awareness of where our body is and what it's doing. This is called proprioception, and it's the sense that tells you where your arms and legs are even when you're not looking at them. Representing the user's body in ways that don't match this sense can break immersion.

Rendering the user's hands usually works well, as the motion controllers tell us exactly where they are, but it may not be such a good idea to render the rest of the arm, since we have no information about what that arm is really doing. If you guess and get it wrong, it will feel wrong to the user and can break immersion. It's often better not to guess at all, and simply render the hands up to the wrists, leaving the arms, legs, and body imagined. Interestingly, users seem to prefer this. They tend not to notice that the body is invisible until it's pointed out to them, whereas a body that's rendered but wrong calls attention to itself.

For similar reasons, realistic, fleshy hands can make users feel uncomfortable if they don't match their real-world hands. Hands work much better if they're stylized as translucent, cartoonish, or robotic, so users don't feel as if they're trying to simulate reality and getting it wrong.

Note

Animators commonly refer to a phenomenon called the uncanny valley, which occurs when a simulation gets just close enough to resembling a human that it triggers the viewer's instinctive awareness of everything that's wrong with it. For a simulation to work, it either needs to be stylized enough that viewers don't expect realism, or it needs to get the realism right. Anything in between is creepy. The same principle holds for representations of the user's own body in VR. Don't get it almost right. Get it perfect, or stylize it. 

Decide how immersive you intend your application to be and design accordingly

Finally, not every use of VR needs to be equally immersive. Your choices in this really hinge on what your application is intended to do. If it's a tool for visualizing engineering models, you may be most interested in VR's ability to allow the user to manipulate models easily, and it may not matter so much to you whether they really believe they're in another place. If, on the other hand, you're creating an immersive game or cinematic experience, these choices will be critical. It's up to you to figure out which of these rules matters most for your particular application.

 

Presence

Immersion in VR serves a single goal—the creation of an experience of presence in the user. Presence, as we previously defined it, is a sensation of being in a place, and it is, in large part, a phenomenon perceived physically. Very often, users respond physically and instinctively to things in the world, such as heights or objects flying toward them. The body largely believes what it perceives in VR and responds accordingly. If you think of presence primarily in physiological terms, you'll have an easier time understanding how users experience it. What does this experience make your user feel?

A key to understanding presence is to understand that VR doesn't so much work by trying to simulate an environment accurately as it does by triggering and fooling a range of systems that we use to perceive the world. This is one of the reasons why we can get away with low detail in our textures if we get the movement of the world right and keep latency low—our perceptive systems are much more aware of motion than detail. VR doesn't have to fool all the senses—just the right ones in the right ways.

Simulator sickness

A major factor you'll be dealing with quite a lot in VR is simulator sickness, a form of visually-induced motion sickness that commonly occurs in VR.

As humans, we spend most of our time walking upright, which takes a tremendous amount of coordination to achieve, and yet we manage to do it without thinking about it. We manage this through a structure in our inner ear called the vestibular system, which we use to coordinate movement and keep our balance. This system is extremely sensitive, and it works in conjunction with our vision and our sense of our body (proprioception) to understand how we're moving.

Note

You'll hear VR developers talk a lot about the vestibular system or the inner ear. For our purposes, since the vestibular system is located in the inner ear, we mean the same thing by both terms and use them interchangeably. This is one of three systems that tell us whether we're moving and how to keep our balance. The other two are our visual system and our proprioception (our natural sense of our body's position). Problems arise when signals from these three systems don't agree with one another.

This creates a problem when visual information tells the body that it's moving, but it can't feel that movement in the inner ear. (Researchers call this the sensory conflict theory.) Seasickness and carsickness happen for the same reason. When visual movement cues and movement cues coming from the vestibular system in the inner ear don't match, the body can respond by triggering nausea, sweating, and other effects. (Researchers don't yet agree on why this is, but one theory suggests that when the senses don't match, the body may assume that it's been poisoned.)

The challenge with VR is that it does such a good job of simulating movement. The user's mind naturally accepts that the movement they see is really occurring, and runs into problems when the signals from the inner ear don't confirm this. Developers need to be conscious of this challenge and deal with it. We'll talk about ways to do this in a moment. (Be aware that the opposite is true too—always show movement in the headset if the user turns their head.)

Note

Simulator sickness, sometimes shortened to simsickness, is a form of motion sickness that can occur in VR. (You'll also sometimes see it shortened to VIMS for Visually-Induced Motion Sickness.) The most common cause of simulator sickness is poorly-designed locomotion. The second most common cause is high latency. How users move through the world, and how smoothly and consistently it responds to their movements are critical factors in combating simulator sickness.

Safety

Another major consideration is safety. Because VR completely overwhelms the user's senses, it's possible to put users into unsafe situations, and it is up to you as a developer to try to avoid this. If you tilt the horizon, for example, there's a high likelihood that your users are going to lose their balance. If you've designed an experience that involves big physical movements, such as swinging a sword or a baseball bat, be aware that your users can't see what's around them and can easily hit objects in the real world. Be conscious as well of factors that can cause eyestrain, such as forcing users to focus on UI elements that are too close to the camera, and photosensitive seizures that can be induced by flashing lights.

With these factors in mind, let's get specific about laying out a few best practices that can help to keep your users comfortable and safe.

Best practices for VR


Now that we've talked a bit about immersion and presence, let's take a look at a few specific practices we can follow to keep our users comfortable and avoid breaking immersion. Don't consider any of these to be set in stone (except the requirements to maintain framerate and to leave the user's head alone)—VR is still a very new medium and there's a lot of room to experiment and find new things that work. Just because someone says a thing can't be done doesn't mean it can't. That having been said, the following recommendations generally represent our current best understanding of what works in VR, and it's usually a good idea to follow them.

Maintain framerate

Are you sensing a pattern here? You absolutely must maintain frame rate. High latency will pull the user right out of immersion, and it's a leading trigger for simulator sickness. Consider the work you're asking the renderer to do in VR, and you'll see that this is going to be a bit of a challenge. The HTC Vive Pro displays a 2,880 x 1,600 image (1,440 x 1,600 per eye), while the original Vive and the Oculus Rift display 2,160 x 1,200 (1,080 x 1,200 per eye), and all of them require this to happen 90 times per second, leaving the renderer 11 milliseconds to prepare each frame. The Oculus Go displays 2,560 x 1,440 pixels (1,280 x 1,440 per eye) 72 times per second, meaning the renderer has about 13 milliseconds to deliver the frame. The Unreal Engine renderer is blazingly fast but, even so, this is a lot to render, and there is not a lot of time in which to get the frame drawn. You're going to have to make some compromises to reach your target. We'll talk about ways to do this throughout this book.

Here's a list of headsets currently on the market and their rendering demands.

Tethered headsets:

  • Oculus Rift: 2,160 x 1,200 (1,080 x 1,200 per eye), 90 FPS (11 ms)
  • HTC Vive: 2,160 x 1,200 (1,080 x 1,200 per eye), 90 FPS (11 ms)
  • HTC Vive Pro: 2,880 x 1,600 (1,440 x 1,600 per eye), 90 FPS (11 ms)
  • Windows Mixed Reality: varies; most display 2,880 x 1,440 (1,440 x 1,440 per eye), 90 FPS (11 ms)

Standalone headsets:

  • Gear VR: varies depending on the phone used, 60 FPS (16 ms)
  • Oculus Go: 2,560 x 1,440 (1,280 x 1,440 per eye), 72 FPS (13 ms)
  • Oculus Quest: 3,200 x 1,440 (1,600 x 1,440 per eye), 72 FPS (13 ms)

 

Bear in mind as well that you should aim to keep your frame times comfortably under these budgets, so momentary hitches don't cause major discomfort.
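These millisecond budgets are simply the reciprocal of the target frame rate. A tiny illustrative helper (ours, not an engine function) makes the arithmetic explicit:

    // Frame budget in milliseconds for a given target frame rate.
    constexpr float FrameBudgetMs(float TargetFPS)
    {
        return 1000.0f / TargetFPS;
    }

    // FrameBudgetMs(90.0f) -> ~11.1 ms (Rift, Vive, Vive Pro, WMR)
    // FrameBudgetMs(72.0f) -> ~13.9 ms (Oculus Go, Oculus Quest)
    // FrameBudgetMs(60.0f) -> ~16.7 ms (Gear VR)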

VR hardware does some work to reduce perceived latency when the frame rate drops and a new frame isn't ready to be displayed when the headset needs it, but it does this by a bit of trickery: the hardware re-renders the last frame, adjusting it to fit the user's current head movement, so what the user sees isn't an exactly correct frame—it's just better than dropping the frame altogether. (Oculus calls this process Asynchronous Time Warp (ATW); on the Vive, it's called Asynchronous Reprojection.) Don't use time warp or reprojection as a crutch, though—they're there to keep the user comfortable when your application hitches, but it's still a degraded experience for the user. Don't let your application miss the target frame rate for extended periods.

Also be sure to test your application on the minimum spec hardware you intend to support, and give your users ways to scale the rendering demands so they can meet the frame rate target on the hardware they're running.

Never take control of the user's head

Beyond dropping frames, the next most common cause of simulator sickness is the sensory conflict we mentioned earlier—a mismatch between motion perceived visually and motion felt in the inner ear. There are two major types of motion you're going to need to accommodate in VR:

  • Movement of the player's avatar (walking around, teleporting, or piloting a vehicle)
  • Movement of the player's head relative to their avatar

Movement of the player's avatar is handled by the locomotion system you implement for your experience. You really don't have a choice here—you're going to have to create movement that isn't happening in real life, but there are things you can do to make this less of a problem and we'll talk about them shortly.

Note

The word avatar originated in Sanskrit and referred to the embodiment of a deity in human form. In its current usage, it extends this metaphor to refer to the embodiment of a human user in a virtual world. You'll hear the term commonly used to refer to a character in a simulated world under the control of a human player. Its companion term, agent, refers to a character under the control of an AI routine.

You should never interfere, however, with movement of the player's head.

What this means in practice is: never move the camera in a way the user didn't cause by their own actions. If you're making a game and the user's avatar dies, don't leave the camera bolted to the head as the body falls. You will almost certainly make users sick if you do this. Consider cutting to a third-person view instead, or handle the action in some other way. Never move the camera to force the user to look around in a cinematic, and don't apply a walking bob or a camera shake. The user should always control their head.

Note

Never move the camera separately from the user's head, and never fail to move the camera when the user's head is moving. You should always maintain a 1:1 correlation between head movement and camera movement relative to the user's avatar.

This applies both ways. If the player moves their head, the camera must move, even if the game is paused or loading. Never stop tracking.

If you need to teleport the user to a new location or change cameras for any reason, consider using a fast fade to black or white to cover the transition. People instinctively blink when they turn their heads quickly, and it's a good idea to mimic this behavior.
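In UE4, one common way to implement this is with the camera manager's fade. Here's a hedged sketch on a hypothetical VR pawn class (StartCameraFade is a standard APlayerCameraManager method; the timer-driven structure around it is illustrative):

    #include "CoreMinimal.h"
    #include "Camera/PlayerCameraManager.h"
    #include "GameFramework/PlayerController.h"
    #include "TimerManager.h"

    // Fade to black, move the pawn while the screen is dark, then fade back in.
    void AMyVRPawn::TeleportWithFade(const FVector& Destination)
    {
        APlayerController* PC = Cast<APlayerController>(GetController());
        if (!PC || !PC->PlayerCameraManager)
        {
            return;
        }

        // Quick fade out (0.1 s), holding at black until we fade back in.
        PC->PlayerCameraManager->StartCameraFade(0.0f, 1.0f, 0.1f,
            FLinearColor::Black, /*bShouldFadeAudio=*/false, /*bHoldWhenFinished=*/true);

        // Once the screen is dark, teleport and fade back in.
        FTimerHandle FadeTimer;
        GetWorldTimerManager().SetTimer(FadeTimer, [this, PC, Destination]()
        {
            SetActorLocation(Destination);
            PC->PlayerCameraManager->StartCameraFade(1.0f, 0.0f, 0.1f,
                FLinearColor::Black, false, false);
        }, 0.1f, /*bLoop=*/false);
    }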

In-game cut scenes need to be handled differently in VR than they would be on a traditional flat screen, for the same reasons. Ordinarily, in authoring a cut scene, you would take control of the camera, moving and cutting from shot to shot, but you can't do this in VR. You can't control where your user is going to look, and you need to be careful moving them around. This leaves you with a few options. First, if your scenes are pre-rendered, then you really have no choice but to map them onto a screen in the virtual environment. This breaks immersion, but is no more difficult for the user than watching a movie in real life. If you're rendering them in-engine, you need to think about how you're going to handle the player's point of view.

For a first-person point of view, it's probably best to stage the cinematic scene around the user and allow them to look and move freely within it. You can't cut away to another shot when doing this, and you can't guarantee that your user will be looking where you want them to look when a key moment occurs, but it's the most immersive approach you can take.

Cut scenes can also be handled in the third person, in which case you pull the user's viewpoint out of their body and allow them to watch the scene unfold, but you need to do this carefully: the out-of-body experience can be disorienting for your player and can weaken immersion and the player's identification with the character.

For film-making in VR, be very careful about how you move the camera. Even very small moves may induce sickness. Users tolerate forward movement more easily than side-to-side or rotational movement, and they seem to tolerate movement more easily when it's justified by a visible vehicle or some other explanation of why it's happening.

Thinking about how to use the camera in VR isn't just about managing user discomfort either. This is new territory, and the rules you learned from film and gaming work differently here. You're designing to recreate the user's eyes, not a camera, and this has far-reaching implications for your compositions. How does the user move? Do they know where you want them to look? What do they see when they look at their hands? What about a mirror? How does surrounding the user with a world (instead of making them watch it through a window) change their relationship to it? All of these factors require a conscious choice as you develop your work.

Do not put acceleration or deceleration on your camera

Depending on the type of application you're creating, you're probably going to need to give your users a way to change their location, either by teleporting or by moving smoothly. (We'll dig into this in depth in a later chapter.) If you do choose to implement a smooth movement method, though, don't accelerate or decelerate as the player starts and stops moving. Start the movement at full speed, or if you opt to smooth your starts and stops at all, keep them very short. (And, of course, never play a start-moving or stop-moving animation that takes control of the user's camera.)
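
In code, this simply means applying movement at full speed from the first frame of input, with no interpolation on the way in or out. A minimal sketch, assuming a hypothetical AMyVRPawn with a cached MoveInput axis value:

```cpp
// Member of a hypothetical AMyVRPawn. MoveInput is an FVector2D member
// (-1..1 on each axis) cached from input bindings elsewhere.
void AMyVRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (!MoveInput.IsNearlyZero())
    {
        const float SpeedCmPerSec = 300.f; // full speed from the first frame; no ramp

        // Flatten the pawn's facing to the ground plane so we don't fly.
        FVector Forward = GetActorForwardVector();
        FVector Right = GetActorRightVector();
        Forward.Z = 0.f;
        Right.Z = 0.f;

        const FVector Direction =
            Forward.GetSafeNormal() * MoveInput.Y + Right.GetSafeNormal() * MoveInput.X;

        // Sweep so we stop against walls instead of passing through them.
        AddActorWorldOffset(Direction * SpeedCmPerSec * DeltaSeconds, /*bSweep=*/true);
    }
}
```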

Do not override the field of view, manipulate depth of field, or use motion blur

We mentioned a moment ago that VR mimics the user's eyes, not a camera. For this reason, don't do things in your simulation that eyes don't do in real life. Never override the camera's field of view or change its focal length: the headset's optics determine the field of view the user actually sees, and eyes don't zoom the way cinematic lenses do. You're very likely to make your user sick if you change these.

Manipulating the depth of field isn't a good idea in current-generation VR, as we don't yet have a reliable way to know what the user is actually looking at within the view. In the future, as eye-tracking improves, this will likely change, but for now, don't make this choice for your user.

Motion blur shouldn't be applied to your camera or to objects in the scene. Motion blur is an artifact of the way film exposes a static frame over a fixed period of time, smearing the motion within that frame. That's not the way eyes work, and it will look unnatural in VR.

While we're on the topic, steer clear of other camera-mimicking effects, such as lens flares and film grain. Again, these mimic the behavior of film, not the eyes, and we're not trying to mimic film in VR. Filmic effects such as these can also cause unwanted physical side effects in the user, contributing to simulator sickness if the effects don't line up between the eyes, and they cost precious frame time to render. Don't use them.

Minimize vection

Have you ever looked out the window of a stationary car, watched a large vehicle such as a truck or bus move past, and felt as though you were moving in the opposite direction instead? This phenomenon is called vection: the illusion of self-movement produced by optical flow patterns. If a large portion of your view is moving, it can produce sensations of movement in your body, and as we discussed earlier, sensations of movement that don't match the signals from the inner ear can trigger simulator sickness.

Note

Vection is the illusion of movement produced when large parts of your field of view move. Optical flow, or optic flow, refers to the pattern of movement of the contents of your view, and it's these patterns of movement that cause vection.

What this means in practice is that, if a big chunk of your user's view is moving, you're at risk of inducing simulator sickness. We've talked about this already with regard to moving the user's head (don't do it), and we've touched on some of the ways we can handle this in your locomotion system, but you'll also want to be aware of other circumstances that can cause vection.

Be aware of moving patterns that fill large parts of the frame—whether or not they're part of your locomotion system, they can still create an illusion of motion, which may be a problem for your users.

Several games and applications have experimented with a tunnel vision effect to reduce vection when users need to move quickly through the environment—when the player's avatar runs, an iris closes in from the edges of the view to reduce peripheral vision.
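
Shipping titles usually build this iris as a custom post-process or camera-attached material, but you can approximate the idea with the engine's built-in vignette. A rough sketch, assuming a hypothetical pawn with a UCameraComponent* Camera member:

```cpp
// Member of a hypothetical AMyVRPawn with a UCameraComponent* Camera member.
// Call this each tick while artificial locomotion is active.
void AMyVRPawn::UpdateComfortVignette(float CurrentSpeed, float MaxSpeed)
{
    const float Alpha = FMath::Clamp(CurrentSpeed / MaxSpeed, 0.f, 1.f);

    // Darken the periphery as speed rises; 0.4 is the engine's default intensity.
    Camera->PostProcessSettings.bOverride_VignetteIntensity = true;
    Camera->PostProcessSettings.VignetteIntensity = FMath::Lerp(0.4f, 1.0f, Alpha);
}
```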

Users seem to be much more tolerant of forward movement than they are of strafing—moving side-to-side. This may in part be because, in real life, we move forward far more than we move sideways, but it may also be because the optical flow the user sees when moving forward still has a relatively fixed point at the center, whereas in sideways movement, everything in the view moves.

Note

When you're trying to figure out whether a particular movement in VR is likely to cause simulator sickness, it can be useful to think about the kind of optic flow that movement is going to create. Optic flows with relatively fixed reference points, such as the horizon when running forward, may be fine, while flows that move everything in the view, such as sideways movements, may not.

Rotating the player's view is especially problematic. It moves pretty much everything in the view, and the vestibular system is especially tuned to detecting rotation. Be very careful here. Smooth rotations are generally not a good idea, but developers have found that snapping the user to a new rotation works well to reorient the user without making them sick. It turns out that the brain is very good at filling in interruptions in perception, so snapping to a new rotation or "blinking" the view during a large movement can be very effective at disrupting the perception of motion without distracting the user.
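
A snap-turn handler is only a few lines. This sketch assumes a hypothetical AMyVRPawn with a bSnapTurnArmed flag and made-up tuning values; the arming logic ensures one flick of the stick produces exactly one snap:

```cpp
// Member of a hypothetical AMyVRPawn; bSnapTurnArmed is a bool member
// initialized to true. Bind this to the turn axis of your controller.
void AMyVRPawn::SnapTurn(float AxisValue)
{
    const float SnapDegrees = 30.f; // rotation per step; tune to taste
    const float Deadzone = 0.5f;

    if (FMath::Abs(AxisValue) < Deadzone)
    {
        bSnapTurnArmed = true; // stick returned to center; allow the next snap
        return;
    }

    if (bSnapTurnArmed)
    {
        bSnapTurnArmed = false; // one flick of the stick, one snap
        AddActorWorldRotation(FRotator(0.f, FMath::Sign(AxisValue) * SnapDegrees, 0.f));
    }
}
```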

Many developers have also found that giving users a visible vehicle that moves with them, such as an aircraft cockpit, can mitigate the effect of vection when rotating. Whether this is an appropriate solution for you depends on the type of experience you're creating, but the takeaway here should be that users seem to be less prone to simulator sickness if they're given fixed points of reference in their view. Where this is appropriate, consider factoring it into your design, and where it isn't, consider other ways of breaking the optic flow, such as blinking or snapping, if you have to make large smooth movements.

Avoid stairs

If you're allowing your user to move smoothly through your environment, be aware that certain environment features can provoke simulator sickness when users navigate them. Stairs are especially bad, and stairs that provide collision for every step, so the view bounces as the user climbs them, are worse. Features like these, which create a sense of vertical movement when traversed, can be difficult because the inner ear is very sensitive to changes in altitude.

Avoid stairs if you can. If you can't avoid them, be conscious of how steep they are and how fast you're letting your user move over them. You'll have to test a bit to get it right.

Use dimmer lights and colors than you normally would

Be careful of using bright lights and strong contrasts in your scene. Bright lights contribute to simulator sickness in some users, and strong contrasts can increase the user's sense of vection as the world moves across their view. With current hardware, bright lights can also create flares on the headset's Fresnel lenses, which can pull users out of immersion by reminding them of the hardware they're wearing. In general, it's recommended that you use cooler shades and dimmer lights than you normally would.

Keep the scale of the world accurate

VR communicates the scale of objects in the world in ways that flat screens simply do not. Each of us sees the world in stereo vision through a pair of eyes that are a fixed distance apart. This distance, called Interpupillary Distance (IPD), contributes to our sense of how large or small objects in the world appear. Most VR headsets can be adjusted to match the interpupillary distance of their user and should be adjusted correctly to minimize eyestrain.

Note

The distance between the pupils of the user's eyes is called the interpupillary distance and is a major contributor to a user's sense of how large or small objects in the world are.

What this means for you as a developer is that the scale of objects in your world matters. On a flat screen, the user is limited to comparing the size of an object to another object to determine how large it is, but in VR, the user's IPD drives an absolute sense of scale. An object that's too large or too small on a flat screen will still appear normal if it's alone on the screen. The same object in VR, even if there's nothing to which it can be compared, will look wrong to a viewer in stereo 3D.

Some users may be prone to simulator sickness if the scale of the world feels wrong, and even those who aren't will likely still sense that the world is "wrong" without necessarily knowing why.

Make sure objects in your world are scaled correctly. In Unreal, by default, one Unreal Unit (UU) is equal to one centimeter.
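
It can help to keep a few real-world reference dimensions handy and to sanity-check props against them. The values and the helper below are our own illustrative choices, not engine API:

```cpp
#include "GameFramework/Actor.h"

// Handy real-world reference dimensions in Unreal Units (1 UU = 1 cm).
// The specific values are approximate, everyday references.
constexpr float AverageEyeHeightUU = 165.f; // ~1.65 m standing eye height
constexpr float DoorHeightUU       = 210.f; // a typical interior door
constexpr float TableHeightUU      = 75.f;  // a typical desk or table

// A hypothetical sanity check: flag props that are wildly off human scale.
bool IsPlausiblePropScale(const AActor& Prop)
{
    // GetExtent() returns half-dimensions, so 500 UU here means 10 m across.
    const FVector HalfExtent = Prop.GetComponentsBoundingBox().GetExtent();
    return HalfExtent.GetMax() < 500.f;
}
```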

Be conscious of physical actions

Your users in VR are moving around the real world wearing electric blindfolds. Respect this, and be careful what you ask them to do in VR. Take care when asking users to swing their arms, run, or strafe, as they can easily run into obstacles or walls in the real world. For headsets with cables, don't ask users to turn repeatedly in the same direction and tangle themselves in the cable. Be conscious as well of asking users to reach for objects on the floor or outside their normal reach area—this may not be easy or possible in their real-world physical environment. As mentioned earlier, avoid shifting the horizon in ways that could cause your user to lose balance. Remember that nearly all of the user's information about the world is coming from the VR simulation while they're in it—be conscious of how this information lines up with or contradicts what's in the invisible physical world around them.

Manage eyestrain

The eyes use muscles to focus on objects and to orient themselves, and these muscles, like any others, can become fatigued. We call this eyestrain. Symptoms of eyestrain can include headaches, fatigue, and blurred or double vision. As a designer, there are things you can do to minimize eyestrain in your users, and understanding a little about what causes it will help you do this.

First, eyestrain can be caused by flickering. We've already talked a lot about the importance of keeping latency low; this is another reason to make it a priority. Don't create deliberately flickering content either, as it can produce eyestrain and could also trigger photosensitive seizures.

Note

Flickering caused by high latency can cause eyestrain. Keep your latency low.

Second, the eyes need to do some physical work to focus on an object in 3D space. They have to adjust the shape of their lenses to focus on the object (this is called accommodation), and they need to aim themselves so their lines of sight converge at the object (this is called vergence). We naturally have a reflex that correlates these two actions: the eyes want to converge at a depth that matches the distance to which their lenses are focused, and the lenses want to focus at the distance where the eyes are converging. The problem in VR is that the actual images the eyes see are a fixed distance away, while the contents of those images exist at a variety of virtual depth planes, so the eyes still have to rotate to converge on the objects they're looking at. This creates a conflict, because the focal depth to which the lenses are accommodating doesn't match the depth at which the eyes are converging, and that conflict can cause eyestrain.

Note

Eyestrain can be caused by two factors in VR: flickering, which can be managed by keeping your latency low, and conflict between the fixed distance at which the eyes' lenses need to focus to see the headset screen and the changing distances at which the eyes need to converge to see objects in stereo depth. This is commonly called the vergence-accommodation conflict, and you can manage it by keeping important objects in the virtual world about 1 m away so the vergence and accommodation demands mostly line up.

You can manage this when designing your world by keeping these two demands in mind. The Fresnel lenses in the HMD make the headset screen appear to be about 1 m from the eyes, so the eyes' lenses accommodate to a focal plane about 1 m away. The user's eyes, then, will naturally find it easier to focus on objects in the virtual world that appear to be about that far away. In practice, objects are most easily viewed at a range of 0.75 m to 3.5 m, with 1 m seeming to be ideal. Avoid making users look for long periods at objects less than half a meter from the eye.

Note

Put objects you know your user will be fixating on for long periods at least half a meter away from the camera, and ideally around 1 m away, to minimize eyestrain.

Don't force your user to be an eyeball contortionist to view your user interface. Attaching a GUI to the user's face is usually a bad idea—as they turn their head to view a UI element, it appears to "run away" because it's attached to the same head that's turning to try to look at it, so users have to turn their eyeballs alone to focus on it. Don't do this to them. It's irritating to users, fatiguing, and has no real-world analogue. Put your UI in the world so your users can focus on it from comfortable viewing angles and at a comfortable distance. Attaching UI elements to the user's body, such as a wrist, can work well as it allows users to bring it into view when they want to interact with it. Putting GUI elements into a cockpit or vehicle can work well too. UI elements can be placed around the world and revealed when the user looks at them.
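
As one example of the wrist-mounted approach, the sketch below attaches a world-space UWidgetComponent to a motion controller in a hypothetical pawn's constructor. The offsets and sizes are guesses to tune, and MotionSource is the FName-based API used by recent UE4 versions:

```cpp
#include "Components/WidgetComponent.h"
#include "MotionControllerComponent.h"

// Constructor of a hypothetical AMyVRPawn; LeftController and WristMenu are
// assumed to be declared as UPROPERTY members in the class header.
AMyVRPawn::AMyVRPawn()
{
    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(RootComponent);
    LeftController->MotionSource = FName(TEXT("Left"));

    WristMenu = CreateDefaultSubobject<UWidgetComponent>(TEXT("WristMenu"));
    WristMenu->SetupAttachment(LeftController);
    WristMenu->SetWidgetSpace(EWidgetSpace::World);           // a real object in the scene
    WristMenu->SetDrawSize(FVector2D(400.f, 300.f));          // widget resolution in pixels
    WristMenu->SetRelativeLocation(FVector(0.f, 0.f, 10.f));  // ~10 cm above the wrist
    WristMenu->SetRelativeScale3D(FVector(0.05f));            // ~20 x 15 cm in the world
}
```

Because the widget lives in world space, the user can glance at it or bring it closer exactly as they would a physical wristwatch.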

If you do wind up attaching UI to the user's head, keep it within the ideal range we discussed and at an angle that allows it to be read without straining.

Try to avoid creating situations that force the user to change focal distance rapidly and often. If you're making a shooter, for example, that puts critical information on a nearby UI element while the enemies are in the distance, you may be creating a situation that will force your user to change focus frequently to check the UI and focus on enemies in the field. In a flat-screen game, this wouldn't be a problem, but in VR, it will tire them out. Design your UI in such a way that the user can get critical information without focusing on it—easy-to-read graphical elements, for example, or consider putting UI elements over the enemies' heads.

GUI elements can be occluded by objects in the world that are nearer to the camera than the UI element is. Don't try to use tricks from 2D gaming space to change this. In 2D game design, it's common to draw a UI element over a 3D element even if that element would really block the player's view of it. If you do this in VR, however, you'll create a confusing stereo image that won't be at all comfortable to look at. Accept the reality that your UI exists as a physical object in the world and follows the same rules as other physical objects.

Make conscious choices about the content and intensity of your experience

Presence, when it's achieved in VR, produces strong reactions. It's an intimate experience, a visceral experience, and sometimes a fear-inducing experience. Be conscious of what you're doing as you craft experiences; you can easily trigger a fight-or-flight response in some users. This might be exactly what you intend, and we're not suggesting that you shy away from whatever it is you're trying to create. But be aware that you can be playing with strong stuff here, and make intentional choices. VR is much more capable of triggering phobias than its flat-screen predecessors because the user is immersed in the space and isn't constantly reminded by their peripheral vision that what they're seeing isn't real. Be on the lookout for circumstances that can induce vertigo, claustrophobia, fear of the dark, fear of snakes or spiders, or other phobias. Remember also that users will react more strongly to threats within their personal space.

Note

For those of you deliberately playing with fear in VR, whether you're making horror experiences or therapeutic experiences to treat PTSD, there are meaningful distinctions between film and VR. The user is always present within the scene in VR, which isn't the case in film, and they have an instinctive sense of personal space that you can use to great effect; film doesn't have this either. In film, an object that's supposed to seem close is just big on the screen, but it's still whatever distance away from the user the screen actually is. In VR, this space is real. "It's right behind you" in VR really means that it's right behind you.

Let players manage their own session duration

VR puts demands on the user's body, eyes, and mind that other media don't. They're wearing a device on their head, and often standing or moving physically. Design your experience to let them exit whenever they want to or need to and resume later on. Let them take breaks as they need them.

Keep load times short

In contrast to games and applications on flat screens, users in VR can't do anything else while they wait for your application to load. Optimize to keep your load times short. Remember as well that, even during a load, your application must keep responding to the user's head tracking.
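
One way to keep the frame loop (and therefore head tracking) alive during loads is to stream levels in asynchronously rather than doing a blocking map load. A minimal sketch using UGameplayStatics::LoadStreamLevel, with a hypothetical game mode and callback:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Engine/LatentActionManager.h"

// Member of a hypothetical AMyVRGameMode. OnAreaStreamed must be a UFUNCTION
// on this class; the engine calls it when the level finishes loading.
void AMyVRGameMode::BeginStreamingNextArea(FName LevelName)
{
    FLatentActionInfo LatentInfo;
    LatentInfo.CallbackTarget = this;
    LatentInfo.ExecutionFunction = FName(TEXT("OnAreaStreamed"));
    LatentInfo.UUID = 1;
    LatentInfo.Linkage = 0;

    UGameplayStatics::LoadStreamLevel(this, LevelName,
                                      /*bMakeVisibleAfterLoad=*/true,
                                      /*bShouldBlockOnLoad=*/false, // never block the frame loop
                                      LatentInfo);
}
```

While the stream is in flight, you can keep the player in a simple holding environment so tracking and rendering never stop.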

Question everything we just told you

VR is in its infancy as a medium and an art form. It's far too early to pretend we know what its rules are really going to turn out to be. In the early days of film, actors were always filmed in full-frame, because the conventional wisdom at the time was that audiences wouldn't pay to see half an actor. Be equally willing to question the guidelines and advice you receive in VR design. These represent the current best understanding of what seems to work, but that doesn't mean that there aren't other ways to do things that haven't been tried. Be open to them. This is part of the reason why these guidelines were each presented with information about why they exist—so you can understand where they're coming from and make your own choices and try your own experiments. You're on a frontier in VR, part of the creation of an entirely new means of communication. Don't be afraid to explore.

Planning your VR project


We've talked quite a lot about VR in the abstract—what we can do with it, and what we think we know so far about how it works and what works well within it. From here on out, this book is going to get pretty practical and hands-on, and our hope is that, as we go through these projects and learn how to build VR experiences in Unreal, these principles we just talked about stay in your mind and guide your choices.

With that in mind, there's one last topic we should explore before we start getting our hands dirty, which is how to turn an idea into a thing you can actually make.

Clarify what you're trying to do

The first thing to do in developing a design is to decide what it's for. This may sound obvious, but all too often developers jump right into a project and start building without first stepping back to figure out what they're really trying to make and who it's for. The result, more often than not, is either an unfocused experience that doesn't really achieve what it was intended to do because its parts aren't all working together to support a common goal, or a project that takes far longer to complete than it should, as developers discover things that need to change and throw out existing work to make those changes. By taking some time to plan before you begin building software, you can save yourself a lot of effort and make it more likely that the project will succeed.


The first thing to remember in design is that the more you build, the more difficult and expensive changes get, so try to make these decisions as early in the process as you can. The cheapest prototype you can make is in your own mind. The second cheapest is on paper. Once you start building software, start with the bare minimum you need to get your project running—a gray box environment, or a simple prototype, and test it to see how it needs to change. You're almost guaranteed to discover a few things you hadn't anticipated, and this is the time to discover these things and change what you need. Once you've gone through this process, discovered what really works and what doesn't, and adjusted your design to respond to what you learned, now you're ready to begin putting expensive art and polish into the work. Too many developers work backward and try to make the final product right out of the gate, and they get locked into decisions that could have been changed more easily if the legwork had been done first.

With that in mind, the first thing to think about is who the project is for and why you're making it for them. Is this a game or an entertainment experience? What do you want the user to feel? What will they be doing while they play or participate in the experience? The same questions apply to cinematic VR—what's this experience about? What story are you trying to tell? Take a moment to write this down.

If you're making a learning experience, what does the user need to learn? What's the best way to teach it?

If you're making an architecture or design visualization application, what's important to your end user? An architect or engineer may want to be able to look inside walls and structures to see electrical and plumbing designs, while a real-estate buyer may care more about the quality of light in the space.

Figure out who your user is, and what's important to them, and clarify what you're trying to create and what's important to you. This should be done on paper. It's very easy for vague design elements to hide in your mental model, only to reveal holes or unexpected questions when you start to write them down.

Is it a good fit for VR? Why?

Pretty much the next thing you should do once you've clarified your design intention for a VR project is to think about how it fits in VR.

Think about your project in terms of what VR allows you to do. Does it rely on immersion and a strong sense of presence to work? Is it about making use of VR's ability to simulate the body or to give context to information? Why does your project work better in VR than it would on a flat screen? What can your user do or experience that they couldn't in traditional media?

Think about the challenges VR imposes as well. As we've seen in the best practices mentioned, VR imposes a different set of challenges than traditional media. Simulator sickness is a major concern—does your project require you to move the camera in ways that are going to be uncomfortable for your users? Does it rely on your users to move in a way that may be difficult or impossible in VR? Are you asking your users to read lots of small text that may not be legible on current headsets? Think about the practices we outlined, and evaluate whether any of them pose challenges to your design. This doesn't necessarily mean your design can't work in VR, but it does mean that you'll have to do some additional design thinking to work through those challenges.

Your choice to put your project into VR should be made deliberately. You should be able to describe why your project works better in VR than in traditional media, and how you plan to handle the challenges VR imposes. This too should happen in writing. You'll probably discover opportunities you hadn't seen, and a few challenges you hadn't realized you'd need to overcome. Writing these down will help you to understand what's important about your project and what needs to happen for it to succeed.

What's important – what has to exist in this project for it to work? (MVP)

Now that you've clarified who your project is for, what you intend it to do, and why it makes sense to do it in VR, you're ready to begin figuring out what it's really going to take to build it. It's helpful to figure this out in terms of a Minimum Viable Product (MVP). This, simply put, is a version of the product that contains only what's needed for it to satisfy its intention. An architectural visualization project, for example, needs to put the viewer into the building at the correct scale, and give the user some way to move around and see what it looks and feels like from different perspectives. What your MVP contains is your choice as a designer, but you should be clear about whether the thing you're talking about is a thing you need or a thing you want. If the project simply isn't worth doing if you can't get a given feature into it, then it's a needed feature and should go into your MVP. If it would improve the experience but users could still get what they needed without it, it's not part of the MVP.

Note

MVP refers to a version of the project that contains only what's needed to satisfy its goal and little or nothing else. Clarifying your MVP can help you to understand what the spine of your project is, which can tell you what to prioritize and give you a baseline from which to evaluate whether your project succeeds at achieving what it set out to do.

The contents of your MVP will differ greatly between different types of projects; the needs of a cinematic VR experience are substantially different from those of an engineering visualization application. As a designer, though, you should know what they are and write them down. You don't need to write a book or an essay here; a list of bullets should be enough. For each item on the list, ask yourself whether the project could still do what it's intended to do without it, and be clear about your answers. Wants, even strong wants, aren't needs. The point here is to know where your floor is.

Be on the lookout as well for things you missed. Imagine your user using your project: what are they trying to do, from moment to moment, from the moment they start the application until the moment they shut it down? Use this exercise to discover items you missed, figure out whether they're wants or needs, and get them onto the list if they belong there.

Break it down

If you've gone through the preceding exercises, you should have a clear idea of what your project is for, why it works in VR, and what needs to be in it for it to work. Now you're ready to figure out how to do it.

For the items in your MVP, what do you need to make them exist? Do you need a UI element to display information to the user? Do you need a way for your user to move around? Does the user need to be able to load or save information, or connect to a server?

For each item in your list, figure out what that item really requires you to build and write it down. You should come out of this exercise with a pretty clear breakdown of the things you need to do to get your project built.

Note

A breakdown is a list of things you need to do or build to get your project made. Use it as a tool to ensure that you haven't missed required elements or underestimated risks, and to see whether the project you're trying to build is realistically achievable with the time and resources you have. It's a tool for spotting problems early, while you still have a chance to fix them, and then later for tracking your progress as you build.

Look through this list—where are the big jobs, and where are the big risks? Can you achieve all of this with the time and resources you have? Do you need to re-evaluate your scope if it's starting to look too big? Bear in mind that it's almost always better to do fewer things well than to try to do everything and do a poor job of it. It's common, at this point, to discover that the project scope exceeds what you can realistically do well, and this is a good thing. The time to discover this is now while it's still on paper and you can reorganize the work, move items off the MVP, or change your schedule or resources. If you discover these problems near the start, you have a fighting chance of solving them, whereas if you discover them only once you're months-deep into software development, you may discover that you've painted yourself into a corner. Set yourself up for success on paper while you still have the flexibility to do so.

Tackle things in the right order

Some items in your breakdown will be easier to do than others, and some will be more fun. Use your judgement when you figure out the order in which you should do things. In general, it's a good idea to tackle the risky things first. If something's important enough to be in your MVP, and there's a risk that it could go long, or might not work at all, it's often smart to get it out of the way early. Doing this gives you time to iterate on a risky item while you work on other things, or in the worst-case scenario, if you discover that a thing you'd counted on just can't be done, you're still early enough in your project that you may be able to fall back to another plan. Don't leave high-risk, high-priority items to the end—you'll be in trouble if something goes wrong.

Look for dependencies between items. If one thing can't be done until another is finished (a character, for example, can't be animated until it's been built and rigged), make sure those dependencies factor into your plan. It does you no good to plan to do things in a certain order and then discover that you can't because something they depend on isn't ready.

Note

As you plan how you're going to get through your breakdown, look for items that involve risks or uncertainties, items that are just going to take a long time, and items that depend on other items. Factor these into your plan. In general, where you can, do your high-importance, high-risk work early in your project so you have time to handle things if something goes wrong.

A word about project management—there's a forest of literature out there about planning and tracking projects, and discussing them in depth falls outside the scope of this book. Broadly, these fall into two major schools of thought: waterfall and agile. Waterfall project management methods lay out tasks in a rigid order that assumes that, once one task is done, the next can begin. This works well when the things you're doing are well-defined and don't entail much risk, such as painting a house, but VR design and development rarely works out like this. You simply may not know whether a feature is done until you see it running alongside other systems, and you may have to loop back at that point and change something or rework it entirely. Agile methods, such as Scrum, take this reality into account, and are intended for design and development projects where things are going to need to be revisited as the project evolves and reveals new information. In general, agile methods work much better for software development than waterfall plans.

Depending on the scope of the project, you may not need to apply a formal project management method, but even if you're planning loosely, there should still be a plan, and you should make sure the plan accommodates the reality that you're going to have to loop back and iterate on features and design, that some items are going to depend on others, and that some things are going to take longer than you thought.

Test early and often

Test your designs as early as you can. VR especially is a very new medium, and people respond to it in very different ways. Test with as diverse a range of subjects as you can, as early as you can, so you can spot things you need to change while they're still relatively easy to change.

Remember as well that VR developers make terrible test subjects. We use VR far, far more than other users and tend to be much more comfortable with VR interfaces and much less prone to simulator sickness than typical users. Test VR with users who are new to it as well as with users who are comfortable with the medium.

Test with as diverse a population as you can. VR embodies the user in ways that previous media don't, and this can matter to your users. Hands that look fine to you may feel alien to a user whose hands look different. Make sure your test population isn't limited to just people like you.

Look for opportunities to test as early in the process as you can make them happen. Even long before you've reached your MVP, test elements such as locomotion systems that may need design iteration. Put users into a gray-box environment and have them navigate through it, and watch what they do and where they get stuck. The more testing you do, the better your project will get, and the earlier you test, the easier it is to act on what you've learned.

Design is iterative

Lots of people assume that finished products somehow spring fully-formed from the minds of genius designers or developers. It doesn't work this way. Anything worth making takes iteration to get there.

Prepare yourself now for the reality that the first iteration of your design isn't going to be everything you wanted it to be, and that's the point. The purpose of the first draft of anything is to show you what's really important about the thing you're building and how it needs to fit together. Plan for this. Design is a process, and time and iteration, more than anything else, are the key elements of that process.

This is why we so strongly advise designing on paper first and testing prototypes in software as early as you can. Each time you give yourself something tangible to respond to, you're going to discover something about it and probably discover a way to make it better. 

Summary


In this chapter, we looked at what VR is and some of the ways it can be used in the real world. We talked quite a bit about immersion and presence. Let's recap for a moment here.

Presence, we said, is a physiological sensation of being in a place, and is really the point of VR. We create VR to create presence. Immersion is the means by which presence is brought about, and involves taking over the user's senses completely enough that they can begin to believe the virtual world around them.

We discussed a number of currently-held best practices for creating good VR. The most important of these was the need to keep latency as low as possible and the need to be very careful of how you move the user's viewpoint. Simulator sickness is largely caused by conflicts between a visual sense of motion and the lack of motion felt by the inner ear. Breaking up movement and being aware of the types of movement most likely to trigger simulator sickness are important for keeping your users comfortable in your experience. We also talked about safety—the need to be conscious of the kinds of movement you're asking your users to perform, about designing to avoid eyestrain, and the need to be careful about triggering photosensitive seizures.

Finally, we outlined a process for planning a VR project and iterating on its design to make the best project you can and ensure that it succeeds at what you intended it to do.

In the next chapter, we're going to dive in and start getting our hands dirty with the Unreal Engine, and from here on out, the rest of this book will be hands-on. We hope that the ideas outlined in this chapter will stay with you as you develop, and help you to succeed, not just in making running VR applications, but in making them well.

With that out of the way, let's get to work. 
