Pointing and clicking with VR components
As we have seen, Unity provides UI elements such as canvas text, buttons, and other controls that are specially tuned for conventional screen-space UI and mobile apps, but using them in World Space and tying them together with VR user input can get pretty involved. World-space interactions require physics, colliders, and raycasts to detect interaction events.
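To make this concrete, here is a minimal sketch of the kind of work involved: casting a ray from a hand controller and forwarding a click event to whichever collider-backed UI element it hits. The `SimplePointer` class name, the `controller` field, and the use of the `"Fire1"` button are illustrative assumptions, not part of any particular SDK:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Illustrative sketch only: a bare-bones world-space pointer.
// Assumes the target UI elements have colliders attached.
public class SimplePointer : MonoBehaviour
{
    public Transform controller;    // assumed: the tracked controller's transform
    public float maxDistance = 10f;

    void Update()
    {
        RaycastHit hit;
        // Cast a ray forward from the controller into the scene
        if (Physics.Raycast(controller.position, controller.forward,
                            out hit, maxDistance))
        {
            // On button press, send a pointer-click event to the hit object
            if (Input.GetButtonDown("Fire1"))
            {
                ExecuteEvents.Execute(hit.collider.gameObject,
                    new PointerEventData(EventSystem.current),
                    ExecuteEvents.pointerClickHandler);
            }
        }
    }
}
```

Even this stripped-down version has to bridge physics raycasting and the EventSystem by hand, which is exactly the plumbing the device toolkits described next take care of for you.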
Fortunately, VR device-specific toolkits may provide components that take care of some of this work already. As we saw in previous chapters, device manufacturers provide toolkits built atop their Unity SDK with convenient scripts, prefabs, and demo scenes that illustrate how to use them.
In this case, we're looking for components that let you design scenes using Unity UI elements on a canvas, take advantage of all their EventSystem interactivity goodness, use world-space 3D models, and drive input with controllers or laser pointers. For example, consider these:
- Oculus Rift and GearVR: OVRInputModule...