In Unity, UIs built on a Canvas object and the Event System include buttons, text, images, sliders, and input fields, which can be assembled and wired to objects in the scene. At the start of the chapter, I reviewed some of the UI design principles to consider when building your own UIs for VR.
We took a close look at various world space UI techniques and how they can be used in VR projects. We considered the ways in which UI for VR differs from UI for conventional video games and desktop applications; notably, screen space UI is never used in VR. We implemented over a half-dozen of these world space techniques, learning how each can be constructed, coded, and used in our own projects. They range from passive HUD info panels and dynamic in-game displays to interactive control panels with buttons. We learned to use TextMeshPro (TMP) to make text information a truly graphical element. Our C# scripting also got a little more advanced, probing deeper into the Unity Engine API and modular coding techniques.
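As a reminder of the kind of world space UI scripting covered in the chapter, here is a minimal sketch of an in-game info panel that updates a TMP label each frame. The class and field names are illustrative assumptions, not code from the chapter; it assumes a Canvas set to World Space render mode with a TextMeshProUGUI child assigned in the Inspector.

```csharp
using UnityEngine;
using TMPro;

// Hypothetical example: a world space info panel that displays the
// player's position and turns to face the player (a common billboard
// pattern for world space UI in VR).
public class InfoPanel : MonoBehaviour
{
    [SerializeField] private TextMeshProUGUI label;  // assigned in the Inspector
    [SerializeField] private Transform player;       // camera rig or player head

    void Update()
    {
        // Update the TMP text as a dynamic in-game display.
        label.text = "Position: " + player.position.ToString("F1");

        // Orient the panel so its front faces the player.
        transform.rotation =
            Quaternion.LookRotation(transform.position - player.position);
    }
}
```

Attaching a script like this to the Canvas object, then dragging the TMP text and the camera rig into its serialized fields, wires the panel up without any per-frame lookup cost.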