In Unity, user interfaces are built on a Canvas object and the event system, and include elements such as buttons, text, images, sliders, and input fields, which can be assembled and wired to objects in the scene.
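For instance, a minimal sketch of wiring a canvas button to a scene object through the event system might look like the following (the class and field names here, such as ButtonToggle and target, are illustrative rather than taken from the chapter):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative example: attach to a GameObject with a world space Canvas.
// Wires a Button (assigned in the Inspector) to toggle a target scene object.
public class ButtonToggle : MonoBehaviour
{
    [SerializeField] private Button toggleButton;  // a UI Button on the canvas
    [SerializeField] private GameObject target;    // any scene object to show or hide

    private void OnEnable()
    {
        // Register a click handler; the event system invokes it on click.
        toggleButton.onClick.AddListener(OnToggleClicked);
    }

    private void OnDisable()
    {
        toggleButton.onClick.RemoveListener(OnToggleClicked);
    }

    private void OnToggleClicked()
    {
        target.SetActive(!target.activeSelf);
    }
}
```

The same wiring can also be done without code, by adding the handler in the Button's OnClick list in the Inspector.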
In this chapter, we took a close look at various world space UI techniques and how they can be used in virtual reality projects. We considered ways in which UI for VR differs from UI for conventional video games and desktop applications. Also, we implemented over a half-dozen of them, demonstrating how each can be constructed, coded, and used in your own projects. Our C# scripting got a little more advanced, probing deeper into the Unity Engine API and modular coding techniques.
You now have a broader vocabulary for approaching UI in your VR projects. Some of the examples in this chapter may be directly applicable to your own work. However, not everything needs to be home-grown. VR UI tools...