Introduction
The purpose of this chapter is to introduce techniques for visualizing another interesting and emerging class of data: depth information from 3D range-sensing cameras. Devices with 3D depth sensors are hitting the market every day, and companies and products such as Intel, Microsoft, SoftKinetic, PMD, Structure Sensor, and Meta (wearable augmented reality eyeglasses) all rely on these novel 3D sensing devices to track user input, such as hand gestures for interaction, and to map the user's environment. One interesting way to integrate 3D sensors with OpenGL is to view a captured scene from different perspectives, enabling a virtual 3D fly-through of a scene recorded by the depth sensors. For data visualization, the ability to walk through a massive 3D dataset can be particularly powerful in scientific computing, urban planning, and many other applications that involve visualizing the 3D structure of a scene.
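To make the idea of a fly-through concrete, the sketch below (not the chapter's implementation) shows one common way a depth image becomes viewable from arbitrary perspectives: each pixel is back-projected into a 3D point with a pinhole camera model, and the resulting point cloud can then be uploaded to OpenGL and rendered with any view matrix. The intrinsic parameters (fx, fy, cx, cy) and the depth scale used here are illustrative placeholders; real sensors report their own calibration values.

// Minimal sketch, assuming a 16-bit depth image and known pinhole intrinsics.
#include <cstdint>
#include <vector>

struct Point3D { float x, y, z; };

std::vector<Point3D> depthToPointCloud(const std::vector<uint16_t>& depth,
                                       int width, int height,
                                       float fx, float fy, float cx, float cy,
                                       float depthScale /* meters per depth unit */)
{
    std::vector<Point3D> cloud;
    cloud.reserve(static_cast<size_t>(width) * height);
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            float z = depth[v * width + u] * depthScale;
            if (z <= 0.0f) continue;                 // skip invalid depth samples
            // Pinhole back-projection: pixel (u, v) at depth z -> camera-space point
            cloud.push_back({ (u - cx) * z / fx,
                              (v - cy) * z / fy,
                              z });
        }
    }
    return cloud;  // upload as a vertex buffer and render under a moving camera
}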
In this chapter, we propose a simplified...