What this book covers
Chapter 1, Getting Started with OpenGL, introduces the essential development tools required to create OpenGL-based data visualization applications and provides a step-by-step tutorial on setting up the environment for our first OpenGL demo application on Windows, Mac OS X, and Linux.
Chapter 2, OpenGL Primitives and 2D Data Visualization, focuses on the use of OpenGL 2.0 primitives, such as points, lines, and triangles, to enable the basic 2D visualization of data, including time series such as an electrocardiogram (ECG).
Chapter 3, Interactive 3D Data Visualization, builds upon the fundamental concepts discussed previously and extends the demos to incorporate more sophisticated OpenGL features for 3D rendering.
Chapter 4, Rendering 2D Images and Videos with Texture Mapping, introduces OpenGL techniques to visualize another important class of datasets—those involving images or videos. Such datasets are commonly encountered in many fields, including medical imaging applications.
Chapter 5, Rendering of Point Cloud Data for 3D Range-sensing Cameras, introduces the techniques used to visualize another interesting and emerging class of data—depth information from 3D range-sensing cameras.
Chapter 6, Rendering Stereoscopic 3D Models using OpenGL, demonstrates how to visualize data with stunning stereoscopic 3D technology using OpenGL. Since OpenGL does not provide any mechanism to load, save, or manipulate 3D models, we will integrate a library named Assimp into our code to support these operations.
Chapter 7, An Introduction to Real-time Graphics Rendering on a Mobile Platform using OpenGL ES 3.0, transitions to an increasingly powerful and ubiquitous computing platform, demonstrating how to set up the Android development environment and create our first Android-based application using OpenGL for Embedded Systems (OpenGL ES) on the latest mobile devices, from smartphones to tablets.
Chapter 8, Interactive Real-time Data Visualization on Mobile Devices, demonstrates how to visualize data interactively by using built-in motion sensors called Inertial Measurement Units (IMUs) and the multitouch interface found on mobile devices.
Chapter 9, Augmented Reality-based Visualization on Mobile or Wearable Platforms, introduces the fundamental building blocks required to create your first AR-based application on a commodity Android-based mobile device: OpenCV for computer vision, OpenGL for graphics rendering, and Android's sensor framework for interaction.