Introduction
In the previous chapter, we learned how to initialize sensors and configure their output in the way we want. In this chapter, we will introduce ways to read this data and show it to the user.
This chapter is not about advanced or high-level (middleware) outputs. For now, we are going to talk only about native OpenNI outputs: color, IR, and depth.
But before anything else, we need to become familiar with some of OpenNI's related classes.
VideoFrameRef object
Unlike the OpenNI 1.x era, where there were different classes (known as MetaData types) for reading frames of data from each sensor, here we have only one class for reading data from any sensor, called openni::VideoFrameRef. This mirrors what we saw in the previous chapter, where the single openni::VideoStream class gave us access to different sensors instead of one Generator class per sensor.
But let's forget about OpenNI 1.x; we are here to talk about OpenNI 2.x. In OpenNI 2.x, you don't need different classes to access frames; the only thing you need is the openni::VideoFrameRef class.
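To make this concrete, here is a minimal sketch of how a single openni::VideoFrameRef can receive a frame from a openni::VideoStream. It assumes a depth-capable device is connected and keeps error handling short for readability; a color or IR stream would be read the same way, only the pixel type would differ.

    #include <OpenNI.h>
    #include <cstdio>

    int main()
    {
        // Initialize OpenNI and open the first available device
        openni::OpenNI::initialize();

        openni::Device device;
        if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK)
        {
            printf("Device open failed:\n%s\n", openni::OpenNI::getExtendedError());
            return 1;
        }

        // Create and start a depth stream (SENSOR_COLOR or SENSOR_IR work the same way)
        openni::VideoStream depth;
        depth.create(device, openni::SENSOR_DEPTH);
        depth.start();

        // A single VideoFrameRef object receives frames from any stream type
        openni::VideoFrameRef frame;
        if (depth.readFrame(&frame) == openni::STATUS_OK)
        {
            // Access the raw pixel buffer; for a depth stream each pixel is a 16-bit DepthPixel
            const openni::DepthPixel* pixels =
                static_cast<const openni::DepthPixel*>(frame.getData());
            int centerIndex = (frame.getHeight() / 2) * frame.getWidth()
                              + frame.getWidth() / 2;
            printf("Frame %d: %dx%d, center depth = %d mm\n",
                   frame.getFrameIndex(), frame.getWidth(), frame.getHeight(),
                   pixels[centerIndex]);
        }

        // Release everything in reverse order of creation
        depth.stop();
        depth.destroy();
        device.close();
        openni::OpenNI::shutdown();
        return 0;
    }

Notice that the same openni::VideoFrameRef variable could be reused for frames from any other stream; only the interpretation of the buffer returned by getData() changes with the sensor type.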