Summary
In this chapter we learned how to track the skeletal data provided by the Kinect sensor and how to interpret it to design relevant user actions. With the example developed in this chapter, we went to the very core of designing and developing Natural User Interfaces.
Thanks to the KinectSensor.SkeletonStream.Enable() method and the event handler attached to KinectSensor.AllFramesReady, we started to manipulate the skeleton stream and the color stream data provided by the Kinect sensor and to overlay them.
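The setup described above can be sketched as follows. This is a minimal illustration against the Kinect for Windows SDK v1 API, not the chapter's exact code; error handling and the joint-to-color-space mapping are omitted.

```csharp
using System.Linq;
using Microsoft.Kinect;

// Pick the first connected sensor (assumes at least one is plugged in).
KinectSensor sensor = KinectSensor.KinectSensors
    .FirstOrDefault(s => s.Status == KinectStatus.Connected);

sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
sensor.SkeletonStream.Enable();

// AllFramesReady fires when the color, depth, and skeleton frames are all
// available, which makes it convenient for overlaying skeleton data on the
// color image.
sensor.AllFramesReady += (s, e) =>
{
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    using (ColorImageFrame colorFrame = e.OpenColorImageFrame())
    {
        if (skeletonFrame == null || colorFrame == null)
            return;

        Skeleton[] skeletons = new Skeleton[skeletonFrame.SkeletonArrayLength];
        skeletonFrame.CopySkeletonDataTo(skeletons);
        // Map each tracked joint to color-space coordinates and draw it here.
    }
};

sensor.Start();
```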
We addressed the SkeletonStream.TrackingMode property for tracking users in Default (standing) and Seated modes. Leveraging the Seated mode, together with the ability to track user actions, is particularly useful for applications oriented to people with disabilities.
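Switching tracking modes is a one-line change, assuming an already initialized KinectSensor instance named sensor:

```csharp
// Seated mode tracks only the ten upper-body joints (head, shoulders,
// elbows, wrists, hands), ignoring the lower body.
sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Seated;
```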
We went through the algorithmic approach for tracking users' actions and recognizing their gestures, and we developed our custom gesture manager. Gestures have been defined as a collection of movement sections for...
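The gesture-as-movement-sections idea can be sketched in a few lines. This is an illustrative recognizer, not the chapter's gesture manager: all type and member names here (GestureDetector, Update, the swipe example, and its threshold) are hypothetical, and each section is modeled as a predicate over the frame-to-frame displacement of a single joint.

```csharp
using System;
using System.Collections.Generic;

// A gesture is an ordered list of movement sections; each section is a
// predicate over the joint's displacement (dx, dy) between two frames.
public class GestureDetector
{
    private readonly List<Func<float, float, bool>> sections;
    private int current;           // index of the section expected next
    private float lastX, lastY;
    private bool hasLast;

    public GestureDetector(List<Func<float, float, bool>> sections)
    {
        this.sections = sections;
    }

    // Feed one joint position per frame; returns true when every section
    // has been matched in order, i.e. the gesture is recognized.
    public bool Update(float x, float y)
    {
        if (hasLast && sections[current](x - lastX, y - lastY))
            current++;
        lastX = x;
        lastY = y;
        hasLast = true;
        if (current == sections.Count)
        {
            current = 0;           // reset so the gesture can fire again
            return true;
        }
        return false;
    }
}

public static class Demo
{
    public static void Main()
    {
        // Example: a swipe-right built from three "moved right" sections,
        // each requiring at least 5 cm of rightward hand motion per frame
        // (threshold chosen arbitrarily for illustration).
        var swipeRight = new GestureDetector(new List<Func<float, float, bool>>
        {
            (dx, dy) => dx > 0.05f,
            (dx, dy) => dx > 0.05f,
            (dx, dy) => dx > 0.05f
        });

        float x = 0f;
        bool recognized = false;
        for (int frame = 0; frame < 4; frame++)
        {
            recognized = swipeRight.Update(x, 0f);
            x += 0.1f;             // hand moves steadily to the right
        }
        Console.WriteLine(recognized);
    }
}
```

In a real application, Update would be called from the AllFramesReady handler with the tracked joint's position each frame, and a production recognizer would also need timeouts and reset logic for partially matched gestures.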