In this chapter, we looked at how to use Leap Motion sensors and VR headsets in Ubuntu with ROS. We covered setting up the Leap Motion sensor in ROS and using a VR headset to visualize data in RViz. We then created a custom teleoperation node that lets us teleoperate a mobile robot in Gazebo using hand gestures recognized by the Leap Motion sensor, and we also examined how to visualize the Gazebo environment through the VR headset. Using these skills, you could manually control a real mobile robot with hand gestures alone, which is especially useful for robots that are not in the operator's direct vicinity.
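To illustrate the core idea behind such a teleoperation node, the sketch below maps a hand's palm orientation (as reported by the Leap Motion) to linear and angular velocities. The function name, thresholds, and axis conventions are assumptions for illustration, not the chapter's exact implementation; in a real node, the returned values would populate a `geometry_msgs/Twist` message published on `/cmd_vel`.

```python
def gesture_to_twist(pitch, roll, dead_zone=0.2, max_linear=0.5, max_angular=1.0):
    """Map palm pitch/roll (radians) to (linear, angular) velocity.

    Tilting the hand forward or back (pitch) drives the robot forward or
    backward; tilting it left or right (roll) turns it. A dead zone around
    the level position keeps the robot still when the hand is roughly flat.
    Outputs are clamped to the configured maximum speeds.
    """
    linear = 0.0 if abs(pitch) < dead_zone else max(-max_linear, min(max_linear, -pitch))
    angular = 0.0 if abs(roll) < dead_zone else max(-max_angular, min(max_angular, -roll))
    return linear, angular

# A roughly flat hand produces no motion:
print(gesture_to_twist(0.05, 0.0))  # (0.0, 0.0)
```

The dead zone is the important design choice here: without it, small involuntary hand tremors would translate into continuous drift of the robot.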
In the next chapter, we will demonstrate how to detect and track a face in ROS.