Working with perception using MoveIt! and Gazebo
Until now, we have worked in MoveIt! with the arm alone. In this section, we will see how to interface 3D vision sensor data to MoveIt!. The sensor can either be simulated using Gazebo, or you can directly interface a real RGB-D sensor, such as the Kinect or Xtion Pro, using the openni_launch package. Here, we will work with the Gazebo simulation. We will add the sensor to MoveIt! for vision-assisted pick-and-place, and create a grasp table and a grasp object in Gazebo for the pick-and-place operation using two custom models called Grasp_Object and Grasp_Table. The sample models are placed in the model directory of the seven_dof_arm_test package, and should be copied to the ~/.gazebo/models folder so that Gazebo can access them, as shown in the commands below.
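One minimal way to perform that copy, assuming the package lives in a sourced catkin workspace (the directory name model/ is taken from the description above; adjust the paths if your layout differs), is:

$ mkdir -p ~/.gazebo/models
$ roscd seven_dof_arm_test
$ cp -r model/* ~/.gazebo/models/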
The following command will launch the robot arm and the simulated Asus Xtion Pro in Gazebo:
$ roslaunch seven_dof_arm_gazebo seven_dof_arm_bringup_grasping.launch
This command will open up Gazebo with arm joint...
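Once the simulation is up, MoveIt! has to be told where the depth data comes from before it can build an octomap of the scene for vision-assisted picking. In a ROS 1 MoveIt! setup, this is typically done with a 3D sensor YAML file loaded by the move_group launch files. The following is a minimal sketch of such a file; the point_cloud_topic value is an assumption about this simulation's camera topic and must match whatever the Gazebo depth camera plugin actually publishes:

sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    point_cloud_topic: /rgbd_camera/depth/points  # assumed topic; verify with rostopic list
    max_range: 5.0          # ignore points farther than 5 m
    point_subsample: 1      # use every point (increase to thin out the cloud)
    padding_offset: 0.1     # padding around the robot for self-filtering (m)
    padding_scale: 1.0
    filtered_cloud_topic: filtered_cloud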