Working with perception using MoveIt! and Gazebo


Until now, we have worked only with the arm in MoveIt!. In this section, we will see how to interface 3D vision sensor data with MoveIt!. The sensor can either be simulated using Gazebo, or you can interface a real RGB-D sensor, such as the Kinect or Xtion Pro, directly using the openni_launch package. Here, we will work with the Gazebo simulation. We will add the sensor to MoveIt! for vision-assisted pick-and-place, and we will create a grasp table and a grasp object in Gazebo for the pick-and-place operation using two custom models called Grasp_Object and Grasp_Table. The sample models are placed in the model directory of the seven_dof_arm_test package and should be copied to the ~/.gazebo/models folder so that Gazebo can find them; for example:
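A minimal copy step might look like the following, assuming each model sits in its own folder named after it inside the package's model directory (adjust the paths to match your copy of the package):

$ roscd seven_dof_arm_test
$ cp -r model/Grasp_Object model/Grasp_Table ~/.gazebo/models/

Gazebo looks for local models in ~/.gazebo/models (in addition to any directories listed in GAZEBO_MODEL_PATH), so after copying, the two models can be loaded into the simulated world.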
The following command will launch the robot arm and the Asus Xtion Pro simulation in Gazebo:
$ roslaunch seven_dof_arm_gazebo seven_dof_arm_bringup_grasping.launch

This command will open up Gazebo with arm joint...

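Beyond launching the simulation, MoveIt! also has to be told where the point cloud data comes from. In ROS 1, MoveIt! usually does this through its occupancy map monitor, which loads a point cloud updater plugin configured in a small YAML file (commonly named something like sensors_3d.yaml inside a MoveIt! configuration package). The file name and topic below are illustrative assumptions rather than the exact ones used by this chapter's packages; a minimal sketch could look like this:

# Hypothetical sensors_3d.yaml: tells MoveIt!'s occupancy map monitor
# which updater plugin to load and which point cloud topic to subscribe to.
sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    point_cloud_topic: /rgbd_camera/depth/points  # assumed topic of the simulated Xtion
    max_range: 5.0                                # ignore points farther than 5 m
    point_subsample: 1                            # use every point (no subsampling)
    padding_offset: 0.1                           # padding used to self-filter the robot body
    padding_scale: 1.0
    filtered_cloud_topic: filtered_cloud          # self-filtered cloud, useful for debugging

Such a file is typically loaded by the MoveIt! sensor manager launch file along with an octomap frame and resolution; once it is in place, the planning scene maintains an octomap built from the sensor data, and generated motion plans avoid obstacles seen by the camera.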