
How-To Tutorials - Robotics

35 Articles

Upgrading the interface

Packt · 06 Feb 2015 · 4 min read
In this article by Marco Schwartz and Oliver Manickum, authors of the book Programming Arduino with LabVIEW, we will see how to design an interface using LabVIEW.

At this stage, we know that our two sensors are working and that they are interfaced correctly with the LabVIEW interface. However, we can do better; for now, we simply have a text display of the measurements, which is not elegant to read. Also, the light-level measurement goes from 0 to 5, which doesn't mean anything to somebody looking at the interface for the first time. Therefore, we will modify the interface slightly: we will add a temperature gauge to display the data coming from the temperature sensor, and we will change the output of the photocell reading to display the measurement from 0 (no light) to 100 percent (maximum brightness).

We first need to place the different display elements. To do this, perform the following steps:

1. Start with Front Panel. You can use a temperature gauge for the temperature and a simple slider indicator for Light Level. You will find both in the Indicators submenu of LabVIEW.
2. Place them on the right-hand side of the interface and delete the indicators we used earlier.
3. Name the new indicators accordingly so that we know which element we have to connect to later.

Then, it is time to go back to Block Diagram to connect the new elements we just added in Front Panel. For the temperature element, it is easy: you can simply connect the temperature gauge to the TMP36 output pin. For the light level, we will make slightly more complicated changes: we will divide the value measured by the Analog Read element by 5, thus obtaining an output value between 0 and 1, and then multiply this value by 100 to end up with a value going from 0 to 100 percent of the ambient light level. To do so, perform the following steps:

1. Place two elements corresponding to the two mathematical operations we want to perform: a Divide operator and a Multiply operator. You can find both of them in the Functions panel of LabVIEW. Simply place them close to the Analog Read element in your program.
2. Right-click on one of the inputs of each operator element, and go to Create | Constant to create a constant input for each block. Add a value of 5 for the Divide block, and a value of 100 for the Multiply block.
3. Connect the output of the Analog Read element to the input of the Divide block, the output of this block to the input of the Multiply block, and the output of the Multiply block to the input of the Light Level indicator.

You can now go back to Front Panel to see the new interface in action. Run the program again by clicking on the little arrow on the toolbar. You should immediately see that Temperature is now indicated by the gauge on the right, and that Light Level changes instantly on the slider, depending on how much you cover the sensor with your hand.
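The two math blocks implement a very simple conversion. Expressed in Python rather than in LabVIEW blocks (an illustrative sketch, not part of the original project), the scaling is:

    def light_level_percent(reading):
        # The Analog Read element outputs 0-5; divide by 5, then multiply
        # by 100 to get 0 (no light) to 100 (maximum brightness)
        return (reading / 5.0) * 100.0

    print(light_level_percent(2.5))  # half-covered sensor: 50.0 percent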
Summary

In this article, we connected a temperature sensor and a light-level sensor to Arduino and built a simple LabVIEW program to read data from these sensors. Then, we built a nice graphical interface to visualize the data coming from them.

There are many ways you can build other projects based on what you learned in this article. For example, you can connect more temperature and/or light-level sensors to the Arduino board and display these measurements in the interface. You can also connect other kinds of analog sensors that are supported by LabVIEW; adding a barometric pressure sensor or a humidity sensor would turn the project into an even more complete weather-measurement station. Another interesting extension would be to use the storage and plotting capabilities of LabVIEW to dynamically plot the history of the measured data inside the LabVIEW interface.

Resources for Article:

Further resources on this subject:

- The Arduino Mobile Robot [article]
- Using the Leap Motion Controller with Arduino [article]
- Avoiding Obstacles Using Sensors [article]


Building robots that can walk

Packt · 19 Dec 2014 · 15 min read
In this article, by Richard Grimmett, author of Mastering BeagleBone Robotics, you'll build a quadruped, that is, a robot with four legs. You'll be using 12 servos so that each leg has three points that can move, or three Degrees of Freedom (DOF). Because you'll control 12 servos at the same time, it makes sense to use an external servo controller that can supply the control signals and the supply voltage for all 12 servos.

Since servos are the main component of this project, it is perhaps useful to go through a short tutorial on servos and how to control them.

Working of servomotors

Servomotors are somewhat similar to DC motors; however, there is an important difference. While DC motors are generally designed to move in a continuous way, rotating 360 degrees at a given speed, servos are generally designed to move within a limited set of angles. In other words, in the case of a DC motor, you would generally want your motor to spin with a continuous rotation speed that you control. In the case of a servomotor, you would want your motor to move to a specific position that you control.

This is done by sending a Pulse-Width-Modulated (PWM) signal to the control connector of the servo. PWM simply means that you are going to change the length of each pulse of electrical energy in order to control something. In this case, the length of the pulse will control the angle of the servo, as shown in the following diagram. These pulses are sent out with a repetition rate of 60 Hz. You can position the servo to any angle by setting the correct control pulse.
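To make the pulse/angle relationship concrete, here is a small Python sketch (an illustration, not from the book). Typical hobby servos map roughly 1 to 2 milliseconds of pulse width onto their full travel, but the exact endpoints vary by model, so treat these numbers as assumptions to check against your servo's datasheet.

    def angle_to_pulse_us(angle):
        # Assumed convention: ~1000 us = 0 degrees, ~2000 us = 180 degrees,
        # with pulses repeated at the 50-60 Hz rate described above
        return 1000 + (angle / 180.0) * 1000

    print(angle_to_pulse_us(90))  # 1500.0 us, the center position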
Building the quadruped platform

You'll first need some parts so that you can build your quadruped robot. There are several kit possibilities out there, including the one available at www.trossenrobotics.com/p/PhantomX-AX-12-Quadruped.aspx. However, such kits can be expensive, so for this example, you'll create your own kit using a set of Lynxmotion parts. These are available from several online retailers, such as robotshop.com. To build this quadruped, you'll need four legs, and each leg requires two Lynxmotion parts. Here are the parts with their RobotShop part numbers:

Quantity  Description
1         Lynxmotion symmetric quadruped body kit: Mini QBK-02
2         Lynxmotion 3'' aluminum femur pair
2         Lynxmotion Robot Leg "A" pair (no servo): RL-01
4         Lynxmotion aluminum multi-purpose servo bracket, two pack: ASB-04
2         Ball bearing with flange, 3 mm ID (pair): RB-Lyn-317

The last part, a bearing, is what you'll use to connect each leg to the body. You'll also need 12 standard-size servos. There are several possible choices, but I personally like the Hitec servos; they are inexpensive, and you can get them from most hobby shops and online electronics retailers. Now, a word on the servo model: servos come in different model numbers, primarily based on the amount of torque they can generate. Torque is the force that the servo can exert to move the part connected to it. In this case, your servos will need to lift and move the weight associated with your quadruped, so you'll need servos with enough torque to do this.

I suggest that you use eight Hitec model HS-485HB servos for the servos attached to the end of each leg and for the body, and four Hitec model HS-645MG servos for the middle of each leg, the position that requires the highest amount of torque. You can use 12 of the HS-645MG servos instead, but they are more expensive than the HS-485 servos, so using two different models keeps the cost down.

Here are the steps required to assemble the quadruped:

1. Put the two parts of the lower right leg together and insert the servo with the servo mounting screws. It should look like this:
2. Now connect this assembly to the interconnect part, like this:
3. Complete the leg by connecting two of the servo brackets together at right angles, mounting the HS-645MG on one of the brackets, and then connecting this servo to the interconnect piece, like this:
4. Put another right leg together.
5. Now put two left legs together following the same steps, but in the left leg configuration. They look like this:
6. The next step is to build the body kit. There are some instructions at www.lynxmotion.com/images/html/sq3u-assembly.htm, but it should look like the following image:
7. Connect each leg to the body kit. First connect the empty servo bracket to the body using the bearing, as shown in the following image:
8. Now connect the other servo to the empty servo bracket and the body, like this:

After performing all the preceding steps, your quadruped should look like this:

Now that you have the basic hardware assembled, you can turn your attention to the electronics.

Using a servo controller to control the servos

To make your quadruped walk, you will first need to connect the servomotor controller to the servos. The servo controller you are going to use for this project is a simple USB servomotor controller from Pololu (Pololu item number 1354, available at pololu.com) that can control 18 servomotors. Here is an image of the unit:

Make sure that you order the assembled version. This piece of hardware will turn USB commands from the BeagleBone Black into signals that control your servomotors. Pololu makes a number of different versions of this controller, each able to control a certain number of servos. In this case, you may want to choose the 18-servo version so that you can control all 12 servos with one controller and still add an additional servo to control the direction of a camera or sensor. You could also choose the 12-servo version. One advantage of the 18-servo controller is the ease of connecting power to the unit via screw-type connectors.

There are two connections you'll need to make to the servo controller to get started: the first is to the servomotors and the second is to a battery. First, connect the servos to the controller. In order to be consistent, let's connect your 12 servos to the connections marked 0 through 11 on the controller, using the configuration shown in the following table:

Servo connector  Servo
0                Right front lower leg
1                Right front middle leg
2                Right front upper leg
3                Right rear lower leg
4                Right rear middle leg
5                Right rear upper leg
6                Left front lower leg
7                Left front middle leg
8                Left front upper leg
9                Left rear lower leg
10               Left rear middle leg
11               Left rear upper leg

Here is an image of the back of the controller; it will tell us where to connect our servos:

Now you need to connect the servomotor controller to your battery. For this project, you can use a 2S RC LiPo battery. The 2S means that the battery has two cells, with a nominal output voltage of 7.4 volts. It will supply the voltage and current needed by your servos, which can be of the order of 2 amperes.
Here is an image of the battery:

The battery will come with two connectors: one with large-gauge wires for normal usage and a smaller connector used to connect to the battery recharger. You'll want to build connectors that can attach to the screw-type connectors of the servo controller. I purchased some XT60 connector pairs, soldered some wires to the mating connector of the battery, and screwed these into the servo controller. Your system is now functional.

Now you'll connect the motor controller to your personal computer to check that you can communicate with it. To do this, connect a mini USB cable between the servo controller and your personal computer.

Communicating with the servo controller via a PC

Now that the hardware is connected, you can use some software provided by Pololu to control the servos. Let's do this using your personal computer. First, download the Pololu software from www.pololu.com/docs/0J40/3.a and install it according to the instructions on the website. Once it is installed, run the software, and you should see something like the following screenshot:

You will first need to change the configuration of the serial settings, so select the Serial Settings tab, and you should see this:

Make sure that USB Chained is selected; this will allow you to communicate with and control the motor controller over USB. Now go back to the main screen by selecting the Status tab; you can now actually turn on the 12 servos. The screen should look like this screenshot:

Now you can use the sliders to control the servos. Check that servo 0 moves the right front lower servo, servo 1 moves the right front middle servo, servo 2 moves the right front upper servo, and so on.

You can also use this screen to center the servos. Set all the servos so that each slider is in the middle, then unscrew the servo horn on each servo and reattach it so that the servo is centered at this location. At the zero-degree location of all servos, your quadruped should look like this:

Your quadruped is now ready to actually do something. You'll now need to send the servos the electronic signals they need to move your quadruped.

Connecting the servo controller to the BeagleBone Black

You've checked the servomotor controller and the servos. You'll now connect the motor controller to the BeagleBone Black and make sure you can control the servos from it. Remove the USB cable from the PC and connect it to the BeagleBone Black. The entire system will look like this:

Let's now talk to the motor controller by downloading the Linux code from Pololu at www.pololu.com/docs/0J40/3.b. Perhaps the best way is to log in to your BeagleBone Black using PuTTY and perform the following steps:

1. Type wget http://www.pololu.com/file/download/maestro-linux-100507.tar.gz?file_id=0J315.
2. Rename the file by typing mv maestro-linux-100507.tar.gz?file_id=0J315 maestro-linux-100507.tar.gz.
3. Unpack the file by typing tar -xzvf maestro-linux-100507.tar.gz. This will create a directory called maestro_linux.
4. Go to that directory by typing cd maestro_linux, and then type ls. You should see something like this:

The README.txt document will give you explicit instructions on how to install the software. Unfortunately, you can't run MaestroControlCenter on your BeagleBone Black, as its windowing system doesn't support the graphics, but you can control your servos using the UscCmd command-line application to ensure that they are connected and working correctly. First, type ./UscCmd --list and you should see something like the following screenshot:

The unit sees our servo controller.
By just typing ./UscCmd, you can see all the commands that you can send to your controller, as shown in the following screenshot:

Note that although you can send a servo to a specific target, the target is not in angle values, which makes it a bit difficult to know where you are sending your servo. Try typing ./UscCmd --servo 0,10; the servo will move to its maximum angle position. Type ./UscCmd --servo 0,0 and it will stop the servo from trying to move. In the next section, you'll write some Python code that translates your angles into the commands the servo controller needs to move the servos to specific angle locations.

If you didn't run the Windows version of Maestro Control Center and set the serial settings to USB Chained, your motor controller might not respond. Rerun the Maestro Control Center and set the serial settings to USB Chained.

Creating a program on Linux to control your quadruped

You now know that you can talk to your servomotor controller and move your servos. In this section, you'll create a Python program that will let you move your servos to specific angles. Let's start with a simple program that will make your legged mobile robot's servos go to 90 degrees (the middle of the 0 to 180 degrees you can set). This particular controller expects bytes of information, so the code translates the channel and angle inputs into numbers that the controller can understand; for more details, see http://www.pololu.com/docs/0J40. Here is the code to move all the connected servos to the 90-degree point, reconstructed from the line-by-line explanation that follows (the byte format in setAngle assumes the Maestro's compact Set Target protocol, with targets in quarter-microseconds):

    #!/usr/bin/python
    import serial

    def setAngle(ser, channel, angle):
        # Translate the channel and angle into the byte command the
        # controller needs; 4000-8000 quarter-microseconds spans 0-180 degrees
        target = 4000 + int((angle / 180.0) * 4000)
        ser.write(chr(0x84) + chr(channel) + chr(target & 0x7F) + chr((target >> 7) & 0x7F))

    ser = serial.Serial("/dev/ttyACM0", 9600)
    for i in range(0, 16):
        setAngle(ser, i, 90)

Here is an explanation of the code:

- #!/usr/bin/python: This first line allows you to make this Python file executable from the command line.
- import serial: This line imports the serial library, which you need to talk to your unit via USB.
- def setAngle(ser, channel, angle): This function converts your desired servo and angle settings into the serial command that the servomotor controller needs.
- ser = serial.Serial("/dev/ttyACM0", 9600): This opens the serial port connection to your servo controller.
- for i in range(0, 16): This covers all 16 servo possibilities (channels 0 through 15).
- setAngle(ser, i, 90): This sets each servo to the middle (home) position of 90 degrees.

If the legs of your robot aren't in their middle position at this point, you can adjust them by adjusting the position of the servo horns on each servo. To access the serial port, you'll need to make sure that you have the Python serial library; if you don't, type sudo apt-get install python-serial. After you have installed the serial library, you can run your program by typing sudo python quad.py.

Once you have the basic home position set, you can ask your robot to do some things. Let's start by making your quadruped wave an arm; the Python code for this, and for a first walking gait, is sketched after this section. For the wave, you use the setAngle command to manipulate your robot's front-right arm: the middle servo raises the arm, and the lower servo then goes back and forth between angles 100 and 130. One of the most basic actions you'll want your robot to do is to walk forward. The walking program lifts and moves each leg forward one at a time, and then moves all the legs to the home position, which moves the robot forward. It is not the most elegant motion, but it does work.
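Here is what those two motions might look like. This listing is an illustrative sketch rather than the book's original: the channel numbers follow the wiring table given earlier, but the specific angles and delays are assumptions that you would tune on your own hardware.

    #!/usr/bin/python
    # wave_and_walk.py - sketches of the wave and forward-walk motions
    import serial
    import time

    def setAngle(ser, channel, angle):
        # Same helper as in quad.py (assumes the Maestro compact Set Target
        # protocol, with targets in quarter-microseconds)
        target = 4000 + int((angle / 180.0) * 4000)
        ser.write(chr(0x84) + chr(channel) + chr(target & 0x7F) + chr((target >> 7) & 0x7F))

    def wave(ser):
        setAngle(ser, 1, 130)           # right front middle servo raises the arm
        for _ in range(4):
            setAngle(ser, 0, 100)       # right front lower servo swings back...
            time.sleep(0.5)
            setAngle(ser, 0, 130)       # ...and forth between 100 and 130
            time.sleep(0.5)
        setAngle(ser, 1, 90)            # lower the arm back to home

    def walk_forward(ser):
        # (lower, middle, upper) channels for each leg, from the wiring table
        legs = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (9, 10, 11)]
        for lower, middle, upper in legs:
            setAngle(ser, middle, 60)   # lift the leg
            setAngle(ser, upper, 110)   # swing it forward at the hip
            time.sleep(0.3)
            setAngle(ser, middle, 90)   # set the leg back down
            time.sleep(0.3)
        for channel in range(12):
            setAngle(ser, channel, 90)  # home position pulls the body forward

    ser = serial.Serial("/dev/ttyACM0", 9600)
    wave(ser)
    walk_forward(ser)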
There are more sophisticated algorithms to make your quadruped walk, as shown at http://letsmakerobots.com/node/35354 and https://www.youtube.com/watch?v=jWP3RnYa_tw. Once you have the program working, you'll want to package all of your hardware onto the mobile robot. You can make your robot do many amazing things: walk forward, walk backward, dance, turn around; any kind of movement is possible. The best way to learn is to try new and different positions with the servos.

Issuing voice commands to your quadruped

You should now have a mobile platform that you can program to move in any number of ways. Unfortunately, you still have your LAN cable connected, so the platform isn't completely mobile. And once you have begun the program, you can't alter its behavior. You'll need to modify your voice recognition program so that it can run your Python program when it gets a voice command.

You have to make a simple modification to the continuous.c program in /home/ubuntu/pocketsphinx-0.8/src/programs. To do this, type cd /home/ubuntu/pocketsphinx-0.8/src/programs, and then type emacs continuous.c. The changes will occur in the same section as your other voice commands and will look like this:

    else if (strcmp(word, "FORWARD") == 0) {
        system("/home/ubuntu/maestro_linux/robot.py");
    }

The additions are pretty straightforward. Let's walk through them:

- else if (strcmp(word, "FORWARD") == 0): This checks the word as recognized by your voice command program. If it corresponds to the word FORWARD, it will execute everything inside the braces. We use { } to tell the system which commands go with this else if clause.
- system("/home/ubuntu/maestro_linux/robot.py"): This is the program we will execute. In this case, our mobile platform will do whatever the robot.py program tells it to do.

After doing this, you will need to recompile the program, so type make, and the pocketsphinx_continuous executable will be created. Run the program by typing ./pocketsphinx_continuous. Don't forget the ./ at the start of this command, or you'll run a different version of the program. When the program is running, you can disconnect the LAN cable, and the mobile platform will take the FORWARD voice command and execute your program.

Summary

You now have a robot that can walk! You can also add other sensors, like the ones you discovered for your tracked robot (sensors that can watch for barriers), or even a webcam.

Resources for Article:

Further resources on this subject:

- Protecting GPG Keys in BeagleBone [Article]
- Home Security by BeagleBone [Article]
- Introducing BeagleBoard [Article]


LeJOS – Unleashing EV3

Packt · 29 Oct 2014 · 7 min read
In this article by Abid H. Mujtaba, author of Lego Mindstorms EV3 Essentials, we'll have a look at a powerful framework designed to grant an extraordinary degree of control over EV3, namely LeJOS.

Classic programming on EV3

LeJOS is what happens when robot and software enthusiasts set out to hack a robotics kit. Although Lego initially intended the Mindstorms series to be primarily targeted at children, it was taken up with gleeful enthusiasm by adults. The visual programming language, which was meant to be used both on the brick and on computers, was likewise designed with children in mind; although very powerful, it has a number of limitations and shortcomings. Enthusiasts have continually been on the lookout for ways to program Mindstorms using traditional programming languages. As a result, a number of development kits have been created by enthusiasts to allow EV3 to be programmed in a traditional fashion, by writing and compiling code in traditional languages. A development kit for EV3 consists of the following:

- A traditional programming language (C, C++, Java, and so on)
- Firmware for the brick (basically, a new OS)
- An API in the chosen programming language, giving access to the robot's inputs and outputs
- A compiler that compiles code on a traditional computer to produce executable code for the brick
- Optionally, an Integrated Development Environment (IDE) to consolidate and simplify the process of developing for the brick

The release of each robot in the Mindstorms series has been associated with a consolidated effort by the open source community to hack the brick and make available a number of frameworks for programming robots using traditional programming languages. Some of the common frameworks available for Mindstorms are GNAT GPL (Ada), ROBOTC, Next Byte Code (NBC, an assembly language), Not Quite C (NQC), LeJOS, and many others. This variety of frameworks is particularly useful for Linux users, not only because they love having the ability to program in their language of choice, but also because the visual programming suite for EV3 does not run on Linux at all. In its absence, these frameworks are essential for anyone who is looking to create programs of significant complexity for EV3.

LeJOS – introduction

LeJOS is a development kit for Mindstorms robots based on the Java programming language. There is no official pronunciation, with people using lay-joss, le-J-OS (claiming it is French for "the Java Operating System", a camp that includes myself), or lay-hoss if you prefer the Latin-American touch. After considerable success with NXT, LeJOS was the first (and in my opinion, the most complete) framework released for EV3. This is a testament both to the prowess of the developers working on LeJOS and to the fact that Lego built EV3 to be extremely hackable, by running Linux under its hood and making its source publicly available. Within weeks, LeJOS had been ported to EV3, and you could program robots using Java.

LeJOS works by installing its own OS (operating system) on the EV3's SD card as alternate firmware. Before EV3, this would have involved slightly difficult and dangerous tinkering with the brick itself, but one of the first things that EV3 does on booting up is to check for a bootable partition on the SD card. If one is found, the OS/firmware is loaded from the SD card instead of being loaded internally.
Thus, in order to run LeJOS, you only need a suitably prepared SD card inserted into EV3, and it will take over the brick. When you want to return to the default firmware, simply remove the SD card before starting EV3. It's that simple! Lego wasn't kidding about the hackability of EV3.

The firmware for LeJOS basically runs a Java Virtual Machine (JVM) inside EV3, which allows it to execute compiled Java code. Along with the JVM, LeJOS installs an API library defining methods that can be used programmatically to access the inputs and outputs attached to the brick. These API methods are used to control the various components of the robot. The LeJOS project also releases tools that can be installed on all modern computers. These tools are used to compile programs that are then transferred to EV3 and executed. They can be imported into any IDE that supports Java (Eclipse, NetBeans, IntelliJ, Android Studio, and so on) or used with a plain text editor combined with Ant or Gradle. Thus, LeJOS qualifies as a complete development kit for EV3.

The advantages of LeJOS

Some of the obvious advantages of using LeJOS are:

- It was the first framework to support EV3
- Its API is stable and complete
- It has an active developer and user base (the last stable version came out in March 2014, and a new beta was released in April)
- The code base is maintained in a public Git repository
- It is easy to install
- It is easy to use

The other advantages of using LeJOS are linked to the fact that its API, as well as the programs you write yourself, are all written in Java. Development kits allow a number of languages to be used for programming EV3, the most popular ones being C and Java. C is a low-level language that gives you greater control over the hardware, but it comes at a price: your instructions need to be more explicit, and the chances of making a subtle mistake are much higher. For every line of Java code, you might have to write dozens of lines of C code to get the same functionality.

Java is a high-level language that is compiled into byte code that runs on the JVM. This results in a lesser degree of control over the hardware, but in return, you get a powerful and stable API (the one LeJOS provides) for accessing the inputs and outputs of EV3. The LeJOS team is committed to ensuring that this API works well and continues to grow. The use of a high-level language such as Java lowers the entry threshold to robotics programming, especially for people who already know at least one programming language. Even people who don't know programming yet can learn Java easily, much more easily than C.

Finally, two features of Java that are extremely useful when programming robots are its object-oriented nature (the heavy use of classes, interfaces, and inheritance) and its excellent support for multithreading. You can create and reuse custom classes to encapsulate common functionality, and you can integrate sensors and motors using different threads that communicate with each other. The latter allows the construction of subsumption architectures, an important development in robotics that allows for extremely responsive robots.

I hope that I have made a compelling case for why you should choose LeJOS as your framework for taking EV3 programming to the next level. However, the proof is in the pudding.

Summary

In this article, we learned how EV3's extremely hackable nature has led to the proliferation of alternate frameworks that allow EV3 to be programmed using traditional programming languages.
One of these alternatives is LeJOS, a powerful framework based on the Java programming language. We studied the fundamentals of LeJOS and learned its advantages over other frameworks.

Resources for Article:

Further resources on this subject:

- Home Security by BeagleBone [Article]
- Clusters, Parallel Computing, and Raspberry Pi – A Brief Background [Article]
- Managing Test Structure with Robot Framework [Article]


Making the Unit Very Mobile – Controlling the Movement of a Robot with Legs

Packt · 17 Feb 2014 · 11 min read
The following is an image of the finished project:

Even though you've made your robot mobile by adding wheels or tracks, that mobile platform will only work well on smooth, flat surfaces. Often, you'll want your robot to work in environments where the path is not smooth or flat; perhaps you'll even want your robot to go up stairs or around curbs. In this article, you'll learn how to attach your board, both mechanically and electrically, to a platform with legs so that your projects can be mobile in many more environments. Robots that can walk! What could be more amazing than that?

In this article, we will cover the following topics:

- Connecting Raspberry Pi to a two-legged mobile platform using a servo motor controller
- Creating a program in Linux so that you can control the movement of the two-legged mobile platform
- Making your robot truly mobile by adding voice control

Gathering the hardware

In this article, you'll need to add a legged platform to make your project mobile. For a legged robot, there are a lot of hardware choices. Some are completely assembled, others require some assembly, and you may even choose to buy the components and construct your own custom mobile platform. I'm going to assume that you don't want to do any soldering or mechanical machining yourself, so let's look at several hardware choices that come completely assembled or can be assembled using simple tools (a screwdriver and/or pliers).

One of the simplest legged mobile platforms is one that has two legs and four servo motors. The following is an image of this type of platform:

We'll use this legged mobile platform in this article because it is the simplest to program and the least expensive, requiring only four servos. To construct this platform, you must purchase the parts and then assemble them yourself. Find the instructions and parts list at http://www.lynxmotion.com/images/html/build112.htm. Another easy way to get all the mechanical parts (except servos) is by purchasing a biped robot kit with six DOF (degrees of freedom). This will contain the parts needed to construct your four-servo biped. These six-DOF bipeds can be purchased on eBay or at http://www.robotshop.com/2-wheeled-development-platforms-1.html.

You'll also need to purchase the servo motors. Servo motors are designed to move to specific angles based on the control signals that you send. For this type of robot, you can use standard-sized servos. I like the Hitec HS-311 or HS-322 for this robot; they are inexpensive but powerful enough for this application. You can get them on Amazon or eBay. The following is an image of an HS-311 servo:

You'll need a mobile power supply for Raspberry Pi. I personally like the 5V cell phone rechargeable batteries that are available at almost any place that sells cell phones. Choose one that comes with two USB connectors; you can use the second port to power your servo controller. The mobile power supply shown in the following image mounts well on the biped hardware platform:

You'll also need a USB cable to connect your battery to Raspberry Pi; you should already have one of those. Now that you have the mechanical parts for your legged mobile platform, you'll need some hardware that will turn the control signals from your Raspberry Pi into voltage levels that can control the servo motors. Servo motors are controlled using a signal called PWM.
For a good overview of this type of control, see http://pcbheaven.com/wikipages/How_RC_Servos_Works/ or https://www.ghielectronics.com/docs/18/pwm. You can find tutorials that show you how to control servos directly using Raspberry Pi's GPIO (General Purpose Input/Output) pins, for example, those at http://learn.adafruit.com/adafruit-16-channel-servo-driver-with-raspberry-pi/ and http://www.youtube.com/watch?v=ddlDgUymbxc.

For ease of use, I've chosen to purchase a servo controller that can talk over USB and control the servo motors. These controllers protect my board and make controlling many servos easy. My personal favorite for this application is a simple USB servo motor controller from Pololu that can control six servo motors: the Micro Maestro 6-Channel USB Servo Controller (Assembled). The following is an image of the unit:

Make sure you order the assembled version. This piece of hardware will turn USB commands into voltage levels that control your servo motors. Pololu makes a number of different versions of this controller, each able to control a certain number of servos. Once you've chosen your legged platform, simply count the number of servos you need to control and choose a controller that can handle that many. In this article, we will use a two-legged, four-servo robot, so I will illustrate the robot using the six-servo version. Since you are going to connect this controller to Raspberry Pi via USB, you'll also need a USB A to mini-B cable.

You'll also need a power cable running from the battery to your servo controller. For this, purchase a USB to FTDI cable adapter that has female connectors, for example, the PL2303HX USB to TTL/UART RS232 COM cable available on amazon.com. The specific cable isn't particularly important; what matters is that it provides individual connectors for each of the four wires in a USB cable. The following is an image of the cable:

Now that you have all the hardware, let's walk through a quick tutorial of how a two-legged system with servos works, followed by some step-by-step instructions to make your project walk.

Connecting Raspberry Pi to the mobile platform using a servo controller

Now that you have a legged platform and a servo motor controller, you are ready to make your project walk! Before you begin, you'll need some background on servo motors. Servo motors are somewhat similar to DC motors; however, there is an important difference: while DC motors are generally designed to move in a continuous way, rotating 360 degrees at a given speed, servo motors are generally designed to move to angles within a limited range. In other words, in the DC motor world, you generally want your motors to spin at a continuous rotation speed that you control. In the servo world, you want to control the movement of your motor to a specific position. For more information on how servos work, visit http://www.seattlerobotics.org/guide/servos.html or http://www.societyofrobots.com/actuators_servos.shtml.

Connecting the hardware

To make your project walk, you first need to connect the servo motor controller to the servos. There are two connections you need to make: the first is to the servo motors and the second is to the battery holder. In this section, you'll also connect your servo controller to your PC or Linux machine to check whether everything is working. The steps for that are as follows:
1. Connect the servos to the controller. The following is an image of your two-legged robot and the four different servo connections:

In order to be consistent, let's connect your four servos to the connections marked 0 through 3 on the controller, using the following configuration:

- 0: Left foot
- 1: Left hip
- 2: Right foot
- 3: Right hip

The following is an image of the back of the controller; it will show you where to connect your servos:

Connect the servos to the servo motor controller as follows:

- The left foot to the 0 (top) connector, with the black cable to the outside (-)
- The left hip to the 1 connector, with the black cable out
- The right foot to the 2 connector, with the black cable out
- The right hip to the 3 connector, with the black cable out

See the following image indicating how to connect the servos to the controller:

2. Now you need to connect the servo motor controller to your battery. You'll use the USB to FTDI UART cable; plug the red and black cables into the power connector on the servo controller, as shown in the following image:

Configuring the software

Now you can connect the motor controller to your PC or Linux machine to see whether or not you can talk to it. Once the hardware is connected, you can use some of the software provided by Pololu to control the servos. It is easiest to do this using your personal computer or Linux machine. The steps to do so are as follows:

1. Download the Pololu software from http://www.pololu.com/docs/0J40/3.a and install it based on the instructions on the website. Once it is installed, run the software; you should see the window shown in the following screenshot:
2. You will first need to change the Serial mode configuration in Serial Settings, so select the Serial Settings tab; you should see the window shown in the following screenshot:
3. Make sure that USB Chained is selected; this will allow you to connect to and control the motor controller over the USB.
4. Now go back to the main screen by selecting the Status tab; you can now turn on the four servos. The screen should look as shown in the following screenshot:
5. Now you can use the sliders to control the servos. Enable the four servos and make sure that servo 0 moves the left foot, 1 the left hip, 2 the right foot, and 3 the right hip.

You've checked the motor controller and the servos, and you'll now connect the motor controller to Raspberry Pi to control the servos from there. Remove the USB cable from the PC and connect it to Raspberry Pi. The entire system will look as shown in the following image:

Let's now talk to the motor controller by downloading the Linux code from Pololu at http://www.pololu.com/docs/0J40/3.b. Perhaps the best way to do this is by logging on to Raspberry Pi using vncserver and opening a VNC Viewer window on your PC. To do this, log in to your Raspberry Pi using PuTTY and then type vncserver at the prompt to make sure vncserver is running. Then, perform the following steps:

1. On your PC, open the VNC Viewer application, enter your IP address, and then click on Connect. Then, enter the password that you created for vncserver; you should see the Raspberry Pi viewer screen, which should look as shown in the following screenshot:
2. Open a Firefox browser window and go to http://www.pololu.com/docs/0J40/3.b. Click on the Maestro Servo Controller Linux Software link and download the file maestro-linux-100507.tar.gz to the Download directory. You can also use wget to get this software by typing wget http://www.pololu.com/file/download/maestro-linux-100507.tar.gz?file_id=0J315 in a terminal window (if you use wget, rename the downloaded file to maestro-linux-100507.tar.gz).
3. Go to your Download directory and move the archive to your home directory by typing mv maestro-linux-100507.tar.gz ~, then go back to your home directory.
4. Unpack the file by typing tar -xzvf maestro-linux-100507.tar.gz. This will create a directory called maestro_linux. Go to that directory by typing cd maestro_linux, and then type ls. You should see the output shown in the following screenshot:

The README.txt document will give you explicit instructions on how to install the software. Unfortunately, you can't run MaestroControlCenter on your Raspberry Pi, as our windowing system doesn't support its graphics, but you can control your servos using the UscCmd command-line application. First, type ./UscCmd --list and you should see the following screenshot:

The unit sees your servo controller. If you just type ./UscCmd, you can see all the commands you could send to your controller. When you run this command, you can see the result as shown in the following screenshot:

Notice that you can send a servo a specific target, but the target is not in angle values, which makes it a bit difficult to know where you are sending your servo. Try typing ./UscCmd --servo 0,10; the servo will most likely move to its full angle position. Type ./UscCmd --servo 0,0 and it will stop the servo from trying to move.

If you haven't run the Maestro Controller tool and set the Serial Settings setting to USB Chained, your motor controller may not respond.
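Since the UscCmd targets are not in degrees, a small Python helper that converts angles into targets makes experimenting easier. This is an illustrative sketch, not part of the original article: it assumes you run it from the maestro_linux directory and that targets are in quarter-microseconds (the unit used by Pololu's serial protocol), so verify the endpoints against your own servos.

    #!/usr/bin/python
    # set_angle.py - drive a Maestro channel to an angle via UscCmd (sketch)
    import subprocess

    def set_servo_angle(channel, angle):
        # Map 0-180 degrees onto 4000-8000 quarter-microseconds (1-2 ms)
        target = 4000 + int((angle / 180.0) * 4000)
        subprocess.call(["./UscCmd", "--servo", "%d,%d" % (channel, target)])

    set_servo_angle(0, 90)  # center the left foot servo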


Webcam and Video Wizardry

Packt · 26 Jul 2013 · 13 min read
Setting up your camera

Go ahead, plug in your webcam and boot up the Pi; we'll take a closer look at what makes it tick. If you experimented with the dwc_otg.speed parameter to improve the audio quality during the previous article, you should change its value back from 1 to 0 now, as chances are that your webcam will perform worse, or not perform at all, because of the reduced speed of the USB ports.

Meet the USB Video Class drivers and Video4Linux

Just as the Advanced Linux Sound Architecture (ALSA) system provides kernel drivers and a programming framework for your audio gadgets, there are two important components involved in getting your webcam to work under Linux:

- The Linux USB Video Class (UVC) drivers provide the low-level functions for your webcam, in accordance with a specification followed by most webcams produced today.
- Video4Linux (V4L) is a video capture framework used by applications that record video from webcams, TV tuners, and other video-producing devices. There's an updated version of V4L called V4L2, which we'll want to use whenever possible.

Let's see what we can find out about the detection of your webcam, using the following command:

pi@raspberrypi ~ $ dmesg

The dmesg command is used to get a list of all the kernel information messages that have been recorded since the system booted. What we're looking for, toward the end of the list of messages, is a notice from uvcvideo.

Kernel messages indicating a found webcam

In the previous screenshot, a Logitech C110 webcam was detected and registered with the uvcvideo module. Note the cryptic sequence of characters, 046d:0829, next to the model name. This is the device ID of the webcam, and it can be a big help if you need to search for information related to your specific model.

Finding out your webcam's capabilities

Before we start grabbing videos with our webcam, it's very important that we find out exactly what it is capable of in terms of video formats and resolutions. To help us with this, we'll add the uvcdynctrl utility to our arsenal, using the following command:

pi@raspberrypi ~ $ sudo apt-get install uvcdynctrl

Let's start with the most important part: the list of supported frame formats. To see this list, type in the following command:

pi@raspberrypi ~ $ uvcdynctrl -f

List of frame formats supported by our webcam

According to the output for this particular webcam, there are two main pixel formats that are supported. The first format, called YUYV or YUV 4:2:2, is a raw, uncompressed video format, while the second format, called MJPG or MJPEG, provides a video stream of compressed JPEG images. Below each pixel format, we find the supported frame sizes and frame rates for each size. The frame size, or image resolution, will determine the amount of detail visible in the video. Three common resolutions for webcams are 320 x 240, 640 x 480 (also called VGA), and 1024 x 768 (also called XGA). The frame rate is measured in Frames Per Second (FPS) and will determine how "fluid" the video will appear. On this particular webcam, only two frame rates, 15 and 30 FPS, are available for each frame size.

Now that you know a bit more about your webcam: if you happen to be the unlucky owner of a camera that doesn't support the MJPEG pixel format, you can still follow along, but don't expect more than a slideshow of 320 x 240 images from your webcam. Video processing is one of the most CPU-intensive activities you can do with the Pi, so you need your webcam to help in this matter by compressing the frames first.
Capturing your target on film

All right, let's see what your sneaky glass eye can do! We'll be using an excellent piece of software called MJPG-streamer for all our webcam capturing needs. Unfortunately, it's not available as an easy-to-install package for Raspbian, so we will have to download and build this software ourselves.

Often when we compile software from source code, the application we're building will want to make use of code libraries and development headers. Our MJPG-streamer application, for example, would like to include functionality for dealing with JPEG images and Video4Linux devices. Install the libraries and headers for JPEG and V4L by typing in the following command:

pi@raspberrypi ~ $ sudo apt-get install libjpeg8-dev libv4l-dev

Next, we're going to download the MJPG-streamer source code using the following command:

pi@raspberrypi ~ $ wget http://mjpg-streamer.svn.sourceforge.net/viewvc/mjpg-streamer/mjpg-streamer/?view=tar -O mjpg-streamer.tar.gz

The wget utility is an extraordinarily handy web download tool with many uses. Here we use it to grab a compressed TAR archive from a source code repository, and we supply the extra -O mjpg-streamer.tar.gz argument to give the downloaded tarball a proper filename. Now we need to extract our mjpg-streamer.tar.gz archive, using the following command:

pi@raspberrypi ~ $ tar xvf mjpg-streamer.tar.gz

The tar command can both create and extract archives, so we supply three flags here: x for extract, v for verbose (so that we can see where the files are being extracted to), and f to tell tar to use the file we specify as input, instead of reading from the standard input. Once you've extracted it, enter the directory containing the sources:

pi@raspberrypi ~ $ cd mjpg-streamer

Now type in the following command to build MJPG-streamer with support for V4L2 devices:

pi@raspberrypi ~/mjpg-streamer $ make USE_LIBV4L2=true

Once the build process has finished, we need to install the resulting binaries and other application data somewhere more permanent, using the following command:

pi@raspberrypi ~/mjpg-streamer $ sudo make DESTDIR=/usr install

You can now exit the directory containing the sources and delete it, as we won't need it anymore:

pi@raspberrypi ~/mjpg-streamer $ cd .. && rm -r mjpg-streamer

Let's fire up our newly built MJPG-streamer! Type in the following command, but adjust the values for resolution and frame rate to a moderate setting that you know (from the previous section) your webcam will be able to handle:

pi@raspberrypi ~ $ mjpg_streamer -i "input_uvc.so -r 640x480 -f 30" -o "output_http.so -w /usr/www"

MJPG-streamer starting up

You may have received a few error messages saying Inappropriate ioctl for device; these can be safely ignored. Other than that, you might have noticed the LED on your webcam (if it has one) light up, as MJPG-streamer is now serving your webcam feed over the HTTP protocol on port 8080. Press Ctrl + C at any time to quit MJPG-streamer.

To tune into the feed, open up a web browser (preferably Chrome or Firefox) on a computer connected to the same network as the Pi, and enter the following line into the address field of your browser, changing [IP address] to the IP address of your Pi: http://[IP address]:8080. You should now be looking at the MJPG-streamer demo pages, containing a snapshot from your webcam.
MJPG-streamer demo pages in Chrome

The demo pages demonstrate the different methods of obtaining image data from your webcam:

- The Static page shows the simplest way of obtaining a single snapshot frame from your webcam. The examples use the URL http://[IP address]:8080/?action=snapshot to grab a single frame; just refresh your browser window to obtain a new snapshot. You could easily embed this image into your website or blog by using the <img src="http://[IP address]:8080/?action=snapshot"/> HTML tag, but you'd have to make the IP address of your Pi reachable on the Internet for anyone outside your local network to see it.
- The Stream page shows the best way of obtaining a video stream from your webcam. This technique relies on your browser's native support for decoding MJPEG streams and should work fine in most browsers except Internet Explorer. The direct URL for the stream is http://[IP address]:8080/?action=stream.
- The Java page tries to load a Java applet called Cambozola, which can be used as a stream viewer. If you haven't got the Java browser plugin already installed, you'll probably want to steer clear of this page. While the Cambozola viewer certainly has some neat features, the security risks associated with the plugin outweigh the benefits of the viewer.
- The JavaScript page demonstrates an alternative way of displaying a video stream in your browser. This method also works in Internet Explorer. It relies on JavaScript code to continuously fetch new snapshot frames from the webcam in a loop. Note that this technique puts more strain on your browser than the preferred native stream method. You can study the JavaScript code by viewing the page source of http://[IP address]:8080/javascript_simple.html.
- The VideoLAN page contains shortcuts and instructions to open up the webcam video stream in the VLC media player. We will get to know VLC quite well during this article; leave it alone for now.
- The Control page provides a convenient interface for tweaking the picture settings of your webcam. The page should pop up in its own browser window so that you can view the webcam stream live, side by side, as you change the controls.
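If you'd like to grab frames outside a browser, the snapshot URL is easy to script. Here is an illustrative Python sketch (not from the original article); replace the address with your Pi's actual IP address:

    #!/usr/bin/python
    # grab_snapshot.py - save one frame from MJPG-streamer (sketch)
    import urllib

    # Fetches http://[IP address]:8080/?action=snapshot and writes it to disk
    urllib.urlretrieve("http://192.168.1.10:8080/?action=snapshot", "snapshot.jpg")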
Viewing your webcam in VLC media player

You might be perfectly content with your current webcam setup and viewing the stream in your browser; for those of you who prefer to watch all videos inside your favorite media player, this section is for you. Also note that we'll be using VLC for other purposes later in this article, so we'll go through the installation here.

Viewing in Windows

Let's install VLC and open up the webcam stream:

1. Visit http://www.videolan.org/vlc/download-windows.html and download the latest version of the VLC installer package (vlc-2.0.5-win32.exe, at the time of writing).
2. Install VLC media player using the installer.
3. Launch VLC using the shortcut on the desktop or from the Start menu.
4. From the Media drop-down menu, select Open Network Stream….
5. Enter the direct stream URL we learned from the MJPG-streamer demo pages (http://[IP address]:8080/?action=stream), and click on the Play button.
6. (Optional) You can add live audio monitoring from the webcam by opening up a command prompt window and typing in the following command:

"C:\Program Files (x86)\PuTTY\plink" pi@[IP address] -pw [password] sox -t alsa plughw:1 -t sox - | "C:\Program Files (x86)\sox-14-4-1\sox" -q -t sox - -d

Viewing in Mac OS X

Let's install VLC and open up the webcam stream:

1. Visit http://www.videolan.org/vlc/download-macosx.html and download the latest version of the VLC dmg package for your Mac model. The one at the top, vlc-2.0.5.dmg (at the time of writing), should be fine for most Macs.
2. Double-click on the VLC disk image and drag the VLC icon to the Applications folder.
3. Launch VLC from the Applications folder.
4. From the File drop-down menu, select Open Network….
5. Enter the direct stream URL we learned from the MJPG-streamer demo pages (http://[IP address]:8080/?action=stream) and click on the Open button.
6. (Optional) You can add live audio monitoring from the webcam by opening up a Terminal window (located in /Applications/Utilities) and typing in the following command:

ssh pi@[IP address] sox -t alsa plughw:1 -t sox - | sox -q -t sox - -d

Viewing on Linux

Let's install VLC or MPlayer and open up the webcam stream:

1. Use your distribution's package manager to add the vlc or mplayer package.
2. For VLC, either use the GUI to open a network stream or launch it from the command line with vlc http://[IP address]:8080/?action=stream
3. For MPlayer, you need to tag an MJPG file extension onto the stream, using the following command: mplayer "http://[IP address]:8080/?action=stream&stream.mjpg"
4. (Optional) You can add live audio monitoring from the webcam by opening up a Terminal and typing in the following command: ssh pi@[IP address] sox -t alsa plughw:1 -t sox - | sox -q -t sox - -d

Recording the video stream

The best way to save a video clip from the stream is to record it with VLC and save it into an AVI file container. With this method, we get to keep the MJPEG compression while retaining the frame rate information. Unfortunately, you won't be able to record the webcam video with sound; there's no way to automatically synchronize audio with the MJPEG stream. The only way to produce a video file with sound would be to grab the video and audio streams separately and edit them together manually in a video editing application such as VirtualDub.

Recording in Windows

We're going to launch VLC from the command line to record our video:

1. Open up a command prompt window from the Start menu by clicking on the shortcut or by typing cmd in the Run or Search fields.
2. Type in the following command to start recording the video stream to a file called myvideo.avi, located on the desktop:

C:\> "C:\Program Files (x86)\VideoLAN\VLC\vlc.exe" http://[IP address]:8080/?action=stream --sout="#standard{mux=avi,dst=%UserProfile%\Desktop\myvideo.avi,access=file}"

As we've mentioned before, if your particular Windows version doesn't have a C:\Program Files (x86) folder, just erase the (x86) part from the path on the command line. It may seem like nothing much is happening, but there should now be a growing myvideo.avi recording on your desktop. To confirm that VLC is indeed recording, select Media Information from the Tools drop-down menu and then select the Statistics tab. Simply close VLC to stop the recording.
Recording in Mac OS X

We're going to launch VLC from the command line to record our video:

1. Open up a Terminal window (located in /Applications/Utilities) and type in the following command to start recording the video stream to a file called myvideo.avi, located on the desktop:

$ /Applications/VLC.app/Contents/MacOS/VLC http://[IP address]:8080/?action=stream --sout='#standard{mux=avi,dst=/Users/[username]/Desktop/myvideo.avi,access=file}'

2. Replace [username] with the name of the account you used to log in to your Mac, or remove the directory path to write the video to the current directory.

It may seem like nothing much is happening, but there should now be a growing myvideo.avi recording on your desktop. To confirm that VLC is indeed recording, select Media Information from the Window drop-down menu and then select the Statistics tab. Simply close VLC to stop the recording.

Recording in Linux

We're going to launch VLC from the command line to record our video:

1. Open up a Terminal window and type in the following command to start recording the video stream to a file called myvideo.avi, located on the desktop:

$ vlc http://[IP address]:8080/?action=stream --sout='#standard{mux=avi,dst=/home/[username]/Desktop/myvideo.avi,access=file}'

2. Replace [username] with your login name, or remove the directory path to write the video to the current directory.