Adding computer vision to A.R.E.S.
In the previous section, we explored OpenCV and YOLO, using OpenCV to view images and video feeds, and YOLO to identify a dog in a picture. In this section, we’ll apply what we’ve learned to create a smart video streaming application that represents the eyes of A.R.E.S. We’ll use dogs as our example, but the application could easily be adapted to track other objects.
We will start by encapsulating our YOLO code in a class called DogTracker, and then use that class, together with OpenCV, to create the video streaming application.
Creating the DogTracker class
The DogTracker class embodies the artificial intelligence component of A.R.E.S. Although it could be installed directly on the Raspberry Pi 3B+ within A.R.E.S. and accessed remotely via the streaming window application, we will install it on a computer alongside our streaming application for simplicity and improved performance. In our example, we will use a Windows...
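To make the discussion concrete, here is a minimal sketch of what the DogTracker class could look like, built on OpenCV's DNN module. It assumes the YOLOv3 weights, configuration, and COCO class-name files from the previous section; the file names, input size, and confidence threshold are placeholders to adapt to your own setup:

import cv2
import numpy as np


class DogTracker:
    """Sketch of a YOLO-based dog detector; paths and thresholds are placeholders."""

    def __init__(self, weights="yolov3.weights", config="yolov3.cfg",
                 names="coco.names", conf_threshold=0.5):
        # Load the pre-trained YOLO network through OpenCV's DNN module
        self.net = cv2.dnn.readNetFromDarknet(config, weights)
        self.conf_threshold = conf_threshold
        # COCO class names, one per line; "dog" is the one we care about
        with open(names) as f:
            self.classes = [line.strip() for line in f]
        # The unconnected output layers are where YOLO emits its detections
        self.output_layers = self.net.getUnconnectedOutLayersNames()

    def detect_dogs(self, frame):
        """Return a list of (x, y, w, h) bounding boxes for dogs in the frame."""
        height, width = frame.shape[:2]
        # YOLO expects a square, scaled, RGB blob as input
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True, crop=False)
        self.net.setInput(blob)
        outputs = self.net.forward(self.output_layers)

        boxes = []
        for output in outputs:
            for detection in output:
                scores = detection[5:]
                class_id = int(np.argmax(scores))
                confidence = scores[class_id]
                if (self.classes[class_id] == "dog"
                        and confidence > self.conf_threshold):
                    # Detections are center-based and normalized, so convert
                    # them to pixel-space (x, y, w, h) boxes
                    cx, cy, w, h = detection[:4] * np.array(
                        [width, height, width, height])
                    boxes.append((int(cx - w / 2), int(cy - h / 2),
                                  int(w), int(h)))
        return boxes


if __name__ == "__main__":
    # Quick sanity check on a single image (the file name is a placeholder)
    tracker = DogTracker()
    image = cv2.imread("dog.jpg")
    print(tracker.detect_dogs(image))

Keeping the detection logic behind a single detect_dogs() method keeps the streaming application simple: it only has to hand frames to the tracker and draw the boxes that come back.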