What we have seen of CoreML so far is pretty neat, to say the least. But looking back over this chapter, we have probably spent more time building our app to harness the power of CoreML than actually implementing it.
In this section, we're going to take our app a little further by streaming a live camera feed, which will allow us to intercept each frame and detect objects in real time.
Getting ready
For this section, you'll need the latest version of Xcode available from the Mac App Store.
Please note that you'll need to be connected to a real device for this section to work; currently, the iOS simulator has no way to emulate the front or back camera.
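If you'd rather confirm this at runtime than discover it with a crash later, a quick check like the following will do. This is just a minimal sketch; the helper name backCameraIsAvailable is ours and not part of the recipe:

import AVFoundation

// Returns true when a back camera is present. On the iOS simulator this
// always returns false, which is why a physical device is required.
func backCameraIsAvailable() -> Bool {
    return AVCaptureDevice.default(.builtInWideAngleCamera,
                                   for: .video,
                                   position: .back) != nil
}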
How to do it...
Let's begin:
- Head over to our ViewController.swift file and make the following amendments (a sketch of how these properties come together follows after the code):
import AVFoundation
// Layer that renders the live camera feed in our view
private var previewLayer: AVCaptureVideoPreviewLayer! = nil
// Coordinates the flow of data from the camera input to our outputs
var captureSession = AVCaptureSession()
// Dimensions of the video buffer; .zero is a placeholder until the camera's output size is known
var bufferSize: CGSize = .zero
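To give a sense of where these properties are headed, here is a minimal sketch of how the capture session is typically wired up so that every frame reaches our code. It assumes ViewController is declared to conform to AVCaptureVideoDataOutputSampleBufferDelegate; the helper name setupCaptureSession and the queue label are our own, and the recipe's actual setup follows in the next steps:

func setupCaptureSession() {
    // Use the back wide-angle camera as the session's input.
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device),
          captureSession.canAddInput(input) else { return }

    captureSession.beginConfiguration()
    captureSession.sessionPreset = .vga640x480 // a lower resolution keeps per-frame inference cheap
    captureSession.addInput(input)

    // The video data output delivers every captured frame to its delegate.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
    if captureSession.canAddOutput(videoOutput) {
        captureSession.addOutput(videoOutput)
    }
    captureSession.commitConfiguration()

    // Show the live feed behind any detection overlays we draw later.
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per captured frame.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Each CMSampleBuffer is where a CoreML/Vision request would run on the frame.
}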