Testing the model
Now we need to load the trained model and test it. Here, we will use the live video stream: the FER application will detect the emotion based on my facial expression. You can refer to the code using this GitHub link: https://github.com/jalajthanaki/Facial_emotion_recognition_using_TensorFlow/blob/master/emotion_recognition.py.
You can find the code snippet for this in the following figure:

Figure 10.35: Code snippet for loading the trained model and performing testing
In order to start testing, we need to execute the following command:
$ python emotion_recognition.py poc
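Under the hood, the script grabs frames from the webcam, crops the detected face, and feeds a 48x48 grayscale image to the trained network. The following is a minimal sketch of that preprocessing step, not the repository's exact code: the helper names `preprocess_face` and `label_from_prediction` are hypothetical, and the emotion label list assumes the standard FER2013 categories.

```python
import numpy as np

# Assumed FER2013 emotion categories; the actual list and ordering
# are defined by the training code in the repository.
EMOTIONS = ['angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral']

def preprocess_face(face_crop, size=48):
    """Convert an RGB face crop (H x W x 3, uint8) into the
    1 x 48 x 48 x 1 grayscale tensor the model expects."""
    # Luminance-weighted grayscale conversion
    weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)
    gray = face_crop.astype(np.float32) @ weights
    # Nearest-neighbor resize to size x size (in practice cv2.resize is used)
    h, w = gray.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = gray[np.ix_(rows, cols)] / 255.0
    # Add batch and channel dimensions
    return resized[np.newaxis, :, :, np.newaxis]

def label_from_prediction(probs):
    """Map the network's softmax output to an emotion label."""
    return EMOTIONS[int(np.argmax(probs))]
```

In the live loop, each webcam frame's face region would pass through `preprocess_face` before being handed to the model's `predict` call, and `label_from_prediction` would turn the resulting probabilities into the label drawn on screen.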
This testing will use your webcam. I have some demo files that I want to share here. Refer to the following figure:

Figure 10.36: Code snippet for FER application identifying the emotion of disgust
Also refer to the following figure:

Figure 10.37: The FER application identifying the happy emotion
Refer to the code snippet in the following figure:

Figure 10.38: Code snippet for the FER application identifying...