The Emotion API analyzes faces to identify the emotions a person is expressing. It takes an image containing one or more faces as input and returns the facial expressions detected for each face. If you have already called the Face API on a particular image, you can submit the face rectangles it returned as an optional input. The emotions detected by the Emotion API are anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. For more information, refer to https://www.microsoft.com/cognitive-services/en-us/emotion-api/documentation.
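To make the response concrete, here is a minimal sketch of working with the Emotion API's JSON output. The shape follows the documented format (a list of detected faces, each with a `faceRectangle` and a `scores` object containing the eight emotions); the score values below are invented for illustration, and the `dominant_emotion` helper is not part of the API.

```python
# Hedged sketch: picking the strongest emotion from an Emotion API response.
# The JSON layout mirrors the documented response; the numbers are made up.

def dominant_emotion(scores):
    """Return the emotion whose confidence score is highest."""
    return max(scores, key=scores.get)

# Illustrative response for an image containing a single face.
sample_response = [
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
        "scores": {
            "anger": 0.003, "contempt": 0.001, "disgust": 0.002,
            "fear": 0.001, "happiness": 0.920, "neutral": 0.060,
            "sadness": 0.010, "surprise": 0.003,
        },
    }
]

for face in sample_response:
    print(dominant_emotion(face["scores"]))  # happiness
```

Because every emotion is reported with a confidence score, you can also apply a threshold instead of simply taking the maximum, for example treating a face as neutral unless some emotion scores above 0.5.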
Both the Face and Emotion APIs can also detect face attributes and emotions in a video. For a video, the Emotion API detects the facial expressions of the people who appear in it and returns a summary of their emotions. In real-time scenarios, you can use these APIs to gauge how a crowd responds to your speech or content.
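The crowd-response idea above can be sketched by aggregating the per-face scores the Emotion API returns for a single frame or image. The `crowd_summary` helper and the sample data are illustrative assumptions, not part of the API; only the per-face `scores` shape comes from the documented response format.

```python
# Hedged sketch: summarizing a crowd's response by averaging each emotion's
# score across all detected faces. Sample faces and helper are invented.

def crowd_summary(faces):
    """Average each emotion score across the detected faces."""
    totals = {}
    for face in faces:
        for emotion, score in face["scores"].items():
            totals[emotion] = totals.get(emotion, 0.0) + score
    return {emotion: total / len(faces) for emotion, total in totals.items()}

# Two invented faces: one clearly happy, one mostly neutral.
faces = [
    {"scores": {"happiness": 0.9, "neutral": 0.1}},
    {"scores": {"happiness": 0.3, "neutral": 0.7}},
]

summary = crowd_summary(faces)
print(max(summary, key=summary.get))  # happiness
```

Running the same aggregation over successive frames of a video would give a rough timeline of how the audience's mood shifts during a talk.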