Understanding users' emotions
While voice input is undoubtedly a useful feature, we all know that the actual meaning of a sentence can be the opposite of the literal one, depending on the speaker's intonation, facial expression, and context. Take the simple phrase "Oh, really?" Depending on the circumstances, it can mean "I doubt it," "I didn't know that," "I'm impressed," "I don't care," "That's obvious," and so on. The problem is that speech is not the only mode of conversation for human beings, which is why much research these days focuses on teaching computers to understand (and also simulate) gestures, facial expressions, sentiment in text, eye movements, sarcasm, and other manifestations of affect.

The interdisciplinary field that has emerged around the question of emotional and compassionate AI is known as affective computing. It integrates knowledge from computer science and cognitive science, as well as psychology and robotics. Its aim is the creation of computer systems that adapt themselves...
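To see why literal text alone is not enough, consider a minimal sketch of lexicon-based text sentiment analysis (the word lists and function name here are hypothetical, chosen purely for illustration). A scorer like this counts positive and negative words, so it can only react to the words it sees; it has no access to intonation or context, which is exactly why "Oh, really?" defeats it unless the surrounding words give the attitude away:

```python
# Toy lexicon-based sentiment scorer (illustrative word lists,
# not a real published sentiment lexicon).
POSITIVE = {"great", "impressed", "love", "useful", "wonderful"}
NEGATIVE = {"doubt", "hate", "useless", "terrible", "boring"}

def sentiment_score(text: str) -> int:
    """Return (#positive words - #negative words) found in the text."""
    words = [w.strip(".,!?'\"").lower() for w in text.split()]
    return (sum(w in POSITIVE for w in words)
            - sum(w in NEGATIVE for w in words))

print(sentiment_score("Oh, really? I'm impressed!"))  # 1 (positive)
print(sentiment_score("Oh, really? I doubt it."))     # -1 (negative)
print(sentiment_score("Oh, really?"))                 # 0 (no signal at all)
```

The bare "Oh, really?" scores zero: without intonation, facial expression, or surrounding context, the words carry no recoverable attitude, which is the gap affective computing tries to close.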