Multimodal Detection of Human Emotions by Social Robotics
Autonomous robot systems can support a variety of social tasks. Promising applications include support during illness, care, and therapy, as well as assistance for healthy, active aging and the preservation of independence. Social robots will be highly relevant for handling the challenges of demographic change. To ensure user acceptance, robotic systems need certain social and emotional skills, in particular automatic emotion recognition and adequate reactions to human emotions.
In addition to the spoken content, factors such as paraverbal language, gestures, and facial expressions play a decisive role in classifying emotional states. Difficulties often arise from cultural and personal differences. The construction and application of speech-, video-, and biosignal-based detection methods for identifying safety- and comfort-relevant user states appear to be an important part of a solution. By developing interaction models that incorporate culture- and person-specific characteristics, the recognition of short-term behavioral changes and appropriate social interaction become possible.
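One common way to combine speech-, video-, and biosignal-based detectors is decision-level (late) fusion, where each modality classifier outputs a probability distribution over emotion labels and the distributions are combined by a weighted average. The sketch below illustrates this idea only; the emotion labels, weights, and probability values are illustrative assumptions, not results from the work described here.

```python
# Minimal sketch of decision-level (late) fusion for multimodal
# emotion recognition. Each modality classifier is assumed to output
# a probability distribution over the same emotion labels; a weighted
# average combines them into a single fused distribution.
# All labels, weights, and probabilities below are hypothetical.

EMOTIONS = ["neutral", "happy", "sad", "angry", "fearful"]

def fuse_predictions(modality_probs, weights):
    """Weighted late fusion of per-modality emotion probabilities.

    modality_probs: dict mapping modality name -> list of probabilities
                    (one per label in EMOTIONS)
    weights:        dict mapping modality name -> fusion weight
    """
    total_w = sum(weights[m] for m in modality_probs)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in modality_probs.items():
        w = weights[modality] / total_w  # normalize weights to sum to 1
        for i, p in enumerate(probs):
            fused[i] += w * p
    return dict(zip(EMOTIONS, fused))

# Illustrative outputs of speech-, video- (facial expression), and
# biosignal-based classifiers for one observation window:
preds = {
    "speech":    [0.10, 0.20, 0.40, 0.20, 0.10],
    "video":     [0.05, 0.10, 0.60, 0.15, 0.10],
    "biosignal": [0.20, 0.10, 0.30, 0.30, 0.10],
}
weights = {"speech": 0.4, "video": 0.4, "biosignal": 0.2}

fused = fuse_predictions(preds, weights)
top_emotion = max(fused, key=fused.get)  # most probable user state
```

In practice the weights could themselves be adapted to culture- and person-specific characteristics, which is one way the interaction models mentioned above could enter the fusion step.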