Emotion recognition from body expressions with a neural network architecture

Link:
Author:
Publisher/Corporate body:
Association for Computing Machinery (ACM)
Year of publication:
2017
Media type:
Text
Keywords:
  • Robots
  • Sensory perception
  • Body language
  • Robotics
  • Human Robot Interaction
Description:
  • The recognition of emotions plays an important role in our daily life and is essential for social communication. Although multiple studies have shown that body expressions can strongly convey emotional states, emotion recognition from body motion patterns has received less attention than the use of facial expressions. In this paper, we propose a self-organizing neural architecture that can effectively recognize affective states from full-body motion patterns. To evaluate our system, we designed and collected a data corpus named the Body Expressions of Emotion (BEE) dataset using a depth sensor in a human-robot interaction scenario. For our recordings, nineteen participants were asked to perform six different emotions: anger, fear, happiness, neutral, sadness, and surprise. To compare our system's performance with that of humans, we conducted an additional experiment in which fifteen annotators were asked to label depth map video sequences as one of the six emotion classes. The labels provided by the human annotators were then compared to the labels predicted by our system. Experimental results showed that the recognition accuracy of the system was competitive with human performance when exposed to body motion patterns from the same dataset.
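
A minimal sketch of the kind of pipeline the description refers to: a self-organizing map trained on motion feature vectors and then used to label sequences with one of the six emotion classes. The features here are synthetic stand-ins rather than BEE depth-sensor recordings, and the plain Kohonen map, its grid size, and its hyperparameters are assumptions for illustration rather than the paper's actual self-organizing architecture.

# Toy sketch: a self-organizing map (SOM) used as a six-class emotion classifier.
# All sizes and hyperparameters below are illustrative assumptions.
import numpy as np

EMOTIONS = ["anger", "fear", "happiness", "neutral", "sadness", "surprise"]
rng = np.random.default_rng(0)


def make_synthetic_data(n_per_class=50, dim=30):
    """Synthetic stand-in for per-sequence body-motion feature vectors:
    one Gaussian cluster per emotion class."""
    X, y = [], []
    for label in range(len(EMOTIONS)):
        center = rng.normal(0.0, 1.0, dim)
        X.append(center + 0.3 * rng.normal(size=(n_per_class, dim)))
        y.append(np.full(n_per_class, label))
    return np.vstack(X), np.concatenate(y)


class TinySOM:
    """Minimal Kohonen map; nodes get a class label by majority vote after training."""

    def __init__(self, rows=8, cols=8, dim=30):
        self.w = rng.normal(0.0, 1.0, (rows * cols, dim))          # node weights
        self.grid = np.array([(r, c) for r in range(rows)
                              for c in range(cols)], dtype=float)  # node positions
        self.node_labels = np.zeros(rows * cols, dtype=int)

    def _bmu(self, x):
        # Index of the best-matching unit (closest node) for one sample.
        return int(np.argmin(np.linalg.norm(self.w - x, axis=1)))

    def fit(self, X, y, epochs=20, lr0=0.5, sigma0=3.0):
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)              # decaying learning rate
            sigma = sigma0 * (1.0 - epoch / epochs) + 0.5  # decaying neighbourhood
            for x in rng.permutation(X):
                b = self._bmu(x)
                d = np.linalg.norm(self.grid - self.grid[b], axis=1)
                h = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
                self.w += lr * h[:, None] * (x - self.w)   # pull nodes toward x
        # Label every node with the majority class of the samples it wins.
        votes = np.zeros((len(self.w), len(EMOTIONS)))
        for x, label in zip(X, y):
            votes[self._bmu(x), label] += 1
        self.node_labels = votes.argmax(axis=1)

    def predict(self, X):
        return np.array([self.node_labels[self._bmu(x)] for x in X])


if __name__ == "__main__":
    X, y = make_synthetic_data()
    order = rng.permutation(len(X))
    split = int(0.8 * len(X))
    train, test = order[:split], order[split:]

    som = TinySOM(dim=X.shape[1])
    som.fit(X[train], y[train])
    accuracy = float((som.predict(X[test]) == y[test]).mean())
    print(f"toy accuracy on synthetic data: {accuracy:.2f}")

Scoring the predicted labels on held-out samples against their ground-truth labels mirrors, in a toy setting, the comparison between the system's predictions and the fifteen human annotators' labels described above.
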
License:
  • info:eu-repo/semantics/closedAccess
Source system:
Research information system of the University of Hamburg (UHH)

Internal metadata
Source record
oai:www.edit.fis.uni-hamburg.de:publications/56da5a94-ad9b-4496-8e3e-750e973578e5