The study of emotions has attracted considerable attention in several fields, from artificial intelligence and psychology to neuroscience. The role of emotions in decision-making processes exemplifies the multidisciplinary nature of this research. To communicate better with humans, robots should use appropriate communicative gestures that take into account the emotions of their human conversation partners. In this paper we propose a deep neural network model that recognizes spontaneous emotional expressions and classifies them as positive or negative. We evaluate our model in two experiments: one using benchmark datasets, and the other using a human-robot interaction (HRI) scenario with a humanoid robotic head, which itself provides emotional feedback.