The EMOTIC dataset, named after EMOTions In Context, is a database of images of people in real, unconstrained environments, annotated with their apparent emotions. Each person is annotated with an extended list of 26 discrete emotion categories combined with the three common continuous dimensions: Valence, Arousal, and Dominance.
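The annotation scheme above (a multi-label set of discrete categories plus three continuous dimensions per person) can be sketched as a simple record type. This is an illustrative schema only, not the dataset's actual file format: the field names, the bounding-box convention, and the rating scale are assumptions, while the 26 category names follow the EMOTIC paper.

```python
from dataclasses import dataclass
from typing import List, Tuple

# The 26 discrete emotion categories defined in the EMOTIC paper.
EMOTION_CATEGORIES = [
    "Affection", "Anger", "Annoyance", "Anticipation", "Aversion",
    "Confidence", "Disapproval", "Disconnection", "Disquietment",
    "Doubt/Confusion", "Embarrassment", "Engagement", "Esteem",
    "Excitement", "Fatigue", "Fear", "Happiness", "Pain", "Peace",
    "Pleasure", "Sadness", "Sensitivity", "Suffering", "Surprise",
    "Sympathy", "Yearning",
]

@dataclass
class PersonAnnotation:
    """Hypothetical per-person annotation record (illustrative only)."""
    bbox: Tuple[int, int, int, int]  # assumed (x1, y1, x2, y2) person box
    categories: List[str]            # subset of the 26 discrete labels
    valence: float                   # continuous dimensions; the scale
    arousal: float                   # shown here is an assumption
    dominance: float

    def category_vector(self) -> List[int]:
        """Multi-hot encoding over the 26 categories."""
        return [1 if c in self.categories else 0 for c in EMOTION_CATEGORIES]

# Example: one annotated person with two active categories.
ann = PersonAnnotation(bbox=(10, 20, 110, 220),
                       categories=["Happiness", "Excitement"],
                       valence=8.0, arousal=6.5, dominance=5.0)
print(sum(ann.category_vector()))  # → 2
```

A multi-hot category vector plus a 3-vector of continuous values is the natural target representation for training a model on this kind of annotation.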
The motivation of this project is to provide machines with the ability to understand what a person is experiencing from that person's frame of reference. This capacity is essential in everyday life in order to perceive, anticipate, and respond with care to people's reactions; machines with this type of ability could therefore interact better with people.
While remarkable improvements have been achieved in emotion recognition from facial expressions or body posture, there are no systems that incorporate contextual information, meaning the person's situation and surroundings. We expect that the EMOTIC dataset, in combination with previous datasets for emotion estimation, will open the door to new approaches to the problem of emotion estimation in the wild from visual information.
This work is partly supported by the Ministerio de Economía, Industria y Competitividad (Spain), TIN2015-66951-C2-2-R.
We thank NVIDIA for their generous hardware donations.
R. Kosti, J.M. Álvarez, A. Recasens and A. Lapedriza, "Context Based Emotion Recognition Using EMOTIC Dataset", IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2019. (pdf, bibtex)