IEMOCAP: interactive emotional dyadic motion capture database

5 November 2008 | Carlos Busso · Murtaza Bulut · Chi-Chun Lee · Abe Kazemzadeh · Emily Mower · Samuel Kim · Jeannette N. Chang · Sungbok Lee · Shrikanth S. Narayanan
The paper introduces the Interactive Emotional Dyadic Motion Capture database (IEMOCAP), a new audio-visual database designed to facilitate the study of human communication through both verbal and non-verbal channels. Collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC), IEMOCAP contains detailed motion capture data from ten actors recorded in dyadic sessions. The actors performed scripted emotional scenarios and improvised hypothetical dialogues designed to elicit specific emotions such as happiness, anger, sadness, frustration, and neutral states. The database comprises approximately 12 hours of data, with markers placed on the face, head, and hands to capture facial expressions and hand movements. This comprehensive and authentic dataset addresses the limitations of existing emotional databases, which often consist of isolated utterances and lack both genuine interaction and detailed motion capture information. The IEMOCAP database is expected to support the development and deployment of robust emotion recognition models in real-life applications.
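The corpus is commonly distributed with plain-text evaluation files that map each utterance to a categorical emotion label and dimensional ratings (valence, activation, dominance). Below is a minimal sketch of how one might load such annotations, assuming per-dialog files whose lines follow the pattern `[start - end] <utterance_id> <emotion> [V, A, D]`; this layout, and all names in the code such as `parse_emo_evaluation` and `Utterance`, are illustrative assumptions, not something the paper itself specifies.

```python
import re
from dataclasses import dataclass
from pathlib import Path

# Matches annotation lines of the assumed form:
#   [6.2901 - 8.2357]  Ses01F_impro01_F000  neu  [2.5000, 2.5000, 2.5000]
LINE_RE = re.compile(
    r"\[(?P<start>[\d.]+) - (?P<end>[\d.]+)\]\s+"
    r"(?P<utt_id>\S+)\s+(?P<emotion>\w+)\s+"
    r"\[(?P<val>[\d.]+), (?P<act>[\d.]+), (?P<dom>[\d.]+)\]"
)

@dataclass
class Utterance:
    utt_id: str        # e.g. "Ses01F_impro01_F000" (hypothetical ID scheme)
    start: float       # segment start time (seconds)
    end: float         # segment end time (seconds)
    emotion: str       # categorical label, e.g. "neu", "ang", "sad", "fru"
    valence: float     # dimensional ratings (assumed 1-5 scale)
    activation: float
    dominance: float

def parse_emo_evaluation(path: Path) -> list[Utterance]:
    """Parse one per-dialog annotation file into utterance records."""
    records = []
    for line in path.read_text().splitlines():
        m = LINE_RE.match(line)
        if m:  # skip headers and any lines that don't carry utterance labels
            records.append(Utterance(
                utt_id=m["utt_id"],
                start=float(m["start"]),
                end=float(m["end"]),
                emotion=m["emotion"],
                valence=float(m["val"]),
                activation=float(m["act"]),
                dominance=float(m["dom"]),
            ))
    return records

if __name__ == "__main__":
    # Hypothetical file name; the real release is organized per session/dialog.
    for utt in parse_emo_evaluation(Path("Ses01F_impro01.txt")):
        print(utt.utt_id, utt.emotion, utt.valence)
```

Grouping such records by emotion label would, for example, let one inspect class balance before training a recognizer on the corpus.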