Empirical investigation of the temporal relations between speech and facial expressions of emotion


dc.contributor.author BUISINE, Stéphanie
ensam.hal.laboratories
  127758 Laboratoire Conception de Produits et Innovation [LCPI]
dc.contributor.author WANG, Yun
dc.contributor.author GRYNSZPAN, Ouriel
dc.date.accessioned 2013-02-08T16:07:03Z
dc.date.available 2013-02-08T16:07:03Z
dc.date.issued 2010
dc.date.submitted 2013-02-05T08:32:38Z
dc.date.submitted 2013-02-08T13:40:13Z
dc.identifier.citation Journal on Multimodal User Interfaces, 3, 263-270
dc.identifier.uri http://hdl.handle.net/10985/6760
dc.description Link to the original article: http://link.springer.com/article/10.1007/s12193-010-0050-4 en_US
dc.description.abstract Behavior models implemented within Embodied Conversational Agents (ECAs) require nonverbal communication to be tightly coordinated with speech. In this paper we present an empirical study exploring how the temporal coordination between speech and facial expressions of emotion influences users' perception of these emotions (measuring their performance in this task, the perceived realism of behavior, and user preferences). We generated five conditions of temporal coordination between facial expression and speech: facial expression displayed before a speech utterance, at the beginning of the utterance, throughout it, at the end of it, or after it. 23 subjects participated in the experiment and saw these 5 conditions applied to the display of 6 emotions (fear, joy, anger, disgust, surprise and sadness). Subjects recognized emotions most efficiently when facial expressions were displayed at the end of the spoken sentence. However, the combination users viewed as most realistic, and preferred over the others, was the display of the facial expression throughout the speech utterance. We review existing literature to position our work and discuss the relationship between realism and communication performance. We also provide animation guidelines and outline avenues for future work. en_US
dc.language.iso en en_US
dc.publisher Springer en_US
dc.rights Post-print en_US
dc.title Empirical investigation of the temporal relations between speech and facial expressions of emotion en_US
ensam.hal.id hal-00786508
ensam.hal.status accept
dc.identifier.doi 10.1007/s12193-010-0050-4
dc.typdoc Articles in peer-reviewed journals en_US
dc.localisation Centre de Paris en_US
dc.subject.hal Computer Science: Human-Computer Interaction en_US
dc.subject.hal Computer Science: Modeling and Simulation en_US
ensam.audience Internationale en_US
ensam.page 3, 263-270 en_US
ensam.journal Journal on Multimodal User Interfaces en_US
