Show simple item record

dc.contributor.author  BUISINE, Stéphanie
hal.structure.identifier  127758 Laboratoire Conception de Produits et Innovation [LCPI]
dc.contributor.author  WANG, Yun
dc.contributor.author  GRYNSZPAN, Ouriel
dc.date.accessioned  2013
dc.date.available  2013
dc.date.issued  2010
dc.date.submitted  2013
dc.identifier.citation  Journal on Multimodal User Interfaces, 3, 263-270
dc.identifier.issn  1783-7677
dc.identifier.uri  http://hdl.handle.net/10985/6760
dc.description  Link to the original article: http://link.springer.com/article/10.1007/s12193-010-0050-4
dc.description.abstract  Behavior models implemented within Embodied Conversational Agents (ECAs) require nonverbal communication to be tightly coordinated with speech. In this paper we present an empirical study exploring the influence of the temporal coordination between speech and facial expressions of emotions on the perception of these emotions by users (measuring their recognition performance, the perceived realism of behavior, and user preferences). We generated five conditions of temporal coordination between facial expression and speech: facial expression displayed before a speech utterance, at the beginning of the utterance, throughout the utterance, at the end of the utterance, or following it. 23 subjects participated in the experiment and saw these 5 conditions applied to the display of 6 emotions (fear, joy, anger, disgust, surprise and sadness). Subjects recognized emotions most efficiently when facial expressions were displayed at the end of the spoken sentence. However, the combination users viewed as most realistic, and preferred over the others, was the display of the facial expression throughout the speech utterance. We review existing literature to position our work and discuss the relationship between realism and communication performance. We also provide animation guidelines and outline avenues for future work.
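(The record itself contains no code; as a rough illustration of the five timing conditions described in the abstract, the minimal Python sketch below schedules a facial expression relative to a speech utterance. The names and the fixed-offset timing model are illustrative assumptions, not the authors' implementation.)

    from dataclasses import dataclass

    @dataclass
    class Schedule:
        start: float     # seconds relative to utterance onset (t = 0)
        duration: float  # seconds

    def schedule_expression(condition: str, utterance_dur: float,
                            expression_dur: float) -> Schedule:
        # Map each coordination condition from the study to a start time;
        # the condition names are our own shorthand for the five cases.
        if condition == "before":      # expression ends as speech begins
            return Schedule(-expression_dur, expression_dur)
        if condition == "beginning":   # expression starts with speech
            return Schedule(0.0, expression_dur)
        if condition == "throughout":  # expression spans the whole utterance
            return Schedule(0.0, utterance_dur)
        if condition == "end":         # expression ends with speech
            return Schedule(utterance_dur - expression_dur, expression_dur)
        if condition == "after":       # expression starts as speech ends
            return Schedule(utterance_dur, expression_dur)
        raise ValueError(f"unknown condition: {condition}")

    # Example: a 1 s expression of joy against a 3 s utterance, "end" condition
    print(schedule_expression("end", 3.0, 1.0))  # Schedule(start=2.0, duration=1.0)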
dc.language.iso  en
dc.publisher  Springer
dc.rights  Post-print
dc.title  Empirical investigation of the temporal relations between speech and facial expressions of emotion
dc.identifier.doi  10.1007/s12193-010-0050-4
dc.typdoc  Article in a peer-reviewed journal
dc.localisation  Centre de Paris
dc.subject.hal  Computer Science: Human-Computer Interaction
dc.subject.hal  Computer Science: Modeling and Simulation
ensam.audience  International
ensam.page  3, 263-270
ensam.journal  Journal on Multimodal User Interfaces
hal.identifier  hal-00786508
hal.version  1
hal.submission.permitted  updateMetadata
hal.status  accept
dc.identifier.eissn  1783-8738


