<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
<channel>
<title>SAM</title>
<link>https://sam.ensam.eu:443</link>
<description>The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.</description>
<pubDate xmlns="http://apache.org/cocoon/i18n/2.1">Fri, 10 Apr 2026 08:33:17 GMT</pubDate>
<dc:date>2026-04-10T08:33:17Z</dc:date>
<item>
<title>Expressive Virtual Human: Impact of expressive wrinkles and pupillary size on emotion recognition</title>
<link>http://hdl.handle.net/10985/17027</link>
<description>Expressive Virtual Human: Impact of expressive wrinkles and pupillary size on emotion recognition
MILCENT, Anne-Sophie; KADRI, Abdelmajid; GESLIN, Erik; RICHIR, Simon
Improving the expressiveness of virtual humans is essential for high-quality interactions and for developing an emotional bond. It is particularly relevant for applications that engage the user’s cognitive processes, such as applications dedicated to training or health. Our study aims to contribute to the design of an expressive virtual human by identifying and adapting visual factors that promote the transcription of emotions. In this paper, we investigate the effect of expressive wrinkles and of variations in pupil size. We compare the recognition of basic emotions on a real human and on an expressive virtual human, where the virtual human was subject to two factors: expressive wrinkles and/or pupil size. Our results indicate that emotion recognition rates on the virtual agent are high, and that expressive wrinkles affect emotion recognition, while the effect of pupillary size is less pronounced. Nevertheless, both are recommended when designing an expressive virtual human.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17027</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>MILCENT, Anne-Sophie</dc:creator>
<dc:creator>KADRI, Abdelmajid</dc:creator>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>RICHIR, Simon</dc:creator>
<dc:description>Improving the expressiveness of virtual humans is essential for high-quality interactions and for developing an emotional bond. It is particularly relevant for applications that engage the user’s cognitive processes, such as applications dedicated to training or health. Our study aims to contribute to the design of an expressive virtual human by identifying and adapting visual factors that promote the transcription of emotions. In this paper, we investigate the effect of expressive wrinkles and of variations in pupil size. We compare the recognition of basic emotions on a real human and on an expressive virtual human, where the virtual human was subject to two factors: expressive wrinkles and/or pupil size. Our results indicate that emotion recognition rates on the virtual agent are high, and that expressive wrinkles affect emotion recognition, while the effect of pupillary size is less pronounced. Nevertheless, both are recommended when designing an expressive virtual human.</dc:description>
</item>
<item>
<title>Bernardo Autonomous Emotional Agents Increase Perception of VR Stimuli</title>
<link>http://hdl.handle.net/10985/17735</link>
<description>Bernardo Autonomous Emotional Agents Increase Perception of VR Stimuli
GESLIN, Erik; BARTHEYE, Olivier; SCHMIDT, Colin; TCHA-TOKEY, Katy; KULSUWAN, Teerawat; KEZIZ, Salah; BELOUIN, Tanguy
Video games are powerful emotional vectors: they play with players’ emotions by eliciting and amplifying them. Inducing basic emotions has long been a mainstay favoured by video game publishers, as these emotions are quite easily mobilized; publishers now also seek to produce more complex social emotions such as empathy and compassion. In games with a narrative context, designers frequently borrow methods from cinema, such as non-interactive cutscenes. These methods temporarily exclude the player from interactivity, abandoning the first-person viewpoint to move the camera onto the narrative stimuli. Cutscenes were once used abundantly but are now often rejected, and the current development wave frequently aims for a “zero cinematic” approach. For the same reason, cinematics are also unusable in Virtual Reality. While VR games and simulations provide a high level of presence, VR environments require certain rules, related in particular to the preservation of free will and the avoidance of possible Breaks in Presence. We propose in this paper a concept of Emotionally Intelligent Virtual Avatars which, when they perceive an important narrative stimulus, share their emotions through gestures, nonverbal facial expressions, and declarative sentences to stimulate the player's attention, leading players to focus on the narrative stimuli. Our research studies the impact of the use of Bernardo emotional agent avatars in an experiment involving n = 51 users. The statistical analysis of the results shows a significant difference in the perception of the narrative stimuli and in Presence, correlated with the use of Bernardo Agents. Overall, our emotional Agent Bernardo is a unique concept for increasing the perception of narrative stimuli in virtual environments using an HMD, and may be useful in any virtual environment employing an emotional narrative process.
</description>
<pubDate>Wed, 01 Jan 2020 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17735</guid>
<dc:date>2020-01-01T00:00:00Z</dc:date>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>BARTHEYE, Olivier</dc:creator>
<dc:creator>SCHMIDT, Colin</dc:creator>
<dc:creator>TCHA-TOKEY, Katy</dc:creator>
<dc:creator>KULSUWAN, Teerawat</dc:creator>
<dc:creator>KEZIZ, Salah</dc:creator>
<dc:creator>BELOUIN, Tanguy</dc:creator>
<dc:description>Video games are powerful emotional vectors: they play with players’ emotions by eliciting and amplifying them. Inducing basic emotions has long been a mainstay favoured by video game publishers, as these emotions are quite easily mobilized; publishers now also seek to produce more complex social emotions such as empathy and compassion. In games with a narrative context, designers frequently borrow methods from cinema, such as non-interactive cutscenes. These methods temporarily exclude the player from interactivity, abandoning the first-person viewpoint to move the camera onto the narrative stimuli. Cutscenes were once used abundantly but are now often rejected, and the current development wave frequently aims for a “zero cinematic” approach. For the same reason, cinematics are also unusable in Virtual Reality. While VR games and simulations provide a high level of presence, VR environments require certain rules, related in particular to the preservation of free will and the avoidance of possible Breaks in Presence. We propose in this paper a concept of Emotionally Intelligent Virtual Avatars which, when they perceive an important narrative stimulus, share their emotions through gestures, nonverbal facial expressions, and declarative sentences to stimulate the player's attention, leading players to focus on the narrative stimuli. Our research studies the impact of the use of Bernardo emotional agent avatars in an experiment involving n = 51 users. The statistical analysis of the results shows a significant difference in the perception of the narrative stimuli and in Presence, correlated with the use of Bernardo Agents. Overall, our emotional Agent Bernardo is a unique concept for increasing the perception of narrative stimuli in virtual environments using an HMD, and may be useful in any virtual environment employing an emotional narrative process.</dc:description>
</item>
<item>
<title>Improving Humans: Enhancing the complex sociological being with the virtual</title>
<link>http://hdl.handle.net/10985/18835</link>
<description>Improving Humans: Enhancing the complex sociological being with the virtual
TCHA-TOKEY, Katy; SCHMIDT, Colin; GESLIN, Erik; RICHIR, Simon
In this paper, we argue in favour of using an immersive Virtual Environment (VE) to improve human capabilities, and we develop this idea to advance the potential of VEs in enhancing humans. Training with VEs has proven in some cases to be more efficient than training in real-world situations, in terms of reduced time consumption, reduced risk, and the ease of presenting specific simulation realism, all of which improve learning capabilities. The VE would thus be an environment for extending human capabilities, the goal being to broaden experience. This last claim could renew the scope of action for AH research: acquiring needed new capabilities from the virtual world that would be usable in both worlds, real and virtual.
</description>
<pubDate>Wed, 01 Jan 2020 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/18835</guid>
<dc:date>2020-01-01T00:00:00Z</dc:date>
<dc:creator>TCHA-TOKEY, Katy</dc:creator>
<dc:creator>SCHMIDT, Colin</dc:creator>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>RICHIR, Simon</dc:creator>
<dc:description>In this paper, we argue in favour of using an immersive Virtual Environment (VE) to improve human capabilities, and we develop this idea to advance the potential of VEs in enhancing humans. Training with VEs has proven in some cases to be more efficient than training in real-world situations, in terms of reduced time consumption, reduced risk, and the ease of presenting specific simulation realism, all of which improve learning capabilities. The VE would thus be an environment for extending human capabilities, the goal being to broaden experience. This last claim could renew the scope of action for AH research: acquiring needed new capabilities from the virtual world that would be usable in both worlds, real and virtual.</dc:description>
</item>
<item>
<title>EEVEE : the Empathy-Enhancing Virtual Evolving Environment</title>
<link>http://hdl.handle.net/10985/10570</link>
<description>EEVEE : the Empathy-Enhancing Virtual Evolving Environment
JACKSON, Philip L.; MICHON, Pierre-Emmanuel; GESLIN, Erik; CARIGNAN, Maxime; BEAUDOIN, Danny
Empathy is a multifaceted emotional and mental faculty that is often found to be affected in a great number of psychopathologies, such as schizophrenia, yet it remains very difficult to measure in an ecological context. The challenge stems partly from the complexity and fluidity of this social process, but also from its covert nature. One powerful tool to enhance experimental control over such dynamic social interactions has been the use of avatars in virtual reality (VR); information about an individual in such an interaction can be collected through the analysis of his or her neurophysiological and behavioral responses. We have developed a unique platform, the Empathy-Enhancing Virtual Evolving Environment (EEVEE), which is built around three main components: (1) different avatars capable of expressing feelings and emotions at various levels based on the Facial Action Coding System (FACS); (2) systems for measuring the physiological responses of the observer (heart and respiration rate, skin conductance, gaze and eye movements, facial expression); and (3) a multimodal interface linking the avatar's behavior to the observer's neurophysiological response. In this article, we provide a detailed description of the components of this innovative platform and validation data from the first phases of development. Our data show that healthy adults can discriminate different negative emotions, including pain, expressed by avatars at varying intensities. We also provide evidence that masking part of an avatar's face (top or bottom half) does not prevent the detection of different levels of pain. This innovative and flexible platform provides a unique tool to study and even modulate empathy in a comprehensive and ecological manner in various populations, notably individuals suffering from neurological or psychiatric disorders.
</description>
<pubDate>Thu, 01 Jan 2015 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/10570</guid>
<dc:date>2015-01-01T00:00:00Z</dc:date>
<dc:creator>JACKSON, Philip L.</dc:creator>
<dc:creator>MICHON, Pierre-Emmanuel</dc:creator>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>CARIGNAN, Maxime</dc:creator>
<dc:creator>BEAUDOIN, Danny</dc:creator>
<dc:description>Empathy is a multifaceted emotional and mental faculty that is often found to be affected in a great number of psychopathologies, such as schizophrenia, yet it remains very difficult to measure in an ecological context. The challenge stems partly from the complexity and fluidity of this social process, but also from its covert nature. One powerful tool to enhance experimental control over such dynamic social interactions has been the use of avatars in virtual reality (VR); information about an individual in such an interaction can be collected through the analysis of his or her neurophysiological and behavioral responses. We have developed a unique platform, the Empathy-Enhancing Virtual Evolving Environment (EEVEE), which is built around three main components: (1) different avatars capable of expressing feelings and emotions at various levels based on the Facial Action Coding System (FACS); (2) systems for measuring the physiological responses of the observer (heart and respiration rate, skin conductance, gaze and eye movements, facial expression); and (3) a multimodal interface linking the avatar's behavior to the observer's neurophysiological response. In this article, we provide a detailed description of the components of this innovative platform and validation data from the first phases of development. Our data show that healthy adults can discriminate different negative emotions, including pain, expressed by avatars at varying intensities. We also provide evidence that masking part of an avatar's face (top or bottom half) does not prevent the detection of different levels of pain. This innovative and flexible platform provides a unique tool to study and even modulate empathy in a comprehensive and ecological manner in various populations, notably individuals suffering from neurological or psychiatric disorders.</dc:description>
</item>
<item>
<title>How Color Properties Can Be Used to Elicit Emotions in Video Games</title>
<link>http://hdl.handle.net/10985/10569</link>
<description>How Color Properties Can Be Used to Elicit Emotions in Video Games
GESLIN, Erik; JEGOU, Laurent; BEAUDOIN, Danny
Classifying the many types of video games is difficult, as their genres and platforms differ, but they all share the goal of engaging the player through exciting emotions and challenges. Since the income of the video game industry exceeds that of the film industry, the field of inducing emotions through video games and virtual environments is attracting more attention. Our theory, widely supported by substantial literature, is that the intensity, brightness, and saturation of the chromatic stimuli in a video game environment produce an emotional effect on players. We observed a correlation between the RGB additive color space and the HSV, HSL, and HSI components of video game images, presented to n = 85 participants, and the emotional statements expressed in terms of arousal and valence, collected through a subjective semantic questionnaire. Our results show a significant correlation between luminance, saturation, and lightness and the emotions of joy, sadness, fear, and serenity experienced by participants viewing 24 video game images. We also show strong correlations between colorimetric diversity, saliency volume, and stimulus conspicuity and the emotions expressed by the players. These results allow us to propose video game environment development methods in the form of a circumplex model, aimed at game designers for developing emotional color scripting.
</description>
<pubDate>Fri, 01 Jan 2016 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/10569</guid>
<dc:date>2016-01-01T00:00:00Z</dc:date>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>JEGOU, Laurent</dc:creator>
<dc:creator>BEAUDOIN, Danny</dc:creator>
<dc:description>Classifying the many types of video games is difficult, as their genres and platforms differ, but they all share the goal of engaging the player through exciting emotions and challenges. Since the income of the video game industry exceeds that of the film industry, the field of inducing emotions through video games and virtual environments is attracting more attention. Our theory, widely supported by substantial literature, is that the intensity, brightness, and saturation of the chromatic stimuli in a video game environment produce an emotional effect on players. We observed a correlation between the RGB additive color space and the HSV, HSL, and HSI components of video game images, presented to n = 85 participants, and the emotional statements expressed in terms of arousal and valence, collected through a subjective semantic questionnaire. Our results show a significant correlation between luminance, saturation, and lightness and the emotions of joy, sadness, fear, and serenity experienced by participants viewing 24 video game images. We also show strong correlations between colorimetric diversity, saliency volume, and stimulus conspicuity and the emotions expressed by the players. These results allow us to propose video game environment development methods in the form of a circumplex model, aimed at game designers for developing emotional color scripting.</dc:description>
</item>
<item>
<title>Les humains virtuels expressifs dans les simulateurs en santé</title>
<link>http://hdl.handle.net/10985/17524</link>
<description>Les humains virtuels expressifs dans les simulateurs en santé
MILCENT, Anne-Sophie; KADRI, Abdelmajid; GESLIN, Erik; RICHIR, Simon
Healthcare simulators are considered interactive and playful learning environments offering many training opportunities. Medical students wish to train with virtual simulators for a few years to prepare for their encounters with real patients. The integration of virtual agents endowed with emotions promotes exchanges and interactions and provokes emotional reactions in the learner; this emotional involvement facilitates learning and memorization. Our research focuses on the impact of expressive virtual humans on the user experience, and more especially on the induction of empathy in learners.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17524</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>MILCENT, Anne-Sophie</dc:creator>
<dc:creator>KADRI, Abdelmajid</dc:creator>
<dc:creator>GESLIN, Erik</dc:creator>
<dc:creator>RICHIR, Simon</dc:creator>
<dc:description>Healthcare simulators are considered interactive and playful learning environments offering many training opportunities. Medical students wish to train with virtual simulators for a few years to prepare for their encounters with real patients. The integration of virtual agents endowed with emotions promotes exchanges and interactions and provokes emotional reactions in the learner; this emotional involvement facilitates learning and memorization. Our research focuses on the impact of expressive virtual humans on the user experience, and more especially on the induction of empathy in learners.</dc:description>
</item>
</channel>
</rss>
