Force and vibrotactile integration for 3D user interaction within virtual environments
Type: Conference paper with proceedings
Proper integration of sensory cues facilitates 3D user interaction within virtual environments (VEs). Studies on multi-sensory integration of visual and haptic cues revealed that the integration follows maximum likelihood estimation (MLE). Little effort, however, has focused on integrating force and vibrotactile cues - two sub-categorical cues of the haptic modality. Hence, this paper presents an investigation into MLE's suitability for integrating these sub-categorical cues. Within a stereoscopic VE, human users performed a 3D interactive task of navigating a flying drone along a high-voltage transmission line in an inaccessible region and identifying defects on the line. The users had to identify defects via individual force or vibrotactile cues, and via their combinations in co-located and dislocated settings. The co-located setting provided both cues on the right hand of the users, whereas the dislocated setting delivered the force and vibrotactile cues on the right hand and forearm of the users, respectively. Task performance of the users, such as completion time and accuracy, was assessed under each cue and setting. The presence of the vibrotactile cue promoted better performance than the force cue alone. This was in agreement with the role of tactile cues in sensing surface properties, herein setting a baseline for using MLE. Task performance under the co-located setting indicated a certain degree of combination of the performances under the individual cues. In contrast, performance under the dislocated setting was similar to that under the individual vibrotactile cue. These observations suggest that MLE is inconclusive for integrating both cues in a co-located setting for 3D user interaction.
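The MLE model of cue integration referenced in the abstract predicts that two noisy sensory estimates are combined by inverse-variance weighting, so the more reliable cue dominates and the combined variance is lower than either cue's alone. The sketch below illustrates that standard model; the function name and the numerical values are illustrative assumptions, not data or code from the paper.

```python
def mle_integrate(est_a, var_a, est_b, var_b):
    """Combine two noisy cue estimates by inverse-variance weighting (MLE).

    Each cue is weighted by its reliability (1/variance); the combined
    variance is never larger than that of either individual cue.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    combined_est = w_a * est_a + w_b * est_b
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return combined_est, combined_var


# Hypothetical example: a force cue (estimate 10.0, variance 4.0) and a
# vibrotactile cue (estimate 12.0, variance 1.0); the less noisy
# vibrotactile cue dominates the combined estimate.
est, var = mle_integrate(10.0, 4.0, 12.0, 1.0)
print(est, var)  # 11.6 0.8
```

Under this model, the paper's finding that dislocated-setting performance tracked the vibrotactile cue alone would correspond to a degenerate weighting rather than the predicted optimal combination.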
Related documents (by title, author, creator, and subject):
Mechanism of Integrating Force and Vibrotactile Cues for 3D User Interaction within Virtual Environments ERFANIAN, Aïda; TARNG, Stanley; HU, Yaoping; PLOUZEAU, Jérémy; MERIENNE, Frédéric (IEEE, 2017)Proper integration of sensory cues facilitates 3D user interaction within virtual environments (VEs). Studies showed that the integration of visual and haptic cues follows maximum likelihood estimation (MLE). Little effort ...
Navigation in virtual environments: Design and comparison of two anklet vibration patterns for guidance PLOUZEAU, Jérémy; ERFANIAN, Aïda; CHIU, Cynthia; MERIENNE, Frédéric; HU, Yaoping (IEEE, 2016)In this study, we present a preliminary exploration about the added value of vibration information for guiding navigation in a VE. The exploration consists of two parts. Firstly, we designed two different vibration patterns. ...
ASSILA, Ahlem; PLOUZEAU, Jérémy; MERIENNE, Frédéric; ERFANIAN, Aïda; HU, Yaoping (Springer, 2017)Navigation is a key factor for immersion and exploration in a virtual environment (VE). Nevertheless, measuring navigation performance is not an easy task, especially when analyzing and interpreting heterogeneous results of ...
TARNG, Stanley; MERIENNE, Frédéric; HU, Yaoping; ERFANIAN, Aïda (IEEE, 2018)Vibrotactile and force cues of the haptic modality are increasingly used to facilitate interactive tasks in three-dimensional (3D) virtual environments (VEs). While maximum likelihood estimation (MLE) explains the integration ...
TARNG, Stanley; ERFANIAN, Aida; HU, Yaoping; MERIENNE, Frederic (IEEE, 2018-05-09)In a three-dimensional (3D) virtual environment (VE), proper collaboration between vibrotactile and force cues - two cues of the haptic modality - is important to facilitate task performance of human users. Many studies ...