Exploring Facial Expressions for Human-Computer Interaction: Combining Visual Face Tracking and EMG Data to Control a Flight Simulation Game
Symposium:
Session Title:
- Interacting with the Virtual
Presentation Title:
- Exploring Facial Expressions for Human-Computer Interaction: Combining Visual Face Tracking and EMG Data to Control a Flight Simulation Game
Presenter(s):
Abstract:
(Long paper)
Keywords: facial expression, facial EMG signal, emotion, computer vision system
In many affective computing paradigms, a user’s internal state is used as an implicit control signal in an interaction. In the work presented here, we explore the use of two measurement techniques commonly employed to assess a user’s affective state as an explicit control signal in a navigation task in a virtual environment. Concretely, we investigate the feasibility of combining a real-time emotional biometric sensing system with a computer vision system for characterizing human emotion and controlling a computer game. A user’s “happiness” and “sadness” levels are assessed by combining information from a camera-based computer vision system with electromyogram (EMG) signals from the facial corrugator muscle. Using a purpose-designed 3D flight simulation game, users control their simulated up-down motion with their facial expressions. To assess whether combining visual and EMG data improves facial tracking performance, we conducted a user study in which users navigated through the 3D visual environment using the two control systems, trying to collect as many tokens as possible. We compared two conditions: the computer vision system alone, and the computer vision system in combination with the EMG signal. The results show that combining both signals significantly increases users’ performance and reduces task difficulty. However, this performance increase comes with reduced usability due to the need to wear EMG sensors on one’s forehead. We hope these results can inform future game designs, aid the development of more immersive virtual environments, and offer alternative input methods where traditional methods are insufficient or unfeasible.
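The abstract does not specify how the camera-based expression estimate and the EMG corrugator signal are combined into a control value. The sketch below is a minimal, hypothetical Python illustration of one plausible scheme, a weighted fusion of a camera-derived valence score and a rectified, smoothed EMG envelope mapped to an up/down pitch command; all names (FusionConfig, ExpressionController, pitch_command), weights, and normalization assumptions are illustrative and are not taken from the paper.

    # Hypothetical sketch: fuse a camera-based expression score with a rectified,
    # smoothed EMG corrugator signal into a single up/down (pitch) command.
    # Names, weights, and scaling are illustrative assumptions, not the authors' method.

    from dataclasses import dataclass


    @dataclass
    class FusionConfig:
        vision_weight: float = 0.6   # weight given to the camera-based estimate
        emg_weight: float = 0.4      # weight given to the EMG-derived estimate
        smoothing: float = 0.2       # exponential smoothing factor for the EMG envelope


    class ExpressionController:
        """Maps fused 'happiness' vs. 'sadness' estimates to a pitch command."""

        def __init__(self, config: FusionConfig) -> None:
            self.config = config
            self._emg_envelope = 0.0

        def update_emg(self, raw_emg_sample: float) -> float:
            # Rectify and exponentially smooth the corrugator EMG sample
            # (frowning increases corrugator activity, treated here as "sadness").
            rectified = abs(raw_emg_sample)
            a = self.config.smoothing
            self._emg_envelope = a * rectified + (1.0 - a) * self._emg_envelope
            return self._emg_envelope

        def pitch_command(self, vision_valence: float, raw_emg_sample: float) -> float:
            """vision_valence in [-1, 1]: +1 = smiling, -1 = frowning (camera estimate).

            Returns a pitch command in [-1, 1]: positive = climb, negative = descend.
            """
            emg_env = self.update_emg(raw_emg_sample)
            # The EMG envelope is assumed normalized to [0, 1]; corrugator activity
            # contributes a negative (descend) component.
            emg_valence = -min(max(emg_env, 0.0), 1.0)
            fused = (self.config.vision_weight * vision_valence
                     + self.config.emg_weight * emg_valence)
            return min(max(fused, -1.0), 1.0)


    if __name__ == "__main__":
        controller = ExpressionController(FusionConfig())
        # Simulated samples: the camera sees a slight smile while the EMG briefly
        # picks up a frown, so the two sources partially cancel each other out.
        for vision, emg in [(0.4, 0.0), (0.4, 0.8), (0.2, 0.9), (0.6, 0.1)]:
            print(f"pitch command: {controller.pitch_command(vision, emg):+.2f}")

In a fusion of this kind, the relative weights determine how strongly the EMG channel can override the camera estimate, which is one way the combined condition in the study could yield finer control than vision alone.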
Related Links:
Full text (PDF) p. 451-457