This article is part of the series Multimedia Human-Computer Interface.

Open Access Research Article

Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

Christine Lætitia Lisetti1* and Fatma Nasoz2

Author Affiliations

1 Department of Multimedia Communications, Institut Eurecom, Sophia-Antipolis, 06904, France

2 Department of Computer Science, University of Central Florida, Orlando, FL 32816-2362, USA

EURASIP Journal on Advances in Signal Processing 2004, 2004:929414  doi:10.1155/S1110865704406192


The electronic version of this article is the complete one and can be found online at: http://asp.eurasipjournals.com/content/2004/11/929414


Received: 30 July 2002
Revisions received: 14 April 2004
Published: 18 September 2004

© 2004 Lisetti and Nasoz

This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims to recognize its users' emotions and to respond to them appropriately depending on the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. We finally discuss the possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
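The classification step described in the abstract, mapping physiological feature vectors (galvanic skin response, heart rate, temperature) to emotion labels via supervised learning, can be illustrated with a minimal sketch. The sketch below uses a simple 1-nearest-neighbor rule; the specific algorithms, feature scaling, and all numeric values here are illustrative assumptions, not the authors' actual method or data.

```python
# Hypothetical sketch: assigning an emotion label to a physiological
# feature vector (GSR, heart rate, skin temperature) with a
# 1-nearest-neighbor rule. All feature values are made up for
# illustration and assumed normalized to [0, 1].
import math


def nearest_neighbor(train, query):
    """Return the emotion label of the training sample closest to query.

    train: list of ((gsr, heart_rate, temp), emotion_label) pairs
    query: (gsr, heart_rate, temp) feature vector
    """
    def dist(a, b):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(train, key=lambda sample: dist(sample[0], query))[1]


# Illustrative (fabricated) labeled training vectors, one per emotion.
training_set = [
    ((0.9, 0.8, 0.3), "fear"),
    ((0.7, 0.9, 0.6), "anger"),
    ((0.2, 0.3, 0.5), "sadness"),
    ((0.5, 0.6, 0.7), "amusement"),
]

# A new, unlabeled signal collection is classified by its nearest neighbor.
print(nearest_neighbor(training_set, (0.85, 0.82, 0.35)))  # → fear
```

In practice the article's approach would involve richer features extracted from continuous signals and more capable learners, but the sketch shows the core supervised mapping from signal-derived features to discrete emotion categories.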

Keywords:
multimodal human-computer interaction; emotion recognition; multimodal affective user interfaces
