“The NAO robot enables rapid prototyping of social behaviors, such as emotions, without having to struggle with low-level technical details.”

When humans communicate with each other, they do not use speech alone to convey the content of a message. At the same time, they employ a large variety of emotional and social signals that, consciously or unconsciously, express additional information, for example about their attitude towards the conversational partner, their level of attention, and their personality.

As robots are expected to interact with humans in social environments such as hospitals, schools, or the home, a comfortable and intuitive way of communicating with them has to be established. The ability to display emotions is a key feature of human communication. In recent years, considerable progress has been made in expressing emotions with humanoid robots, mainly through facial expressions. For expressions based on Body Movement and other signals, such as Sound, no common ground has been established so far.

Over the past ten years, our lab has been working on the simulation of human-like behaviours in synthetic agents. On the one hand, we employ machine learning methods to analyse social signals, such as emotions, conveyed by humans interacting with synthetic agents. On the other hand, we generate expressive behaviours by synchronizing speech, facial expressions, gaze, gestures and posture. For example, a synthetic agent may talk about a particular object and at the same time gaze and point at it.

Physical embodiment via a robot opens up a number of new research challenges for us. Unlike virtual agents, robotic agents cohabit with humans in the physical space in which social interactions take place. The NAO robot enables rapid prototyping of social behaviours, such as emotions, without having to struggle with low-level technical details.

Based on psychological research on the human expression of emotions and the perception of emotional stimuli, we created eight different expressional designs for the emotions Anger, Sadness, Fear and Joy, each consisting of Body Movements, Sounds and Eye Colours, using the NAO robotic platform.

In our main experiment we separated the expressional designs into their single cues (Body Movement, Sound, Eye Colour) and evaluated their expressivity within the Pleasure-Arousal-Dominance model using the Self-Assessment Manikin (SAM). We based our expressional designs on psychological research on the connection of emotions to body movement, sound and colour. While some movements are attributed to one specific emotion, others are attributed to a group of related emotions. These findings are important guidelines for designing emotion expression based on body movements.
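To make the decomposition concrete, here is a purely illustrative sketch, with hypothetical file names and values rather than our actual designs, of how one design splits into the single-cue stimuli that participants rate with SAM:

design = {
    "emotion": "Joy",
    "cues": {
        "BodyMovement": "joy_movement.xar",  # hypothetical Choregraphe behaviour
        "Sound": "joy_cheering.wav",         # hypothetical sound file
        "EyeColour": 0x00FFFF00,             # yellow, as a 0x00RRGGBB value
    },
}

def single_cue_stimuli(design):
    """Split a full design into one stimulus per cue for separate rating."""
    for cue, value in design["cues"].items():
        yield {"emotion": design["emotion"], "cue": cue, "value": value}

# Each isolated stimulus is then rated on Pleasure, Arousal and Dominance.
for stimulus in single_cue_stimuli(design):
    print(stimulus)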

This approach forms the basis for our effort to create a large pool of validated emotional expressions for the NAO robotic platform, as well as for the identification and creation of rules and guidelines for combining expressive cues into proper emotional expressions. The connection between sound and emotions is very complex. There is evidence that specific acoustic features are important for emotional communication, but Bachorowski and Owren*, for example, support the perspective that emotional vocal expressions aim more to affect the listener than to express an inner emotional state. In most cases we decided to use human- or animal-like acoustic expressions that are commonly associated with these emotions, such as crying for Sadness, frantic noises for Anger, or cheering for Joy.

First we created the Body Movements. Then we modified our Sounds so they met the timing of the movements. Finally, the Eye Colours were added. NAO's behaviour modelling software allowed a simple synchronization of these modalities, so we did not have to implement our own solution. Because NAO is so easy to use, we were also able to employ it successfully in education. We observed that working with NAO has a highly motivating effect on our students, and many of them were very interested in conducting student projects with NAO, for example within a bachelor's or master's thesis, after having attended a lecture about it. In this way they are able to be part of our ongoing research by designing new expressive behaviours such as emotions or dialog gestures.
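As an illustration of this kind of modality synchronization, the sketch below uses the NAOqi Python SDK (described next) to launch a Sound and an Eye Colour asynchronously and run a Body Movement alongside them. The robot address, file path, joint angles and colour are placeholders, not one of our actual designs.

from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # placeholder address
PORT = 9559

motion = ALProxy("ALMotion", ROBOT_IP, PORT)
leds = ALProxy("ALLeds", ROBOT_IP, PORT)
audio = ALProxy("ALAudioPlayer", ROBOT_IP, PORT)

motion.setStiffnesses("Body", 1.0)  # enable the joints before moving

# 'post' starts a call asynchronously, so the Sound and the Eye Colour
# unfold in parallel with the Body Movement below.
audio.post.playFile("/home/nao/sounds/joy_cheer.wav")  # hypothetical file
leds.post.fadeRGB("FaceLeds", 0x00FFFF00, 0.5)  # fade the eyes to yellow

names = ["LShoulderPitch", "RShoulderPitch"]
angles = [[-1.0], [-1.0]]  # raise both arms (radians)
times = [[1.0], [1.0]]     # reach the target after one second
motion.angleInterpolation(names, angles, times, True)  # blocking call

Choregraphe achieves the same effect graphically on its timeline, which is why we rarely need hand-written synchronization code.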

We mostly use Choregraphe to model the body movements and to synchronize them with the corresponding sounds. For more complex behaviours, we implement Python scripts using the SDK. Recently we started to use NAOSim so that our students are able to work on their projects at home.

So far, we have focused on the simulation of emotional behaviours in NAO using gestures, sound and colour effects. An empirical study revealed that the expressive behaviours in NAO were in most cases correctly interpreted by human observers. Our future research will concentrate on the implementation of reactive social behaviours: NAO should be able to analyse social signals from the human conversational partner and respond to them in real time. Another topic of our future research is the realization of multi-party dialog between several robots and several humans. In this context we are currently studying the appropriate use of gaze and pointing behaviours and working on the gestures the robot needs to become a capable participant in a multi-party dialog.
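As a rough indication of where such reactive behaviours could start, the sketch below polls NAOqi's ALMemory for the FaceDetected event and answers with speech. The address is a placeholder, and the polling loop is a deliberate simplification of the callback-based event subscription the SDK also offers; it is not our implementation.

import time
from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559  # placeholder address

memory = ALProxy("ALMemory", ROBOT_IP, PORT)
tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
face = ALProxy("ALFaceDetection", ROBOT_IP, PORT)

face.subscribe("ReactiveDemo", 500, 0.0)  # run the face detector every 500 ms
try:
    while True:
        data = memory.getData("FaceDetected")
        if data:  # a non-empty value means a face is currently visible
            tts.say("Hello!")
            time.sleep(5.0)  # simple cooldown so the robot does not repeat itself
        time.sleep(0.2)
finally:
    face.unsubscribe("ReactiveDemo")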

If you want to follow our research, take a look at our project page www.hcm-lab.de, where most of our work is available for download in the form of behaviour files for Choregraphe and Python scripts.

* J. A. Bachorowski and M. J. Owren. Sounds of emotion: production and perception of affect-related vocal acoustics. Annals of the New York Academy of Sciences, 1000:244–265, December 2003.

Markus Häring – Augsburg University (Germany)
