Tools and measures for the computational modeling of human nonverbal behaviors

Modern human-machine interaction systems require knowledge about their environment, the interaction itself, and their interlocutors' behavior in order to show appropriate nonverbal behavior and to adapt dialog policies accordingly. I will present some recent achievements in automatic behavior recognition and understanding that provide information about the interactants' multimodal nonverbal behavior and, subsequently, their affective states. In particular, I will present means to evaluate and measure rapport and entrainment between interlocutors based on a large dyadic corpus, along with a robust, easily deployable, speaker-independent method to detect pause fillers.
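To make the notion of entrainment concrete, one common proxy is the correlation between interlocutors' prosodic feature tracks over time-aligned windows. The following is a minimal sketch of that idea; the per-window pitch values are illustrative assumptions, not data from the corpus described in the talk, and the actual measures presented may differ.

```python
# Sketch of a simple entrainment proxy: Pearson correlation between
# two speakers' prosodic feature tracks (here, hypothetical
# per-window mean pitch values in Hz, aligned in time).

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative per-window mean pitch (Hz) for two interlocutors.
speaker_a = [210.0, 215.0, 205.0, 220.0, 212.0]
speaker_b = [180.0, 186.0, 178.0, 191.0, 183.0]

# A value near 1.0 would suggest the speakers' pitch trajectories
# move together, one possible sign of prosodic entrainment.
print(round(pearson(speaker_a, speaker_b), 3))
```

In practice such a score would be computed over many feature streams (pitch, intensity, speaking rate) and compared against chance baselines; this snippet only shows the core computation.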

In this presentation, I will further introduce the perception markup language (PML), a first step towards a standardized representation of perceived nonverbal behaviors. PML follows several design concepts, namely compatibility and synergy, modeling of uncertainty, multiple interpretative layers, and extensibility, in order to maximize its usefulness for the research community. I will show how we successfully integrated PML into a fully automated virtual agent system for healthcare applications developed at the USC Institute for Creative Technologies. In our system, PML is closely interwoven with the decision processes of the dialog manager and with the nonverbal behavior generation components steering the virtual agent, enabling the agent to behave in a more natural and responsive way.
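To illustrate the design concepts above, a PML-style message might attach an explicit confidence score to each perceived behavior and group observations into interpretative layers. The element and attribute names in this sketch are assumptions for illustration only, not the actual PML schema presented in the talk.

```python
# Hypothetical sketch of a PML-style message for one perceived
# nonverbal behavior. Tag and attribute names are illustrative
# assumptions, not the real PML specification.
import xml.etree.ElementTree as ET

pml = ET.Element("pml")

# A layer groups observations at one interpretative level
# (e.g. low-level behaviors vs. higher-level affective states).
layer = ET.SubElement(pml, "layer", {"type": "behavior"})

# Each observation carries a confidence value, reflecting PML's
# design goal of representing the uncertainty of perception.
ET.SubElement(layer, "observation", {
    "name": "head-nod",
    "start": "12.40",
    "end": "13.10",
    "confidence": "0.82",
})

print(ET.tostring(pml, encoding="unicode"))
```

A consumer such as a dialog manager could then weight its reactions by the confidence attribute rather than treating every detection as certain.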


Information

Speaker

Dr. Stefan Scherer
Multimodal Communication and Computation Laboratory
Institute for Creative Technologies
University of Southern California, Los Angeles, USA

Date

Monday, July 16, 2012, 4 p.m. c.t.

Location

Ulm University, N27, Room 2.033 (video transmission to Otto von Guericke University Magdeburg, G26.1-010)