Final presentation of the master thesis
Multimodal Adaptive Dialogue Management in OwlSpeak
Louisa Pragst (Supervisor: Stefan Ultes)
Wednesday, October 21, 2015, 2:45 pm
Uni West, Room 43.2.227
Spoken dialogue systems are employed in human-computer interaction to support humans' natural mode of communication. Multimodal dialogue systems can provide an even more human-like interaction by utilising additional communication channels besides speech, such as gestures, gaze, or facial expressions. The ability to adapt the dialogue to the task, the situation, or the user can further improve the user experience. This work focuses on the challenges that multimodal, adaptive dialogue systems pose for dialogue management. A dialogue manager is the component of a dialogue system that chooses the next system action based on the user action and the dialogue history. Additional input can be provided to the dialogue manager to influence its decision, thus enabling adaptation. This thesis examines models of emotion and culture, and the potential of employing them in dialogue management. Following these considerations, it describes adjustments to the OwlSpeak dialogue manager that enable it to handle these adaptations. Moreover, it illustrates the integration of the speech-based OwlSpeak dialogue manager into a multimodal dialogue system. Finally, a user study is presented that determines the impact of the implemented adaptations on the perceived naturalness of human-computer interaction.