Funded Projects

Persuasive Technologies in Highly Autonomous Cars to Increase Cooperation and Safety

Duration: 2016-2019
Funding by: Carl Zeiss Foundation
Cooperation Partners: Department of Human Factors, Ulm University (Prof. Dr. Martin Baumann)

Highly automated driving aims to radically increase driving safety, as human error accounts for over 90 percent of severe traffic accidents. Yet even when automation is mature enough to drive more safely than a human, it is assumed that drivers will still reclaim control of the vehicle at times, for example because of a lack of trust, a misunderstanding, or a mismatch between their own driving style and the vehicle's. In such cases, convincing the driver to keep the automation enabled would increase traffic safety. This project investigates technical systems that increase the use of automation.

SenseEmotion

Duration: 2015-2018
Funding by: Federal Ministry of Education and Research (BMBF)
Cooperation Partners: University of Augsburg, University Hospital Ulm

SenseEmotion aims to increase the overall quality of life of elderly and chronic pain patients. To this end, automatic pain recognition is improved, and a personalized affect management is developed to support pain patients and to reduce pain-related panic, anxiety, aggression and confusion. A multi-sensor approach detects pain, disorientation and confusion, as well as the resulting emotions of panic, anxiety and anger, based on paralinguistic, psychobiological and visual data. These data are also made available to caregivers and physicians for pain management and can be used to improve therapy. To further support pain patients, the system will include a virtual companion that calms and reassures them during a crisis. In daily life, the companion provides general help and assistance with their therapy.

Link to the project
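The project description does not spell out how the sensor channels are combined, but a common pattern for this kind of multi-sensor recognition is late fusion of per-modality classifier outputs. The following Python sketch illustrates the idea under that assumption; the modality names, weights and scores are hypothetical placeholders, not project values.

```python
# Illustrative late-fusion sketch: each modality is assumed to have its
# own classifier that outputs a pain probability in [0, 1]. The weights
# are hypothetical, not values from SenseEmotion.
MODALITY_WEIGHTS = {
    "paralinguistic": 0.3,    # audio cues such as moaning or breathing
    "psychobiological": 0.4,  # e.g. skin conductance, heart rate
    "visual": 0.3,            # facial expression from video
}

def fuse_pain_scores(scores):
    """Weighted average over whichever modalities delivered a score."""
    total = sum(MODALITY_WEIGHTS[m] for m in scores)
    return sum(MODALITY_WEIGHTS[m] * s for m, s in scores.items()) / total

# Example: the visual channel dropped out; fusion degrades gracefully.
print(fuse_pain_scores({"paralinguistic": 0.7, "psychobiological": 0.9}))
```

One advantage of fusing at the score level is robustness to sensor dropout: missing modalities are simply left out of the weighted average.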

Mobile Interaction With Pervasive User Interfaces

Duration: 2010-2018
Funding by: German Research Foundation (DFG)

Using mobile devices, in particular mobile phones, to interact with novel display and projection technologies or with enhanced everyday objects enables new forms of interaction and applications that benefit the user. Prior research in this area has produced fascinating and promising individual results, but there is still no general interaction concept that connects mobile devices to surrounding interfaces in the way the desktop metaphor does for desktop systems. This project designs and evaluates new interaction concepts and applications that merge the interfaces of mobile devices with computer systems in their surroundings. Its results will enable these new interaction techniques to be applied beyond the boundaries of the laboratory, in real environments and novel application areas. The main goals of the project are the design of tools for development support and the generalization of particular systems into universal models, methods and guidelines. This process is based on prototypes in the areas 'office, meetings and travel'; 'leisure and games'; and 'automobile and manufacturing industry'.

Interact: Interactive Manual Assembly Operations for the Human-Centred Workplace of the Future

Duration: 2013-2016
Funding by: European Union (EU) 
Cooperation Partners: Daimler, Electrolux, DFKI, etc.

INTERACT aims to capture workers' knowledge of executing manual assembly tasks and to include it in the digital tools used to support the design, verification, validation, modification and continuous improvement of human-centred, flexible assembly workplaces in industry. To this end, a system was built that incorporates shop-floor sensing architectures, automated recognition and classification of actions from sensing data, and a novel approach for synthesizing natural human motions in assembly operations. Within INTERACT, the HCI research group at the Institute of Media Informatics contributes and strengthens its expertise in human sensing technologies, specifically by implementing a markerless human tracking system that combines data from multiple depth cameras.
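The fusion step is not detailed on this page; the sketch below shows one plausible way to merge per-camera skeleton data, assuming each depth camera has been extrinsically calibrated so that a 4x4 transform into a shared world frame is known. The function names and the confidence-weighted averaging are illustrative assumptions.

```python
import numpy as np

def to_world(T_world_cam, p_cam):
    """Map a 3D point from camera to world coordinates.

    T_world_cam: 4x4 rigid transform from extrinsic calibration.
    p_cam: [x, y, z] joint position in that camera's frame.
    """
    p_h = np.append(p_cam, 1.0)   # homogeneous coordinates
    return (T_world_cam @ p_h)[:3]

def fuse_joint(observations):
    """Confidence-weighted merge of one joint seen by several cameras.

    observations: iterable of (T_world_cam, p_cam, confidence) triples,
    e.g. with confidences from each camera's skeleton tracker.
    """
    points = np.array([to_world(T, p) for T, p, _ in observations])
    weights = np.array([c for _, _, c in observations], dtype=float)
    return (weights[:, None] * points).sum(axis=0) / weights.sum()
```

In such a setup, occlusion handling largely reduces to discarding low-confidence observations before fusing.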

More information about Interact: Interactive Manual Assembly Operations for the Human-Centred Workplace of the Future

Adaptive Pervasive User Interfaces

Duration: 2013-2017
Funding by: German Research Foundation (DFG)
Cooperation Partners: SFB/Transregio 62 "A Companion-Technology for Cognitive Technical Systems"

The project designs, implements and evaluates novel concepts, systems and interaction techniques for pervasive projected user interfaces. An interactive projector-camera system will initially be developed to enable the investigation of interactive multi-projector environments and their installation and use in realistic contexts. In doing so, open research questions regarding touch-based interaction, supported interaction spaces, multi-display interaction and adaptive applications will be investigated. These interaction concepts and the corresponding display infrastructure extend the companion technology of SFB/Transregio 62 with a novel modality that provides new possibilities for user-specific information provision and for interaction between users and the technical system. Furthermore, this approach yields additional information from optical sensors that could potentially also be used for emotion recognition. Finally, pervasive user interfaces make a companion system more pervasively available.
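Touch-based interaction on projected surfaces is one of the open questions mentioned above. A widespread approach, though not necessarily the one chosen in this project, detects fingertips as pixels that rise slightly above a previously captured background depth image of the surface. A minimal sketch with hypothetical thresholds:

```python
import numpy as np

# Height band (millimetres) above the surface that counts as a touch;
# the exact values are hypothetical and depend on camera noise.
TOUCH_MIN_MM, TOUCH_MAX_MM = 5, 30

def touch_mask(depth, background):
    """Boolean mask of touch candidates on a projected surface.

    depth:      current depth frame (uint16, millimetres).
    background: depth frame of the empty surface, captured once.
    """
    height = background.astype(np.int32) - depth.astype(np.int32)
    return (height > TOUCH_MIN_MM) & (height < TOUCH_MAX_MM)
```

Connected components of this mask yield candidate touch points, which a projector-camera homography can then map into display coordinates.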

More information

Gaze- and Gesture-Based Assistive Systems for Users with Special Needs

Duration: 2013-2014 
Funding by: Federal Ministry of Education and Research (BMBF)
Cooperation Partners: General Psychology group at University of Ulm (Prof. Dr. Anke Huckauf)

Due to aging or poor health, many people have to cope with varying limitations of their motor functions and mobility and are unable to accomplish diverse tasks in their homes without the help of others. ASSIST aims to improve this by investigating how functions in intelligent homes can be controlled with gestures and gaze. To accomplish this, we evaluate camera systems such as structured-light and time-of-flight cameras as well as eye trackers, and we assess the requirements for such a system. Furthermore, we implement several interaction modalities incorporating different types of input, e.g. head gestures, hand gestures, arm gestures and gaze. These interaction modalities are then evaluated with both healthy persons and persons with health problems. Our long-term goal is to develop an adaptive, gesture-based interaction concept that enables people with varying limitations of mobility to live a more autonomous life.
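As a concrete example of one such modality, dwell-time selection triggers a home function once the gaze has rested on its associated region for long enough. The sketch below is illustrative only; the region geometry, dwell threshold and triggered action are assumptions, not the project's actual design.

```python
import time

DWELL_SECONDS = 1.0   # hypothetical dwell threshold

class DwellRegion:
    """Triggers an action when gaze dwells inside a rectangular region."""

    def __init__(self, x, y, w, h, action):
        self.rect = (x, y, w, h)
        self.action = action        # e.g. toggle a light
        self._entered_at = None

    def update(self, gaze_x, gaze_y):
        """Feed one gaze sample from the eye tracker."""
        x, y, w, h = self.rect
        inside = x <= gaze_x < x + w and y <= gaze_y < y + h
        if not inside:
            self._entered_at = None            # gaze left, reset timer
        elif self._entered_at is None:
            self._entered_at = time.monotonic()
        elif time.monotonic() - self._entered_at >= DWELL_SECONDS:
            self._entered_at = None            # avoid re-triggering
            self.action()

# Usage: DwellRegion(100, 100, 200, 120, lambda: print("light on")).
```

Dwell thresholds trade selection speed against accidental activations (the 'Midas touch' problem), which is one reason an adaptive interaction concept would tune them per user.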

Mobile and Wearable User Interfaces

Duration: 2012-2014 
Funding by: Nokia 
Cooperation Partners: Nokia