Funded Projects

Gaze-Assisted Scalable Interaction in Pervasive Classrooms

Duration: 2020-2023
Funding by: German Research Foundation (DFG)
Cooperation Partners: research groups General Psychology (Prof. Anke Huckauf) and Human-Computer-Interaction (Prof. Enrico Rukzio) at University of Ulm

Gaze-Assisted Scalable Interaction in Pervasive Classrooms is a project within the Priority Programme “Scalable Interaction Paradigms for Pervasive Computing Environments” (SPP 2199) funded by the German Research Foundation (DFG).

Empirical Assessment of Presence and Immersion in Augmented and Virtual Realities

Duration: 2020-2023
Funding by: German Research Foundation (DFG)

"Empirical Assessment of Presence and Immersion in Augmented and Virtual Realities" is an Individual Research Grant funded by the German Research Foundation (DFG).


SituWare - Detection of Driver Situation Awareness for Adaptive Cooperative Transfer Strategies in Highly Automated Driving

Duration: 2020-2023
Funding by: Federal Ministry for Economic Affairs and Energy
Cooperation Partners: AVL Software & Functions GmbH (project lead), Human-Factors-Consult GmbH, Humatects GmbH, OFFIS, research groups Human Factors (Prof. Martin Baumann) and Human-Computer-Interaction (Prof. Enrico Rukzio) at University of Ulm

SituWare (Detection of driver situation awareness for adaptive cooperative transfer strategies in highly automated driving) is funded by the German Federal Ministry for Economic Affairs and Energy.


AktiSmart-KI - Identification of Complex Activity Patterns through Smart Sensor Technology in Geriatric Rehabilitation

Duration: 2020-2022
Funding by: Federal Ministry of Health
Cooperation Partners: research groups Visual Computing (Prof. Timo Ropinski, project lead), History, Philosophy and Ethics of Medicine (Prof. Florian Steger) and Human-Computer-Interaction (Prof. Enrico Rukzio) at University of Ulm, Robert-Bosch Gesellschaft für Medizinische Forschung mbH (Prof. Jochen Klenk),  IB-Hochschule Berlin (Prof. Dr. Alexandra Jorzig)

AktiSmart-KI (Identification of complex activity patterns through smart sensor technology in geriatric rehabilitation) is funded within Module 1 "Smart Sensors" of the funding framework "Digital Innovations for the Improvement of Patient-Centered Care in Public Health" by the German Federal Ministry of Health.


INTUITIVER - Interaction between Automated Vehicles and Vulnerable Road Users

Duration: 2018-2021
Funding by: State of Baden-Württemberg
Cooperation Partners: Department of Human Factors (Prof. Dr. Martin Baumann), Microwave Engineering (Prof. Dr. Christian Waldschmidt), and Institute of Measurement, Control and Microtechnology (Prof. Dr. Klaus Dietmayer), all at Ulm University

INTUITIVER is part of the research funding programme "Smart Mobility".

INTUITIVER is short for "INTeraktion zwischen aUtomatIsierTen Fahrzeugen und leicht verletzbaren VerkehrsteilnehmER" (interaction between automated vehicles and vulnerable road users, such as pedestrians). The project focuses on the challenges that arise as autonomous vehicles become players in a socio-technical system. An interdisciplinary team of engineers, psychologists and computer scientists tackles a series of related questions:

  1. How can cameras and sensors of an autonomous vehicle detect whether a pedestrian wants to cross the road?
  2. How can pedestrians recognize the intention of an autonomous vehicle?
  3. How can the autonomous vehicle inform its occupants of its intentions?

Research in this area will investigate what kind of information to display and how to convey it.

AuCity 2 - Applying Augmented Reality in Academic Education Using the Example of Civil Engineering

Duration: 2018-2021
Funding by: Federal Ministry of Education and Research (BMBF)
Cooperation Partners: Bauhaus University Weimar

Augmented and Virtual Reality applications achieve a realism in the depiction of 3D information that other types of media cannot. Especially in engineering education, these advantages can be motivating and helpful for learners, as the mapping between the real world and the presented information is easier than with other media.

The AuCity 2 project analyses how the accuracy of 3D depictions influences learning achievement and learning motivation. Three types of mixed reality applications are to be created: photo-based panoramas, on-site Augmented Reality, and fully modelled Virtual Reality.

For more details, please visit the project page at Bauhaus University Weimar.

Persuasive Technologies in Highly Autonomous Cars to Increase Cooperation and Safety

Duration: 2016-2019
Funding by: Carl Zeiss Foundation
Cooperation Partners: Department of Human Factors, Ulm University (Prof. Dr. Martin Baumann)

Highly automated driving aims to radically increase driving safety, since human error accounts for over 90 percent of severe traffic accidents. Yet even when automation has evolved enough to drive more safely than a human, the driver can still reclaim control over the vehicle, for example due to a lack of trust, a misunderstanding, or a mismatch between the driver's own driving style and the vehicle's. In such cases, convincing the driver to keep the automation engaged would increase traffic safety. Research in this project investigates technical systems that increase the usage of automation.


SenseEmotion

Duration: 2015-2018
Funding by: Federal Ministry of Education and Research (BMBF)
Cooperation Partners: University of Augsburg, University Hospital Ulm

SenseEmotion aims to increase the overall quality of life of elderly and chronic pain patients. To this end, automatic pain recognition is optimized, and a personalized affect management is developed to support pain patients and to reduce pain-related panic, fear, aggression and confusion. A multi-sensor approach detects pain, disorientation and confusion, as well as the resulting emotions, based on paralinguistic, psychobiological and visual data. This data is also processed for pain management by caregivers and physicians and can be used to improve therapy. To further support pain patients, the system will include a virtual companion. This companion will calm and reassure pain patients during a crisis and, in daily life, provide general help or assistance with their therapy.

Development of interaction techniques, concepts and tools for mobile interactions with pervasive user interfaces

Duration: 2010-2019
Funding by: German Research Foundation (DFG)

The usage of mobile devices, in particular mobile phones, for interactions with novel display and projection technologies or enhanced everyday objects allows the design of new forms of interaction and applications benefiting the user. Prior research in this area produced fascinating and promising individual results. However, there has been no general interaction concept so far that would connect mobile devices to existing interfaces in a way similar to the desktop metaphor for desktop systems. This project designs and evaluates new interaction concepts and applications that merge interfaces of mobile devices with computer systems in the surroundings. The results of this project will enable the application of these new interaction techniques beyond the boundaries of a laboratory, in real environments and novel application areas.

The main goals of this project are the design of tools for development support and the generalization of particular systems to universal models, methods and guidelines. This process is based on prototypes in the areas 'office, meetings and travel'; 'leisure and games'; and 'automobile and manufacturing industry'.

Interact: Interactive Manual Assembly Operations for the Human-Centred Workplace of the Future

Duration: 2013-2016
Funding by: European Union (EU) 
Cooperation Partners: Daimler, Electrolux, DFKI, and others

INTERACT aims to capture workers' knowledge of executing manual assembly tasks and include it in the digital tools used to support the design, verification, validation, modification and continuous improvement of human-centred, flexible assembly workplaces in industry. For this, a system was built that incorporates shop-floor sensing architectures, automated recognition and classification of actions from sensing data, and a novel approach for synthesizing natural human motions in assembly operations.

For INTERACT, the HCI research group at the Institute of Media Informatics contributes its expertise in human sensing technologies, specifically by implementing a markerless human tracking system that combines data from multiple depth cameras.
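A markerless multi-camera tracker of this kind has to merge per-camera body estimates into one consistent skeleton. The following is a minimal, hypothetical sketch (the names and the confidence-weighting scheme are illustrative, not the project's actual implementation): each camera's joint estimate is transformed into a shared world frame using a known extrinsic calibration, then averaged, weighted by tracking confidence.

```python
# Hypothetical sketch: fusing per-camera 3D joint estimates into one skeleton joint.
# Each depth camera reports a joint position in its own coordinate frame plus a
# confidence score; estimates are mapped into a shared world frame and averaged.

from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class JointEstimate:
    position: Vec3      # joint position in the camera's local frame
    confidence: float   # tracking confidence in [0, 1]; 0 when occluded

@dataclass
class Camera:
    # Rigid camera-to-world transform (rotation matrix rows + translation),
    # assumed to come from an offline extrinsic calibration.
    rotation: Tuple[Vec3, Vec3, Vec3]
    translation: Vec3

    def to_world(self, p: Vec3) -> Vec3:
        r, t = self.rotation, self.translation
        return tuple(sum(r[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

def fuse_joint(cameras: List[Camera], estimates: List[JointEstimate]) -> Vec3:
    """Confidence-weighted average of per-camera joint estimates in world coordinates."""
    total = sum(e.confidence for e in estimates)
    if total == 0:
        raise ValueError("no confident estimate for this joint")
    fused = [0.0, 0.0, 0.0]
    for cam, est in zip(cameras, estimates):
        world = cam.to_world(est.position)
        for i in range(3):
            fused[i] += est.confidence * world[i] / total
    return (fused[0], fused[1], fused[2])
```

In practice the extrinsic transforms would come from a calibration step before tracking, and a joint occluded in one camera's view would simply carry zero confidence there, so the fused estimate falls back to the cameras that still see it.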

More information about INTERACT: Interactive Manual Assembly Operations for the Human-Centred Workplace of the Future

Adaptive Pervasive User Interfaces

Duration: 2013-2017
Funding by: German Research Foundation (DFG)
Cooperation Partners: SFB/Transregio 62 "A Companion-Technology for Cognitive Technical Systems"

The project designs, implements and evaluates novel concepts, systems and interaction techniques for pervasive projected user interfaces. An interactive projector-camera system will initially be developed to enable the investigation of interactive multi-projector environments and their installation and usage in realistic contexts. Open research questions regarding touch-based interaction, supported interaction spaces, multi-display interaction and adaptive applications will be investigated. These interaction concepts and the corresponding display infrastructure extend the companion technology within SFB/Transregio 62 by a novel modality that provides new possibilities for user-specific information provision and for interaction between users and the technical system. Furthermore, this approach offers additional information from optical sensors that could potentially be used for emotion recognition. Finally, pervasive user interfaces also make a companion system more pervasively available.

Gaze- and Gesture-Based Assistive Systems for Users with Special Needs

Duration: 2013-2014 
Funding by: Federal Ministry of Education and Research (BMBF)
Cooperation Partners: General Psychology group at University of Ulm (Prof. Dr. Anke Huckauf)

Due to aging or poor health, many people have to cope with varying limitations of their motor functions and mobility and cannot accomplish diverse tasks in their homes without the help of others. ASSIST aims to improve this by investigating the possibilities of controlling functions in intelligent homes with gestures and gaze. To accomplish this, we evaluate camera systems such as structured-light and time-of-flight cameras as well as eye trackers, and analyse the requirements towards such a system. Furthermore, we implement several interaction modalities incorporating different types of gestures, e.g. head gestures, hand gestures, arm gestures and gaze. These interaction modalities are then evaluated with both healthy persons and persons with health problems. Our long-term goal is to develop an adaptive interaction concept based on gestures that enables people with varying limitations of mobility to live a more autonomous life.
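The adaptive concept ASSIST works toward can be pictured as a per-user binding between recognized gesture or gaze events and home functions. The following is a hypothetical sketch (dispatcher, event names and actions are all illustrative, not the project's actual system): the same home function can be triggered through whichever modality a given user can perform.

```python
# Hypothetical sketch: per-user mapping from recognized gesture/gaze events to
# smart-home actions, so users with different motor limitations can trigger the
# same function through different modalities.

from typing import Callable, Dict, Tuple

# A recognized event is a (modality, event) pair, e.g. ("head", "nod")
# or ("gaze", "dwell:blinds").
Event = Tuple[str, str]

class AssistDispatcher:
    def __init__(self) -> None:
        self._bindings: Dict[Event, Callable[[], str]] = {}

    def bind(self, modality: str, event: str, action: Callable[[], str]) -> None:
        """Bind a recognized event of a given modality to a home function."""
        self._bindings[(modality, event)] = action

    def handle(self, modality: str, event: str) -> str:
        """Run the bound action, or ignore events this user has not bound."""
        action = self._bindings.get((modality, event))
        return action() if action else "ignored"

# Profile for a user who cannot use hand gestures: head gestures and gaze instead.
dispatcher = AssistDispatcher()
dispatcher.bind("head", "nod", lambda: "lamp on")
dispatcher.bind("gaze", "dwell:blinds", lambda: "blinds down")

print(dispatcher.handle("head", "nod"))         # lamp on
print(dispatcher.handle("hand", "swipe_left"))  # ignored (not bound for this user)
```

Adapting the concept to a user then means rebinding events rather than changing the home functions themselves.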

Mobile and Wearable User Interfaces

Duration: 2012-2014 
Funding by: Nokia 
Cooperation Partners: Nokia