Perceiving objects seems easy and self-evident. However, complex and interdependent processes underlie the coordinated movements of our eyes and our perceptual experience. In our research, we investigate what our eyes reveal about an observer and how this information can be used for interacting with technical assistance systems. To this end, we also study eye and pupillary movements. This requires an understanding of the functions and effects of perception out of focus, in the periphery, and in front of and behind the focal plane. The resulting findings feed back into a deeper understanding of current graphical media technologies and serve as a basis for our development of interfaces and concepts for gaze input.


2022 – 2026: Eyes4ICU; coordination of the Doctoral Network; funded by the Marie Skłodowska-Curie Actions under Horizon Europe.

2020 – 2023: Gaze-Assisted Scalable Interaction in Pervasive Classrooms; cooperation with Prof. E. Rukzio; Priority Programme “Scalable Interaction Paradigms for Pervasive Computing Environments” (SPP 2199).

2019 – 2020: Eye tracking in real time for interaction, communication, and understanding (EYES4ICU); funded by the Federal Ministry of Education and Research (BMBF).

2013 – 2017: Research cluster Driver – Vehicle – Research (Forschungszentrum für kooperative, hochautomatisierte Fahrerassistenzsysteme und Fahrfunktionen; F3); cooperation project with Profs. K. Dietmayer, F. Kargl, C. Waldschmidt, M. Weber; funded by Carl-Zeiss-Stiftung.

2012 – 2015: Multimodal human–computer interaction; in cooperation with Prof. M. Weber, Ulm University; subproject B3 in SFB/TRR 62 “A Companion-Technology for Cognitive Technical Systems”; funded by Deutsche Forschungsgemeinschaft.

2012 – 2013: Gaze- and gesture-based assistance systems for users with restricted mobility (Blick- und Gestenbasierte Assistenzsysteme für Nutzer mit Bewegungseinschränkungen; Assist); cooperation project with Prof. E. Rukzio; funded by Bundesministerium für Bildung und Forschung.

2012 – 2014: Companion-Technology in automotive application scenarios for workmen assistance via mobile augmented reality; transfer project in cooperation with Prof. A. Al-Hamadi and Prof. B. Michaelis, University of Magdeburg; SFB/TRR 62; funded by Deutsche Forschungsgemeinschaft.

2012 – 2016: Serious Games – Advancement of competences through adaptive systems; cooperation with Profs. M. Weber, T. Seufert, I. Kolassa, J. Keller, W. Minker, K. Schumacher; funded by Carl-Zeiss-Stiftung.

2012: Evaluation of technological innovations for production; Daimler AG, Stuttgart.

2011: On the application of head-mounted AR devices for companion-technologies; initial funding within SFB/TRR 62 Companion-Technology.

2008 – 2011: User-centered development and investigation of AR-based systems for workmen assistance; in cooperation with Dr. Rüdiger Mecke, Fraunhofer IFF, Magdeburg, and Prof. Eberhard Pfister, Institut für Arbeitsmedizin, University of Magdeburg; subproject of AVILUS (Angewandte Virtuelle Technologien im Produkt- und Produktionsmittellebenszyklus), funded by Bundesministerium für Bildung und Forschung; AVILUS coordinated by Volkswagen AG.