Teresa Hirzle, M.Sc.
As of March 2022, this website will no longer be updated, as I no longer work at Ulm University.
Teresa Hirzle joined the HCI group in June 2017. Before that, she studied Media Informatics at Ulm University, receiving her Bachelor's degree in 2015 and her Master's degree in 2017, both with distinction. During her studies she spent a year at the Universidad de Granada in Spain.
Teresa wrote her master's thesis, titled Eyes Only Interaction in Virtual Reality (COGAIN 2017), in cooperation with the Department of General Psychology. The thesis explored how implicit pupillary events can be applied to eyes-only interaction strategies in virtual reality.
Teresa's research interests include eye-based Human-Computer Interaction (HCI), augmented and virtual reality (AR/VR) Head-Mounted Displays (HMDs), and mobile eye tracking. Her research focuses on developing interactive systems for eye-based interaction on AR/VR head-mounted displays. Advancements in display technology and the miniaturization of sensors have turned VR head-mounted displays (HMDs) from research devices into everyday end-user technology. However, these devices expose users to digital eye strain (DES), a rapidly spreading health problem in today's digital society that includes eye and vision problems and affects users' overall quality of life and general well-being. In her Ph.D. thesis, Teresa investigates the properties and causes that contribute to DES in VR HMDs, and develops and evaluates solutions to it.
- Digital eye strain in virtual reality head-mounted displays
- Designing eye-based applications for visual well-being
- Eye-based human-computer interaction with computerized eyewear
- 3D gaze interaction on head-mounted displays
- Lecture: "Eye Tracking and Eye-based HCI" in Mobile Mensch-Computer Interaktion (WS 2019/20, WS 2020/21) and User Interface Softwaretechnologie (SS 2017, SS 2018)
- Project: Mensch-Computer Interaktion (WS 2017/18, SS 2018)
- Project: Design Thinking for Interactive Systems (SS 2020, WS 2020/21)
- Seminar: Research Trends in Media Informatics (WS 2017/18, WS 2018/19, WS 2019/20, WS 2020/21)
- Proseminar: Mensch-Computer Interaktion (SS 2018)
- Master Thesis: The Influence of Presence and Flow on the Perception of Discomfort in Virtual Reality (Fabian Fischbach, 2021)
- Master Thesis: Measuring Vergence Errors in Real and Virtual Environments (Ly Hoang Minh Nguyen, 2019)
- Master Project: Alleviating Digital Eye Strain In Online Learning (Annalisa Degenhard, Truc Thanh, and Albin Zeqiri, 2021)
- Master Project: Helping Hands: Exploring Multimanual Interaction in Virtual Reality (Julian Karlbauer, 2021)
- Master Project: Eye Exercises to Alleviate Digital Eye Strain in Virtual Reality Head-Mounted Displays (Fabian Fischbach and Pascal Jansen, 2019)
- Bachelor Thesis: Improving Interpersonal Communication in Online Education Using WebCam Eye Tracking (Maryam Maged A. A. Elhaidary, 2021)
- Bachelor Thesis: An Exploration of Foveated Blue Light Filters in Virtual Reality (Julian Karlbauer, 2020)
- Bachelor Thesis: Reducing Computer Vision Syndrome in Virtual Reality by Prompting Blinks (Lukas Güthing, 2020)
- Bachelor Thesis: Using Eye Tracking to Improve Meditation by Applying EOG Signal Analysis to a Responsive Application (Markus Lederer, 2019)
- Bachelor Project: Prevalence of Computer Vision Syndrome according to Gender (Ronald Agee, 2021)
- Bachelor Project: Concepts Of Using Specific Eye Properties in Human Computer Interaction (Julian Karlbauer, 2019)
The goal of this project is to explore how much we can learn about the physical objects a user is looking at by observing gaze depth. We envision a symbiotic scenario in which current technology (e.g., depth cameras) is extended with "human sensing data". Here, a depth camera creates a rough understanding of a static environment, and gaze depth is merged into the model by leveraging unique properties of human vision.
Augmented and virtual reality (AR/VR) head-mounted display (HMD) applications inherently rely on three-dimensional information. In contrast to gaze interaction on a two-dimensional screen, gaze interaction in AR and VR therefore also requires estimating a user's gaze in 3D (3D gaze).
While early applications, such as foveated rendering, hint at the compelling potential of combining HMDs and gaze, a systematic analysis is missing. To fill this gap, we present the first design space for gaze interaction on HMDs.
- Scientific Activities: CHI '21 (SC Assistant for the UX Subcommittee), ETRA '21 (Short Paper Co-Chair), ETRA '20 (Web Co-Chair), Ubicomp/ISWC '21 (Workshop Co-Organizer), GermanHCI '18-'20 (Organizer, https://www.germanhci.de/), Virtual German CHI Week '20 (Organizer), CHI '21 (AC Late Breaking Work), MuC Short Papers (AC)
- Reviewer: TOCHI '21, IEEE ToG '21, IMWUT '21, AutoUI '21, CHI '21 '20 '19 '18, ISMAR '21 '20 '19, IEEE VR Conference Papers '21, IEEE VR Journal Papers '21, ETRA '21 '20, NordiCHI '20, MobileHCI '20 '19, Mensch und Computer '20 '19, CHI LBW '19, CHI PLAY LBW '19, VRST '18, AH '18
- Student Volunteer: MUM '17