Teresa Hirzle, M.Sc.
Teresa Hirzle joined the HCI group in June 2017. Before that, she studied Media Informatics at Ulm University, receiving her Bachelor's degree in 2015 and her Master's degree in 2017, both with distinction. During her studies she spent a year at the Universidad de Granada in Spain.
Teresa wrote her master's thesis in cooperation with the Department of General Psychology. The thesis explored how implicit pupillary events can be applied to eyes-only interaction strategies in virtual reality and was titled Eyes Only Interaction in Virtual Reality (COGAIN 2017).
Teresa's research interests include eye-based Human-Computer Interaction (HCI), augmented and virtual reality (AR/VR) Head-Mounted Displays (HMDs), and mobile eye tracking. Her research focuses on developing interactive systems for eye-based interaction on AR/VR head-mounted displays. More specifically, she is interested in investigating how AR and VR displays influence and change human visual perception and behavior. Since these devices inherently rely on three-dimensional information, they also require gaze to be estimated in three dimensions (3D Gaze). However, the interaction space that arises from this combination (3D Gaze + HMDs) is not yet fully understood or explored.
- Eye-based Human-Computer Interaction with Computerized Eyewear
- How does gaze behavior differ in different environments?
- 3D Gaze Interaction on Head-Mounted Displays
- Designing Eye-based applications for "Visual Well-Being"
Lectures, Projects and Seminars
- Measuring Gaze Depth: How Eye Tracking Measures Vary in Real and Virtual Environments (ongoing)
- Using Eye Tracking To Improve Meditation by Applying EOG Signal Analysis to a Responsive Application (ongoing)
Open Theses Topics
Open theses topics can only be accessed from the university network.
If you are interested in one of the topics, have an idea of your own, or would like more information about the topics in general, please send me an e-mail.
A Symbiotic Human-Machine Depth Sensor
The goal of this project is to explore how much we can learn about the physical objects a user is looking at by observing gaze depth. We envision a symbiotic scenario in which current technology (e.g. depth cameras) is extended with "human sensing data". Here, a depth camera creates a rough model of a static environment, and gaze depth is merged into that model by leveraging unique properties of human vision.
A Design Space for Gaze Interaction on Head-Mounted Displays
Augmented and virtual reality (AR/VR) head-mounted display (HMD) applications inherently rely on three-dimensional information. In contrast to gaze interaction on a two-dimensional screen, gaze interaction in AR and VR therefore also requires estimating a user's gaze in three dimensions (3D Gaze).
While first applications, such as foveated rendering, hint at the compelling potential of combining HMDs and gaze, a systematic analysis is missing. To fill this gap, we present the first design space for gaze interaction on HMDs.
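One common way to estimate 3D gaze on a binocular HMD is vergence: each eye tracker yields a gaze ray, and the 3D gaze point is taken as the point where the two rays (nearly) intersect. The sketch below illustrates this idea; the function name, eye geometry, and parameters are illustrative assumptions, not taken from the design space or thesis above.

```python
import numpy as np

def vergence_gaze_point(p_left, d_left, p_right, d_right):
    """Estimate the 3D gaze point as the midpoint of the shortest
    segment between the left and right gaze rays (vergence-based).
    Returns None if the rays are (nearly) parallel, i.e. gaze at
    optical infinity."""
    p_left, p_right = np.asarray(p_left, float), np.asarray(p_right, float)
    d_left = np.asarray(d_left, float)
    d_right = np.asarray(d_right, float)
    d_left /= np.linalg.norm(d_left)    # normalize gaze directions
    d_right /= np.linalg.norm(d_right)

    # Closest-point-between-two-lines formulation:
    # minimize |(p_left + t*d_left) - (p_right + s*d_right)|
    w0 = p_left - p_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w0, d_right @ w0
    denom = a * c - b * b
    if denom < 1e-12:                   # parallel rays: no finite depth
        return None
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    # Midpoint between the closest points on the two rays
    return ((p_left + t * d_left) + (p_right + s * d_right)) / 2
```

For example, with eye centers 64 mm apart and both eyes fixating a point 1 m straight ahead, `vergence_gaze_point([-0.032, 0, 0], [0.032, 0, 1], [0.032, 0, 0], [-0.032, 0, 1])` recovers a gaze point at a depth of 1 m. In practice, noise in the estimated gaze directions makes vergence-based depth increasingly unreliable at larger distances, which is one motivation for comparing gaze-depth measures across real and virtual environments.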
- Scientific Activities: CHI 2021 (SC Assistant for the UX Subcommittee), ETRA 2021 (Short Paper Co-Chair), ETRA 2020 (Web Chair), Virtual German CHI Event 2020 (Co-Organizer), MuC Short Papers (AC)
- Conference Reviewer: CHI '21, CHI '20, '19, '18, NordiCHI '20, MobileHCI '20, '19, ISMAR '20, '19, ETRA '20, Mensch und Computer '20, '19, VRST '18, Augmented Human '18
- Student Volunteer: MUM '17