Teresa Hirzle, M.Sc.
Teresa Hirzle joined the HCI group in June 2017. Before that, she studied Media Informatics at Ulm University, receiving her Bachelor's degree in 2015 and her Master's degree in 2017, both with distinction. During her studies she spent a year at the Universidad de Granada in Spain.
Teresa wrote her master's thesis in cooperation with the Department of General Psychology. The thesis applied implicit pupillary events to eyes-only interaction strategies in virtual reality and was titled Eyes Only Interaction in Virtual Reality (COGAIN 2017).
Teresa's research interests include eye-based Human-Computer Interaction (HCI), augmented and virtual reality (AR/VR) Head-Mounted Displays (HMDs), and mobile eye tracking. Her research focuses on developing interactive systems for eye-based interaction on AR/VR head-mounted displays. More specifically, she is interested in investigating how AR and VR displays influence and change human visual perception and behavior. Since these devices inherently rely on three-dimensional information, they also require gaze to be estimated in three dimensions (3D Gaze). However, the interaction space that arises from this combination (3D Gaze + HMDs) is not yet fully understood or explored.
- Eye-based Human-Computer Interaction with Computerized Eyewear
- How does gaze behavior differ in different environments?
- 3D Gaze Interaction on Head-Mounted Displays
- Designing Eye-based applications for "Visual Well-Being"
Lectures, Projects and Seminars
- Measuring Gaze Depth: How Eye tracking Measures Vary in Real and Virtual Environments (ongoing)
- Using Eye Tracking To Improve Meditation by Applying EOG Signal Analysis to a Responsive Application (ongoing)
Open Topics for Theses
How does gaze behavior differ in different environments?
- Comparison of Eye Measures in Augmented Reality and the Real World
- Relating Pupil Dilation to Lighting Conditions in Virtual Reality
3D Gaze Interaction on Head-Mounted Displays:
- Applications of Con-/Divergence Eye Movements for Eye-based HCI
Designing Eye-based applications for "Visual Well-Being":
- Gaze Tracking in the Wild: What Do We See?
- Gaze Tracking in the Wild: How Do We See?
- Prevention of Eye Strain: Applying Eye Exercises to Improve Visual Well-Being (AR/VR/real world)
If you are interested in writing a thesis, please send me an email.
A Symbiotic Human-Machine Depth Sensor
The goal of this project is to explore how much we can learn about the physical objects a user is looking at by observing gaze depth. We envision a symbiotic scenario in which current technology (e.g., depth cameras) is extended with "human sensing data": a depth camera creates a rough model of a static environment, and gaze depth is merged into that model by leveraging unique properties of human vision.
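One common way to obtain gaze depth on an HMD is to triangulate it from binocular vergence: the closer the fixated object, the larger the angle between the two eyes' gaze rays. The following minimal sketch illustrates that geometry; the function names, the default interpupillary distance, and the simplifying assumption of a symmetric fixation straight ahead are illustrative, not part of the project.

```python
import math

def vergence_to_depth(vergence_rad: float, ipd_m: float = 0.063) -> float:
    """Estimate fixation depth (m) from the vergence angle (rad) between
    the two eyes' gaze rays, assuming a symmetric fixation straight ahead.
    Geometry: tan(vergence/2) = (ipd/2) / depth."""
    return (ipd_m / 2) / math.tan(vergence_rad / 2)

def depth_to_vergence(depth_m: float, ipd_m: float = 0.063) -> float:
    """Inverse mapping, useful for simulating eye-tracker readings."""
    return 2 * math.atan((ipd_m / 2) / depth_m)
```

In practice, vergence-based depth estimates degrade quickly with distance, since a small amount of angular noise translates into a large depth error beyond a few meters; this is one reason fusing them with a depth camera's environment model is attractive.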
A Design Space for Gaze Interaction on Head-Mounted Displays
Augmented and virtual reality (AR/VR) head-mounted display (HMD) applications inherently rely on three-dimensional information. In contrast to gaze interaction on a two-dimensional screen, gaze interaction in AR and VR therefore also requires estimating a user's gaze in 3D (3D Gaze).
While first applications, such as foveated rendering, hint at the compelling potential of combining HMDs and gaze, a systematic analysis is missing. To fill this gap, we present the first design space for gaze interaction on HMDs.
- Conference Reviewer: CHI '18, CHI '19, VRST '18, Augmented Human '18
- Student Volunteer: MUM '17