Tobias Wagner, M. Sc.

Tobias Wagner graduated from Ulm University in 2019 with a Bachelor's degree in Media Informatics and a thesis titled "Adaptive AR Interfaces Via EEG", after which he joined the Human-Computer Interaction research group as a research assistant. While completing his Master's degree in Media Informatics at Ulm University, he worked on several research projects in personal fabrication and gaze-based interaction in teaching. He completed his Master's degree in September 2022 with a thesis titled "Effects of Attention Guiding in Lecture Videos via Implicit and Explicit-Based Visual Pointers".
Currently, Tobias is a PhD student at the Institute of Media Informatics at Ulm University, researching technology-enhanced teaching and learning using gaze-based interactive systems.

Research interests

  • Learning Technology
  • Gaze-based Interaction & Systems
  • Eye-Tracking

Teaching

  • Research Trends for Media Informatics, Winter Term 2022/23
  • Project User-Centered Design for Interactive Systems, since Summer Term 2023
  • Project Human-Computer-Interaction, since Summer Term 2023
  • Applied Subject Design-Thinking for Interactive Systems, since Summer Term 2023

How-To Videos
Video-sharing platforms such as YouTube and TikTok enable learners to acquire new skills such as cooking, DIY, and sports exercises through how-to videos. These videos also offer viewers step-by-step solutions to many problems. However, experts need considerable time to plan, record, and edit these videos so that they are valuable for learners. The goal of this project is to develop a system that supports experts in creating instructional how-to videos. Videos created with this system should effectively support learners in their skill learning process.

Learning languages with subtitles in TV shows
Video-on-demand platforms offer viewers a wide range of audiovisual media, including both informative and entertaining content. To maximize reach, these platforms provide numerous language options for audio and for subtitles, allowing viewers to consume the content in their preferred language. Subtitles can also be used to facilitate learning a new language. With this in mind, the goal of this project is to develop a learning system that uses subtitles in TV shows and movies to help viewers learn new languages. The system will use automated procedures to identify words the learner does not understand. These words could then be processed, for example into a vocabulary list, which is made available to the learner as a learning resource. The learning system should be evaluated with regard to the learner's learning success and motivation.
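To illustrate one possible building block of such a system, the following is a minimal sketch in Python of how vocabulary candidates could be extracted from an SRT subtitle file. The file name, the known-word list, and the frequency threshold are hypothetical placeholders; the actual project may use entirely different methods to detect which words a learner does not understand.

```python
import re
from collections import Counter

def parse_srt(path):
    """Extract plain subtitle text from an .srt file (drops cue numbers, timestamps, tags)."""
    text_lines = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            # Skip cue numbers, timestamp lines, and empty lines.
            if not line or line.isdigit() or "-->" in line:
                continue
            # Remove simple formatting tags such as <i>...</i>.
            text_lines.append(re.sub(r"<[^>]+>", "", line))
    return " ".join(text_lines)

def vocabulary_candidates(subtitle_text, known_words, min_count=2):
    """Return (word, count) pairs for words that occur at least min_count times
    and are not yet in the learner's known vocabulary."""
    tokens = re.findall(r"[A-Za-zÀ-ÿ']+", subtitle_text.lower())
    counts = Counter(tokens)
    return sorted(
        (word, n) for word, n in counts.items()
        if n >= min_count and word not in known_words
    )

if __name__ == "__main__":
    # Hypothetical inputs: one episode's subtitle file and the learner's known vocabulary.
    text = parse_srt("episode_01.srt")
    known = {"the", "and", "you", "is", "to", "a"}  # placeholder known-word list
    for word, count in vocabulary_candidates(text, known):
        print(f"{word}\t{count}")
```

Such a frequency-based word list would only be a starting point; the project envisions more automated procedures for estimating which words the learner actually does not understand.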

I'm happy to supervise theses with a focus on gaze-based interactive systems for teaching and learning. I am always open to your suggestions in this area.
Don't hesitate to contact me via email if you are interested.

Publications

T. Wagner, T. Hirzle, A. Huckauf and E. Rukzio, "Exploring Gesture and Gaze Proxies to Communicate Instructor’s Nonverbal Cues in Lecture Videos", In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA ’23), 2023.
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2023/CHIEA23_Wagner_ExploringInstructorProxies.pdf
P. Hock, M. Colley, A. Askari, T. Wagner, M. Baumann and E. Rukzio, "Introducing VAMPIRE -- Using Kinaesthetic Feedback in Virtual Reality for Automated Driving Experiments" (joint first authors), Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’22), Sep. 2022. ACM.
DOI:10.1145/3543174.3545252
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2022/AutoUI_2022_VRealChair_compressed-1.pdf
T. Hirzle, M. Sauter, T. Wagner, S. Hummel, E. Rukzio and A. Huckauf, "Attention of Many Observers Visualized by Eye Movements", ETRA '22: 2022 Symposium on Eye Tracking Research and Applications, Jun. 2022.
DOI:10.1145/3517031.3529235
Weblink:https://www.uni-ulm.de/in/mi/hci/projects/attention-of-many-observers-visualized-by-eye-movements/
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2022/hirzle_AttentionOfManyObservers_2022.pdf
M. Sauter, T. Hirzle, T. Wagner, S. Hummel, E. Rukzio and A. Huckauf, "Can Eye Movement Synchronicity Predict Test Performance With Unreliably-Sampled Data in an Online Learning Context?", ETRA '22: 2022 Symposium on Eye Tracking Research and Applications, Jun. 2022.
DOI:10.1145/3517031.3529239
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2022/sauter_CanEyeMovement_2022.pdf
M. Sauter, T. Wagner and A. Huckauf, "Distance between gaze and laser pointer predicts performance in video-based e-learning independent of the presence of an on-screen instructor", ETRA '22: 2022 Symposium on Eye Tracking Research and Applications, Jun. 2022.
DOI:10.1145/3517031.3529620
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2022/sauter_DistanceBetweenGazeAndLaser_2022.pdf
E. Stemasov, T. Wagner, J. Gugenheimer and E. Rukzio, "ShapeFindAR: Exploring In-Situ Spatial Search for Physical Artifact Retrieval using Mixed Reality", In Proc. of CHI 2022 (SIGCHI Conference on Human Factors in Computing Systems), May 2022. ACM, https://arxiv.org/abs/2203.17211.
DOI:10.1145/3491102.3517682
Weblink:https://www.youtube.com/watch?v=rc2JNFkAHx0
File:/fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2022/CHI2022_ShapeFindAR_Stemasov.pdf
E. Stemasov, T. Wagner, J. Gugenheimer and E. Rukzio, "Mix&Match: Towards Omitting Modelling through In-Situ Alteration and Remixing of Model Repository Artifacts in Mixed Reality", In Proc. of CHI 2020 (SIGCHI Conference on Human Factors in Computing Systems), Apr. 2020. ACM, https://arxiv.org/abs/2003.09169.
DOI:10.1145/3313831.3376839
Weblink:https://youtu.be/B5EnkIk9ZFY
File:/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/stemasov/StemasovCHI2020MixAndMatch.pdf
D. Wolf, T. Wagner and E. Rukzio, "Low-Cost Real-Time Mental Load Adaptation for Augmented Reality Instructions - A Feasibility Study", In Adj. Proc. of ISMAR 2019 (2019 IEEE International Symposium on Mixed and Augmented Reality), Oct. 2019. IEEE.
DOI:10.1109/ISMAR-Adjunct.2019.00015
File:/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/wolf/LowCostEEGPoster.pdf