Dennis Wolf, M.Sc.
From 2010 to 2016, Dennis Wolf studied at Ulm University. In 2016, he became a Ph.D. student in the Human-Computer Interaction group.
Dennis received a B.Sc. (2013) and an M.Sc. (2016) in media informatics at Ulm University. The title of his master's thesis was: "OctiCam: An Immersive and Mobile Video Communication System for Relatives and Children".
In the winter of 2016, Dennis spent three months at the University of Cambridge, UK, doing an internship in the Intelligent Interactive Systems Group led by Dr Per Ola Kristensson. This led to a collaboration on the publication "Performance Envelopes of In-Air Direct and Smartwatch Indirect Control for Head-Mounted Augmented Reality", which was presented at IEEE VR 2018.
- Mixed Reality
- Multi-Modal Feedback
- Biometric Data
- User Adaptation
Dissertation Goal
Interaction with head-mounted displays (HMDs) can be seen as a continuous loop of user input and system output. In its most fundamental form, this loop is driven by head rotation: head movement is measured by internal and external sensors, which update the virtual camera in the virtual scene; a new image is then rendered from the updated perspective and presented to the user. To improve user experience and presence in the scene, a virtual environment should address more than just the users' visual and auditory channels and should adapt to their specific needs.
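The input/output loop described above can be illustrated with a minimal sketch. This is a hypothetical one-axis simplification for illustration only (yaw rotation, a 2D view matrix, and the function names are my assumptions), not the actual HMD rendering pipeline:

```python
import math

def yaw_to_view(yaw_deg):
    """Build a 2x2 rotation matrix for the virtual camera from a
    measured head yaw angle (simplified to a single axis)."""
    r = math.radians(yaw_deg)
    return [[math.cos(r), -math.sin(r)],
            [math.sin(r),  math.cos(r)]]

def render_frame(sensor_yaw_deg):
    """One iteration of the loop: read tracking data, update the
    virtual camera, and return the camera's new forward direction,
    standing in for the image rendered from the new perspective."""
    view = yaw_to_view(sensor_yaw_deg)
    forward = [view[0][0], view[1][0]]  # rotated forward axis
    return forward
```

Each real frame repeats this cycle: sensor pose in, updated camera out, so any additional output channel (haptic, thermal) or biometric input can be slotted into the same loop.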
To explore the potential of user adaptivity in mixed-reality environments, my research focuses on multi-modal input and output concepts and on dynamic application content driven by biometric-feedback loops. Initial projects evaluated multi-modal input for augmented reality (AR), an AR framework for cognitively impaired users, and immersive multi-modal output for virtual environments.
While the real world provides humans with a huge variety of sensory stimuli, virtual worlds communicate their properties primarily through visual and auditory feedback, owing to the design of current head-mounted displays (HMDs). Since HMDs offer sufficient contact area to integrate additional actuators, prior work has used a limited number of haptic actuators to convey corresponding information about the virtual world.
The scarcity of established input methods for augmented reality (AR) head-mounted displays (HMDs) motivates us to investigate the performance envelopes of two easily realisable solutions: indirect cursor control via a smartwatch and direct control by in-air touch.
Since 360 degree movies are a fairly new medium, creators face several challenges, such as guiding the viewer's attention. In traditional movies this is done with cuts and tracking shots, which are neither possible nor advisable in VR: rotating the virtual scene in front of the user's eyes can induce simulator sickness. One of the reasons this effect occurs is that the physical movement (measured by the vestibular system) and the visual movement are not coherent.
GyroVR uses head-worn flywheels designed to render inertia in Virtual Reality (VR). Motions such as flying, diving, or floating in outer space generate kinesthetic forces on our body which impede movement and are currently not represented in VR. GyroVR simulates those kinesthetic forces by attaching flywheels to the user's head, leveraging the gyroscopic resistance that arises when the spin axis of a rotating mass is changed.
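The gyroscopic effect GyroVR exploits follows from basic rigid-body dynamics: a flywheel with moment of inertia I spinning at angular velocity ω carries angular momentum L = Iω, and turning the head at rate Ω about a perpendicular axis produces a reaction torque of magnitude τ = Ω·L. A small sketch with illustrative numbers (the values below are my assumptions, not the device's actual parameters):

```python
def gyro_torque(inertia_kgm2, spin_rad_s, head_turn_rad_s):
    """Reaction torque (N*m) felt when turning the head at
    head_turn_rad_s about an axis perpendicular to a flywheel
    spinning at spin_rad_s with moment of inertia inertia_kgm2."""
    angular_momentum = inertia_kgm2 * spin_rad_s  # L = I * omega
    return head_turn_rad_s * angular_momentum     # tau = Omega * L

# e.g. a 0.0005 kg*m^2 flywheel at 1000 rad/s resisting a 1 rad/s head turn:
# gyro_torque(0.0005, 1000, 1.0) -> 0.5 N*m of perceived resistance
```

The key consequence, matching the description above, is that the resistance scales with both the flywheel's spin speed and the speed of the head movement, so a motor-controlled flywheel can modulate perceived inertia at runtime.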
OctiCam is a mobile and child-friendly device that consists of a stuffed toy octopus on the outside and a communication proxy on the inside. Relying only on two squeeze buttons in the tentacles, we simplified the interaction with OctiCam to a child-friendly level. A built-in microphone and speaker allow audio chats, while a built-in camera streams a 360 degree video using a fish-eye lens.
This project deals with a novel multi-screen interactive TV setup (smarTVision) and its enhancement through Companion-Technology. Due to their flexibility and the variety of interaction options, such multi-screen scenarios are hardly intuitive for the user. While research so far has focused on technology and features, the users themselves are often not considered adequately. Companion-Technology has the potential to make such interfaces truly user-friendly. Building upon smarTVision, its extension via concepts of Companion-Technology is envisioned. This combination represents a versatile test bed that not only can be used for evaluating the usefulness of Companion-Technology in a TV scenario, but can also serve to evaluate Companion-Systems in general.
ColorSnakes is an authentication mechanism based solely on software modification which provides protection against shoulder surfing and, to some degree, against video attacks. A ColorSnakes PIN consists of a colored starting digit followed by four consecutive digits. From the colored starting digit, users indirectly draw a path (the selection path) consisting of their PIN. The input path can be drawn anywhere on the grid.
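The core idea of indirect, position-independent path entry can be sketched as follows. This is an illustrative simplification under my own assumptions (a plain digit grid, a path given as cell coordinates, and the function name), not the published scheme's exact rules:

```python
def path_matches_pin(grid, path, pin):
    """Check whether a path traced over a digit grid spells the PIN.

    grid: 2D list of digits (the on-screen grid)
    path: list of (row, col) cells the user traced, in order
    pin:  the secret digit string

    Because only the traced digit sequence matters, the path can start
    anywhere on the grid, which is what frustrates shoulder surfers:
    the touched positions alone do not reveal a fixed PIN location.
    """
    traced = "".join(str(grid[r][c]) for r, c in path)
    return traced == pin
```

In the actual scheme the starting digit is additionally identified by its color, so even observing the full path leaves an attacker uncertain which traced sequence is the PIN.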
Lectures, Projects and Seminars
- Don't Sweat It!: Exploring Asymmetrical Biometric Feedback in Collaborative Gaming (MA, Shyukryan Karaosmanoglu, 2019)
- VRJumping: Exploring Jump-Based Locomotion in Virtual Reality (MA, Christopher Kunder, 2019)
- The Impact of Discrete Rotations on Locomotion in VR (BA, Laura Bottner, 2019)
- FootVR: Exploring Foot-Controlled Locomotion in Virtual Reality (BA, Anja Schikorr, 2019)
- Fall Prevention in VR (BA, Sarah Schaupp, 2019)
Open Topics for Theses
You can find an overview of all available thesis topics on our homepage.
If you are interested in writing a thesis, please send me an email or drop by my office. Possible topics could be:
- Haptic feedback in AR/VR
- Integration of biometric signals for VR environments
- Transition between virtual and physical reality
- Integration of vibro-tactile, thermal, and EMS actuators into a virtual reality head-mounted display for increased immersion (MA, Leo Hnatek, 2017)
- Exploring Hand-Tracking and Smartwatch-Based Pointing in Virtual Reality (MA, Suhasaleem Holalkere, 2018)
- An Augmented Reality Framework for Assisted Activities in Dementia Therapy (MA, Daniel Besserer, 2018)
- C-AR: Improving Situational Awareness for Automated Driving Level 3 through Multi-User Augmented Reality Interaction (MA, David Klein, 2018)
- Adaptive AR Interfaces Via EEG (BA, Tobias Wagner, 2019)
- An In-Depth Evaluation of and Compensation for the Heisenberg Effect (BA, Marco Combosch, 2019)
- An Evaluation of Path Visualization and Positional Tracking for the Purpose of Indoor Navigation Using the HoloLens and Bluetooth Low Energy Beacons (BA, Linus Hunziger)
- Conference Reviewer: CHI '17, '18, '19, MobileHCI '18, PerDis '17, GI '19
- Conference Presenter: IEEE VR '18, UIST '18