2017

Improving Input Accuracy on Smartphones for Persons who are Affected by Tremor using Motion Sensors

Having a hand tremor often complicates interactions with touchscreens on mobile devices. Due to the uncontrollable oscillations of both hands, hitting targets can be hard, and interaction can be slow. Correcting input needs additional time and mental effort.
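
The summary above does not describe the filtering itself; as a purely illustrative sketch (all names, parameters, and sensor formats below are assumptions, not the published method), touch coordinates could be stabilized by a low-pass filter whose strength scales with the tremor energy measured by the device's accelerometer:

import math

class TremorStabilizer:
    """Illustrative touch smoothing driven by accelerometer data; not the published algorithm."""

    def __init__(self, base_alpha=0.5):
        self.base_alpha = base_alpha    # smoothing factor when no tremor is detected
        self.fx = self.fy = None        # filtered touch position

    def tremor_energy(self, accel_samples):
        # crude tremor estimate: variance of the accelerometer magnitude over a short window
        mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel_samples]
        mean = sum(mags) / len(mags)
        return sum((m - mean) ** 2 for m in mags) / len(mags)

    def update(self, x, y, accel_samples):
        # stronger tremor -> smaller alpha -> heavier smoothing of the touch point
        alpha = self.base_alpha / (1.0 + self.tremor_energy(accel_samples))
        if self.fx is None:
            self.fx, self.fy = float(x), float(y)
        else:
            self.fx += alpha * (x - self.fx)
            self.fy += alpha * (y - self.fy)
        return self.fx, self.fy

# hypothetical usage: one touch sample plus a short window of (ax, ay, az) readings in m/s^2
stabilizer = TremorStabilizer()
print(stabilizer.update(100, 200, [(0.1, 9.8, 0.2), (0.3, 9.6, -0.1), (0.0, 9.9, 0.1)]))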

Read more ...

PocketThumb: a Wearable Dual-Sided Touch Interface for Cursor-based Control of Smart-Eyewear

We present PocketThumb, a wearable touch interface for smart-eyewear that is embedded into the fabric of the front trouser pocket. The interface is reachable from both the outside and the inside of the pocket, allowing for combined dual-sided touch input.

Read more ...

inScent: a Wearable Olfactory Display as an Amplification for Mobile Notifications

We present inScent, a wearable olfactory display that can be worn in mobile everyday situations and allows the user to receive personal scented notifications.

Read more ...

ShareVR: Enabling Co-Located Experiences for Virtual Reality between HMD and Non-HMD Users

ShareVR is a proof-of-concept prototype using floor projection and mobile displays in combination with positional tracking to visualize the virtual world for the Non-HMD user, enabling them to interact with the HMD user and become part of the VR experience.

Read more ...

2016

BodySign: Evaluating the Impact of Assistive Technology on Communication Quality Between Deaf and Hearing Individuals

Deaf individuals often experience communication difficulties in face-to-face interactions with hearing people. We investigate the impact of real-time, translation-based assistive technologies on communication quality between deaf and hearing individuals.

Read more ...

CarVR: Enabling In-Car Virtual Reality Entertainment

CarVR enables virtual reality entertainment in moving vehicles. We enhance the VR experience by matching the kinesthetic forces of the car's movements to the virtual experience.

Read more ...

Carvatar

Carvatar was built to increase trust in automation through social cues. The prototype can imitate human behavior, such as humanoid gaze behavior, and can call attention to specific situations. The avatar can be used to establish cooperative communication between driver and vehicle.

Read more ...

GyroVR: Simulating Inertia in Virtual Reality using Head Worn Flywheels

GyroVR uses head-worn flywheels designed to render inertia in Virtual Reality (VR). Motions such as flying, diving, or floating in outer space generate kinesthetic forces on our body which impede movement and are currently not represented in VR. GyroVR simulates these kinesthetic forces by attaching flywheels to the user's head, leveraging the gyroscopic resistance that arises when the orientation of the spinning axis is changed. GyroVR is an ungrounded, wireless, and self-contained device, allowing the user to move freely inside the virtual environment...
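
The gyroscopic resistance mentioned above follows the standard rigid-body relation: reorienting a flywheel with angular momentum L = I*omega_spin at head angular velocity omega_head requires a torque tau = omega_head x L, and the head feels the equal and opposite reaction. A minimal sketch with made-up values (not GyroVR's actual hardware parameters):

import numpy as np

# illustrative values only, not GyroVR's specifications
I_flywheel = 2.0e-4                                       # moment of inertia about the spin axis, kg*m^2
omega_spin = np.array([0.0, 0.0, 2000 * 2 * np.pi / 60])  # flywheel spinning at 2000 rpm about z, rad/s
omega_head = np.array([1.0, 0.0, 0.0])                    # head turning about x at 1 rad/s

L = I_flywheel * omega_spin        # angular momentum of the flywheel
tau = np.cross(omega_head, L)      # torque needed to reorient the flywheel as the head turns

# the head experiences the reaction -tau, perpendicular to both the spin axis and the head's
# rotation axis, which is perceived as resistance to the movement
print(tau, -tau)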

Read more ...

CircularSelection: Optimizing List Selection for Smartwatches

As smartwatches become more widely available, small round touchscreens are increasingly used. Until recently, round touchscreens were rather uncommon, so most current user interfaces for small round touchscreens are still based on rectangular interface designs. Adjusting these standard rectangular interfaces to round touchscreens without losing content comes at the cost of precious display space. To overcome this issue for list interfaces, we introduce CircularSelection, a list selection interface especially designed for small round touchscreens.

Read more ...

FusionKit: Multi-Kinect fusion for markerless and marker-based tracking in HCI

FusionKit is a software suite developed in the course of the research group's participation in the INTERACT project. It aims to be a low-cost tracking system based on multiple Kinect time-of-flight cameras, enabling markerless body tracking as well as marker-based object tracking for a broad range of HCI and tracking scenarios. The toolkit is to be released as fully open-source software and can be used and modified by industry, researchers, and other groups interested in an easy-to-set-up, affordable tracking solution.
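
The fusion step itself is not described here; the general idea behind merging observations from several calibrated depth cameras can be sketched as a confidence-weighted average in a shared world frame (the extrinsics, weighting, and function names below are assumptions for illustration, not FusionKit's API):

import numpy as np

def fuse_joint(observations):
    """Fuse one skeleton joint seen by several calibrated cameras.
    observations: list of (xyz_in_camera_frame, camera_to_world_4x4, confidence)."""
    positions, weights = [], []
    for xyz, cam_to_world, confidence in observations:
        world = cam_to_world @ np.append(xyz, 1.0)   # transform into the shared world frame
        positions.append(world[:3])
        weights.append(confidence)
    return np.average(positions, axis=0, weights=weights)

# two cameras with made-up extrinsics: identity, and a 1 m translation along x
T1 = np.eye(4)
T2 = np.eye(4)
T2[0, 3] = 1.0
print(fuse_joint([(np.array([0.5, 1.0, 2.0]), T1, 0.9),
                  (np.array([-0.5, 1.0, 2.0]), T2, 0.8)]))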

Read more ...


Solving Conflicts for Multi-User Mid-Air Gestures for TVs

In recent years, mid-air gestures have become a feasible input modality for controlling and manipulating digital content. For controlling TVs, mid-air gestures eliminate the need to hold remote controls, which quite often are not at hand or even need to be searched for before use. Thus, mid-air gestures quicken interactions. However, the absence of a single controller and the nature of mid-air gesture detection also pose a disadvantage: gestures performed by multiple viewers may result in conflicts. In this paper, we propose an interaction technique that solves the conflicts arising in such multi-viewer scenarios.

Read more...

SwiVRChair: A Motorized Swivel Chair to Nudge Users’ Orientation for 360 Degree Storytelling

Since 360-degree movies are a fairly new medium, creators face several challenges, such as controlling the attention of the viewer. In traditional movies this is done with cuts and tracking shots, which is not possible or advisable in VR, since rotating the virtual scene in front of the user's eyes leads to simulator sickness. One reason this effect occurs is that the physical movement (sensed by the vestibular system) and the visual movement are not coherent.

Read more ...

FaceTouch: Touch Interaction for Mobile Virtual Reality

We present FaceTouch, a mobile Virtual Reality (VR) head-mounted display (HMD) that leverages the backside as a touch-sensitive surface. FaceTouch allows the user to point at and select virtual content inside their field of view by touching the corresponding location on the backside of the HMD, utilizing their sense of proprioception. This allows for rich interaction (e.g. gestures) in mobile and nomadic scenarios without having to carry additional accessories (e.g. a gamepad). We built a prototype of FaceTouch and present interaction techniques and three example applications that leverage the FaceTouch design space.

Read more...

2015


Towards Accurate Cursorless Pointing: The Effects of Ocular Dominance and Handedness

Pointing gestures are our natural way of referencing distant objects and are thus widely used in HCI for controlling devices. Due to the inherent inaccuracies of current pointing models, most systems using pointing gestures so far rely on visual feedback showing users where they are pointing. However, in many environments, e.g. smart homes, it is rarely possible to display cursors since most devices do not contain a display.
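
A common cursorless pointing model, and the one the ocular-dominance question bears on, casts a ray from the (dominant) eye through the fingertip and intersects it with the target plane. The sketch below uses assumed coordinates and illustrates that general model, not the paper's exact implementation:

import numpy as np

def pointed_target(eye, fingertip, plane_point, plane_normal):
    """Intersect the eye-to-fingertip ray with a display plane; coordinates in meters."""
    direction = fingertip - eye
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None                                     # ray is parallel to the plane
    t = np.dot(plane_normal, plane_point - eye) / denom
    return eye + t * direction if t > 0 else None       # point on the plane, or None if behind the user

eye = np.array([0.03, 1.65, 0.0])        # assumed position of the dominant eye
finger = np.array([0.25, 1.40, 0.6])     # assumed position of the outstretched fingertip
wall_point = np.array([0.0, 1.0, 3.0])   # a wall 3 m in front of the user
wall_normal = np.array([0.0, 0.0, -1.0])
print(pointed_target(eye, finger, wall_point, wall_normal))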

Read more...

OctiCam: An immersive and mobile video communication device for parents and children

OctiCam is a mobile and child-friendly device that consists of a stuffed toy octopus on the outside and a communication proxy on the inside. Relying only on two squeeze buttons in the tentacles, we simplified the interaction with OctiCam to a child-friendly level. A built-in microphone and speaker allow audio chats, while a built-in camera streams a 360-degree video using a fish-eye lens.

Read more ...

How Companion-Technology can Enhance a Multi-Screen Television Experience: A Test Bed for Adaptive Multimodal Interaction in Domestic Environments

This project deals with a novel multi-screen interactive TV setup (smarTVision) and its enhancement through Companion-Technology. Due to their flexibility and the variety of interaction options, such multi-screen scenarios are hardly intuitive for the user. While research so far has focused on technology and features, the users themselves are often not considered adequately. Companion-Technology has the potential to make such interfaces truly user-friendly. Building upon smarTVision, its extension via concepts of Companion-Technology is envisioned. This combination represents a versatile test bed that not only can be used to evaluate the usefulness of Companion-Technology in a TV scenario, but can also serve to evaluate Companion-Systems in general.

Read more ...

ColorSnakes: Using Colored Decoys to Secure Authentication in Sensitive Contexts 

ColorSnakes is an authentication mechanism based solely on software modification that provides protection against shoulder surfing and, to some degree, against video attacks. A ColorSnakes PIN consists of a starting colored digit followed by four consecutive digits. Starting from the colored digit, users indirectly draw a path (the selection path) consisting of their PIN. The input path can be drawn anywhere on the grid.

Read more ...

Belt: An Unobtrusive Touch Input Device for Head-worn Displays

Belt is a novel unobtrusive input device for wearable displays that incorporates a touch surface encircling the user’s hip. The wide input space is leveraged for a horizontal spatial mapping of quickly accessible information and applications. We discuss social implications and interaction capabilities for unobtrusive touch input and present our hardware implementation and a set of applications that benefit from the quick access time. 

Read more ...

Glass Unlock: Enhancing Security of Smartphone Unlocking through Leveraging a Private Near-eye Display

Glass Unlock is a novel concept that uses smart glasses for smartphone unlocking and is theoretically secure against smudge attacks, shoulder surfing, and camera attacks. Because an additional temporary secret, such as the layout of the digits, is shown only on the private near-eye display, attackers cannot make sense of the input they observe on the almost empty phone screen.
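
No implementation details are given in this summary; the core idea of a digit layout that exists only on the glasses can be sketched as follows (function and variable names are hypothetical):

import random

def new_layout():
    # random assignment of the digits 0-9 to the ten keypad positions, shown only on the glasses
    digits = list("0123456789")
    random.shuffle(digits)
    return digits                      # layout[position] = digit displayed at that position

def unlock(pin, layout, tapped_positions):
    # the phone records only positions on an unlabeled grid; the digit mapping never appears on it
    entered = "".join(layout[p] for p in tapped_positions)
    return entered == pin

layout = new_layout()
# a legitimate user reads the layout on the near-eye display and taps the matching blank positions
positions = [layout.index(d) for d in "1234"]
print(unlock("1234", layout, positions))   # True; an observer of the phone sees only positions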

Read more ...

2014

UbiBeam: An Interactive Projector-Camera System for Domestic Deployment

We conducted an in-situ user study, visiting 22 households and exploring specific use cases and ideas for portable projector-camera systems in a domestic environment. Using a grounded-theory approach, we identified several categories such as interaction techniques, presentation space, placement, and use cases. Based on our observations, we designed and implemented UbiBeam, a domestically deployable projector-camera system.

Read more...

P.I.A.N.O.: Faster Piano Learning with Interactive Projection

We designed P.I.A.N.O., a piano learning system with interactive projection that facilitates a fast learning process. Note information in the form of an enhanced piano roll notation is projected directly onto the instrument, allowing notes to be mapped to piano keys without prior sight-reading skills. Three learning modes support the natural learning process with live feedback and performance evaluation. P.I.A.N.O. supports faster learning, requires significantly less cognitive load, provides a better user experience, and increases perceived musical quality compared to sheet music notation and non-projected piano roll notation.
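
The projection mapping itself is not detailed here; as an illustrative sketch (keyboard geometry simplified, names hypothetical), a note from the piano roll could be placed over the corresponding key by computing its horizontal position on an 88-key keyboard:

WHITE_PITCH_CLASSES = {0, 2, 4, 5, 7, 9, 11}   # C, D, E, F, G, A, B

def key_x(midi_note, white_key_width_mm=23.5, lowest_note=21):
    """Approximate horizontal position of a key on an 88-key keyboard (A0 = MIDI note 21)."""
    whites_below = sum(1 for n in range(lowest_note, midi_note)
                       if n % 12 in WHITE_PITCH_CLASSES)
    if midi_note % 12 in WHITE_PITCH_CLASSES:
        return (whites_below + 0.5) * white_key_width_mm   # center of the white key
    return whites_below * white_key_width_mm               # black key, roughly at the white-key boundary

print(key_x(60))   # approximate x offset of middle C, in millimeters from the left edge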

Read more...

Hover Pad: Interacting with Autonomous and Self-Actuated Displays in Space

In this work, we investigate the use of autonomous, self-actuated displays that can freely move and hold their position and orientation in space without the need for users holding them at all times. We illustrate various stages of such a display’s autonomy ranging from manual to fully autonomous, which – depending on the tasks – facilitate the interaction. Further, we discuss possible motion control mechanisms for these displays and present several interaction techniques made possible by such displays. We designed a toolkit – Hover Pad – that enables exploring five degrees of freedom of self-actuated and autonomous displays and the developed control and interaction techniques.

Read more...

ASSIST: Gaze- and Gesture-Based Assistive Systems for Users with Motor Impairments

Due to age or health conditions, many people live with varying impairments of their motor skills and mobility. They can no longer perform many tasks in their homes independently or depend on the help of others. The goal of ASSIST is to remedy this. The project investigates to what extent functions in smart living spaces can be selected and controlled using gestures and gaze.

Read more ...

Pervasive Information through Constant Personal Projection: The Ambient Mobile Pervasive Display (AMP-D)

We introduce the concept of an Ambient Mobile Pervasive Display (AMP-D), a wearable projector system that constantly projects an ambient information display in front of the user. The floor display provides serendipitous access to public and personal information. It is combined with a projected display on the user's hand, forming a continuous interaction space that is controlled by hand gestures. The AMP-D prototype illustrates the challenges involved concerning hardware, sensing, and visualization, and shows several application examples.

Read more...

SurfacePhone: A Mobile Projection Device for Single- and Multiuser Everywhere Tabletop Interaction

In this work we present SurfacePhone, a novel configuration of a projector phone that aligns the projector to project onto a physical surface, allowing ad-hoc tabletop-like interaction in a mobile setup. The projection is created behind the upright-standing phone and is touch- and gesture-enabled. Multiple projections can be merged to create shared spaces for multi-user collaboration.

Read more ...

Broken Display = Broken Interface? The Impact of Display Damage on Smartphone Interaction

This work is the first to assess the impact of touchscreen damage on smartphone interaction. We gathered a dataset consisting of 95 closeup images of damaged smartphones and extensive information about a device’s usage history, damage severity, and impact on use. Further interviews revealed that users adapt to damage with diverse coping strategies, closely tailored to specific interaction issues. Based on our results, we proposed guidelines for interaction design in order to provide a positive user experience when display damage occurs.

Read more...

2013

Penbook: Bringing Pen+Paper Interaction to a Tablet Device to Facilitate Paper-Based Workflows in the Hospital Domain

In many contexts, pen and paper are the ideal option for collecting information despite the pervasiveness of mobile devices. Reasons include the unconstrained nature of sketching or handwriting, as well as the tactility of moving a pen over paper, which supports very fine-grained control of the pen. In the context of hospitals in particular, many writing and note-taking tasks are still performed using pen and paper. This work presents the Penbook concept, details specific applications in a hospital context, and presents a prototype implementation of Penbook.

Read more ...

From the Private Into the Public

Interactive horizontal surfaces provide large semi-public or public displays for co-located collaboration. In many cases users want to show, discuss, and copy personal information or media, which are typically stored on their mobile phones, on such a surface. This paper presents three novel direct interaction techniques (Select&Place2Share, Select&Touch2Share, and Shield&Share) that allow users to select in private which information they want to share on the surface. All techniques are based on physical contact between mobile phone and surface. Users touch the surface with their phone or place it on the surface to determine the location for information or media to be shared. We compared these three techniques with the most frequently reported approach that immediately shows all media files on the table after placing the phone on a shared surface. 

Read more...

PointerPhone

We present the concept and design space of PointerPhone, which enables users to directly point at objects on a remote screen with their mobile phone and interact with them in a natural and seamless way. We detail the design space and distinguish three categories of interactions, including low-level interactions that use the mobile phone as a precise and fast pointing device as well as an input and output device. We further detail the category of widget-level interactions. In addition, we demonstrate versatile high-level interaction techniques and show their application in a collaborative presentation scenario. Based on the results of a qualitative study, we provide design implications for application designs.

Read more...

Extending Mobile Interfaces Using External Screens

We present an approach that allows users to establish an ad-hoc connection between their mobile device and an external display by holding the phone against the border of the external screen, in order to temporarily extend the mobile user interface across both the mobile and the external screen. This allows users to take advantage of existing large displays in their environment by spanning the mobile application's user interface across multiple displays, which lets more information be displayed at once.

Read more...

2012

MobiSurf: Improving Co-located Collaboration through Integrating Mobile Devices and Interactive Surfaces

In this work, we investigated how combining personal devices with an interactive surface, together with a simple way of exchanging information between them, changes the way people solve collaborative tasks compared to an existing approach of using personal devices alone. Our study results clearly indicate that the combination of personal devices and a shared device allows users to fluently switch between individual and group work phases, and that users take advantage of both device classes.

Read more...

Investigating Mid-Air Pointing Interaction for Projector Phones

Projector phones, mobile phones with built-in projectors, might significantly change the way we use and interact with mobile phones. This project explores, for the first time, the potential of combining the mobile and the projected display, as well as the potential of the mid-air space between them. Results from two studies with several gesture pointing techniques indicate that interacting behind the phone yields the highest performance, albeit with a twice as high error rate. They further show that mobile applications benefit from the projection, e.g. by overcoming the fat-finger problem on touchscreens and increasing the visibility of small objects.

Read more...

A Cross Device Interaction Style

Natural forms of interaction have evolved for personal devices that we carry with us (mobiles) as well as for shared interactive displays around us (surfaces), but interaction across the two remains cumbersome in practice. We propose a novel cross-device interaction style for mobiles and surfaces that uses the mobile for tangible input on the surface in a stylus-like fashion. Building on the direct manipulation that we can perform on either device, it facilitates fluid and seamless interaction spanning across device boundaries. We provide a characterization of the combined interaction style in terms of input, output, and contextual attributes, and demonstrate its versatility by implementing a range of novel interaction techniques for mobile devices on interactive surfaces.

Read more...

Don't Queue Up! User Attitudes Towards Mobile Interactions with Public Terminals.

Public terminals for service provision offer high convenience to users due to their constant availability. Yet, interaction with them lacks security and privacy, as it takes place in a public setting. Additionally, users have to wait in line until they can interact with the terminal. In comparison, personal mobile devices allow for private service execution. Since many services, like withdrawing money from an ATM, require physical presence at the terminal, hybrid approaches have been developed that move parts of the interaction to a mobile device. In this work we present the results of a four-week real-world user study in which we investigated whether hybrid approaches would actually be used.

Read more...

2011

Interactive Phone Call

Smartphones provide access to large amounts of personal data and functionality, but during phone calls the phone cannot be used much beyond voice communication and does not offer support for synchronous collaboration. This is because, first, despite the availability of alternatives, the phone is typically held to one's ear; and second, the small mobile screen is less suited for use with existing collaboration software. This work presents a novel in-call collaboration system that leverages projector phones, as they provide a large display that can project an interactive interface anytime and anywhere while the phone is held to the ear.

Read more...