A Cross-Device Interaction Style
Mobiles and surfaces, with their associated forms of interaction, have become ever more powerful and pervasive in our lives, but in separate ways. By mobiles we mean small devices of a personal nature that we carry with us for interaction; devices that are highly personalized, that store private data, and that act as a proxy for ourselves in the digital world. By surfaces we mean larger displays that we walk up to for interaction, that let us interact with content at larger scale, that are more public, and that afford sharing. In terms of interaction, rich and distinctive direct manipulation styles have evolved for each class: mobiles incorporate multimodal interfaces and can sense in 3D how they are being manipulated, while surfaces support multi-user and multi-touch interaction. However, in spite of the advances on each platform, it remains cumbersome in practice to interact across mobiles and surfaces.
There are compelling reasons for combined use of mobiles and surfaces, and for seamless interaction across the two. Mobiles are great for carrying data and media, while surfaces offer better scale for interaction with content. Mobiles give users control over personal data, while surfaces make it easy to share. Surfaces can be used by multiple users at the same time, while mobiles can be used in highly personalized ways.
In this work we investigate a novel cross-device interaction style that we envision as a generic platform for synergistic interaction with mobiles and surfaces across a variety of tasks and applications. The essence of this style is that the mobile device is used to select targets on a surface by direct touch, creating touch events that are associated with both a position on the surface and the identity of the mobile. We build on the recently introduced PhoneTouch technique, which demonstrated that this style can be implemented on multi-touch surfaces concurrently with finger touch sensing, by combining touch sensing embedded in the mobile with touch detection on the surface.

We investigate our vision of a generic interaction style for mobiles and surfaces in three steps. Our first step is to characterize the interactions that are enabled by fusing mobile and surface. We identify the fundamental input, output, and contextual attributes that define the building blocks for mobile-surface interaction techniques. The second step is to demonstrate our vision by implementing a range of interactions. We do this in the concrete setting of smartphone use on an interactive tabletop, building on the PhoneTouch method for sensing mobile touch events. For the purposes of our main argument, we show that our interaction style is generic and capable of underpinning a versatile range of interaction goals. At the same time, we show that the way in which we integrate mobiles is effective in addressing practical challenges in surface interaction, by introducing novel techniques for data transfer, personalization, user interface composition, authentication, localized and private feedback, and input expressiveness. Our third step is to illustrate our interaction style in a number of applications we have built, demonstrating the flow and fluidity of interaction across mobile and surface.
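The core of this sensing approach, combining a touch detected on the mobile with a contact detected on the surface, can be illustrated with a minimal sketch. The event names, data types, and time-window threshold below are illustrative assumptions rather than the actual PhoneTouch implementation: the idea is simply to pair a phone-sensed impact with the surface contact closest to it in time, producing an event that carries both a surface position and a device identity.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SurfaceContact:      # a contact point reported by the surface
    timestamp: float       # seconds
    x: float
    y: float

@dataclass
class PhoneBump:           # an impact reported by the phone (e.g. via accelerometer)
    timestamp: float
    device_id: str

@dataclass
class PhoneTouchEvent:     # fused event: surface position plus device identity
    x: float
    y: float
    device_id: str

# Illustrative matching window (seconds); a real system would tune this to
# sensor latency and the accuracy of clock synchronization between devices.
WINDOW = 0.05

def fuse(bump: PhoneBump, contacts: List[SurfaceContact]) -> Optional[PhoneTouchEvent]:
    """Pair a phone-sensed bump with the surface contact closest in time."""
    candidates = [c for c in contacts
                  if abs(c.timestamp - bump.timestamp) <= WINDOW]
    if not candidates:
        # No surface contact within the window: the bump cannot be localized.
        return None
    best = min(candidates, key=lambda c: abs(c.timestamp - bump.timestamp))
    return PhoneTouchEvent(best.x, best.y, bump.device_id)
```

Under this scheme, surface contacts that have no matching phone bump would simply be handled as ordinary finger touches, which is what allows phone touch and finger touch to coexist on the same surface.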