Towards Accurate Cursorless Pointing: The Effects of Ocular Dominance and Handedness
Pointing gestures are our natural way of referencing distant objects and are thus widely used in HCI for controlling devices. Due to the inherent inaccuracies of current pointing models, most systems using pointing gestures to date rely on visual feedback showing users where they are pointing. However, in many environments, e.g., smart homes, displaying a cursor is rarely possible since most devices do not contain a display. In this paper, we present two user studies showing that previous cursorless techniques are rather inaccurate, and that pointing accuracy can be significantly improved by accounting for users' handedness and ocular dominance. In the first user study (n=33), we reveal the large effect of ocular dominance and handedness on human pointing behavior. Current ray-casting techniques neglect this effect, precluding accurate cursorless selection. In the second user study (n=25), we show that accounting for ocular dominance and handedness yields significantly more accurate selections than two previously published ray-casting techniques. This underscores the importance of considering user characteristics when developing selection techniques, in order to foster robust and accurate selections.
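The eye-rooted ray-casting idea the abstract alludes to can be sketched as follows: the pointing ray originates at the user's dominant eye and passes through the fingertip, and the selected point is where that ray meets the target surface. This is a minimal illustrative sketch, not the paper's implementation; the function name, NumPy usage, and the planar-target assumption are all ours.

```python
import numpy as np

def pointing_ray_target(eye, fingertip, plane_point, plane_normal):
    """Intersect the dominant-eye -> fingertip ray with a planar target.

    Illustrative sketch of eye-rooted ray-casting (assumed geometry, not
    the paper's model). All coordinates are in meters in a shared frame.
    Returns the 3D hit point, or None if the ray misses the plane.
    """
    eye = np.asarray(eye, dtype=float)
    direction = np.asarray(fingertip, dtype=float) - eye
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the target plane
    t = np.dot(plane_normal, np.asarray(plane_point, dtype=float) - eye) / denom
    if t < 0:
        return None  # intersection lies behind the user
    return eye + t * direction

# Example: dominant right eye, fingertip half a meter ahead, wall at z = 3 m
hit = pointing_ray_target(eye=[0.03, 1.6, 0.0],
                          fingertip=[0.2, 1.4, 0.5],
                          plane_point=[0.0, 0.0, 3.0],
                          plane_normal=[0.0, 0.0, 1.0])
```

Rooting the ray at the dominant eye rather than, say, the head center or shoulder is one way a technique could account for ocular dominance; handedness would additionally affect which arm's fingertip defines the ray.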