Breaking the Screen: Interaction Across Touchscreen Boundaries in Virtual Reality for Mobile Knowledge Workers

Abstract

Virtual Reality (VR) has the potential to transform knowledge work. One advantage of VR knowledge work is that it allows extending 2D displays into the third dimension, enabling new operations such as selecting overlapping objects or displaying additional layers of information. However, mobile knowledge workers often rely on established mobile devices, such as tablets, which confine interaction to a small input space. This challenge of a constrained input space is intensified when VR knowledge work takes place in cramped environments, such as airplanes and touchdown spaces.

In this paper, we investigate the feasibility of interacting jointly between an immersive VR head-mounted display and a tablet in the context of knowledge work. Specifically, we design, implement, and study how to interact with information that reaches beyond a single physical touchscreen in VR; design and evaluate a set of interaction concepts; and build example applications and gather user feedback on them. Finally, we show the feasibility of a fully mobile application using camera-based tracking of the tablet screen.

Media

Videos

Teaser

Presentation at IEEE ISMAR 2020

Code

The code for our mobile implementation is available at https://gitlab.com/mixedrealitylab/breakingthescreen
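At its core, camera-based tracking of a tablet screen amounts to estimating a homography between the screen's known corner coordinates and their detected positions in the camera image; once that mapping is known, any screen-space point can be projected into (or out of) the camera view. The sketch below is our own minimal illustration of that step using the standard direct linear transform (DLT), not code from the repository; the function names and the assumption that the four screen corners have already been detected are ours.

```python
import numpy as np

def homography_from_corners(screen_pts, image_pts):
    """Estimate the 3x3 homography mapping tablet-screen coordinates to
    camera-image coordinates from four corner correspondences (DLT).

    screen_pts, image_pts: sequences of four (x, y) pairs in matching order.
    """
    A = []
    for (x, y), (u, v) in zip(screen_pts, image_pts):
        # Each correspondence contributes two linear constraints on the
        # nine homography entries (stacked row-major as a vector h).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the null-space vector of A, i.e. the right singular
    # vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def project(H, pt):
    """Map a screen-space point through H into image space."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

For example, with the four corners of a 1920x1200 screen and their detected image positions, `homography_from_corners` recovers a mapping under which each screen corner projects exactly onto its detected image location; a real pipeline would obtain `image_pts` from a corner or marker detector each frame.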

Publication

Verena Biener, Daniel Schneider, Travis Gesslein, Alexander Otte, Bastian Kuth, Per Ola Kristensson, Eyal Ofek, Michel Pahud & Jens Grubert. Breaking the Screen: Interaction Across Touchscreen Boundaries in Virtual Reality for Mobile Knowledge Workers. In IEEE Transactions on Visualization and Computer Graphics (TVCG), vol. 26, no. 12, 2020, pp. 3490-3502. arXiv preprint | DOI | Data

Replication

Verena Biener & Jens Grubert. Content Transfer Across Multiple Screens with Combined Eye-Gaze and Touch Interaction - A Replication Study. In Adjunct Proceedings of the IEEE International Symposium on Mixed and Augmented Reality 2022 (ISMAR 2022), pp. 270-273. arXiv preprint