MRTouch: Adding Touch Input to Head-Mounted Mixed Reality (IEEE VR ’18)

We present MRTouch, a novel multitouch input solution for head-mounted mixed reality systems. Our system enables users to reach out and directly manipulate virtual interfaces affixed to surfaces in their environment, as though they were touchscreens. Touch input is precise, tactile, and comfortable, and naturally complements existing popular modalities such as voice and hand gesture. Our research prototype combines depth and infrared camera streams with real-time detection and tracking of surface planes to enable robust finger tracking even when both the hand and head are in motion. Our technique is implemented on a commercial Microsoft HoloLens without requiring additional hardware or any user or environmental calibration. In our performance evaluation, we demonstrate high input accuracy, with an average positional error of 5.4 mm and a 95% button size of 16 mm, across 17 participants, 2 surface orientations, and 4 surface materials. Finally, we demonstrate the potential of our technique to enable on-world touch interactions through 5 example applications.
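The full pipeline (infrared-assisted finger segmentation, plane detection and tracking, touch classification) is described in the paper; as a rough, hypothetical illustration of the plane-relative touch test such a system implies, the sketch below fits a plane to depth points and flags points hovering within a small band above it. The function names, thresholds, and synthetic data are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered points is the direction of least variance, i.e. the normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

def detect_touches(cloud, centroid, normal, band=(0.005, 0.015)):
    """Return cloud points whose height above the plane falls in `band` (meters).
    The band values are illustrative, not taken from the paper."""
    heights = np.abs((cloud - centroid) @ normal)  # normal sign is arbitrary
    mask = (heights > band[0]) & (heights < band[1])
    return cloud[mask]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic depth data: a noisy 0.5 m x 0.5 m tabletop plus one point
    # 8 mm above it, standing in for a fingertip near contact.
    table = np.c_[rng.uniform(0.0, 0.5, (500, 2)), rng.normal(0.0, 0.001, 500)]
    fingertip = np.array([[0.25, 0.25, 0.008]])
    cloud = np.vstack([table, fingertip])

    centroid, normal = fit_plane(table)
    print(detect_touches(cloud, centroid, normal))  # prints the fingertip point
```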

Download PDF

Xiao, R., Schwarz, J., Throm, N., Wilson, A., and Benko, H. 2018. MRTouch: Adding Touch Input to Head-Mounted Mixed Reality. IEEE Transactions on Visualization and Computer Graphics (TVCG), Special Issue, 24(4), 1653-1660.



