Estimating 3D Finger Angle on Commodity Touchscreens (ITS ’15)
Today’s touch interfaces are driven primarily by the 2D location of touch events. However, there are many other dimensions of touch that can be captured and used interactively. By enriching touch input, users can do more in the same small space, potentially enabling richer applications. In this research, we describe a new method that estimates a finger’s angle relative to the screen. The angular vector is described by two angles – altitude and azimuth – more colloquially known as pitch and yaw. Our approach works in tandem with conventional multitouch finger tracking, offering two additional analog degrees of freedom for a single touch point.
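As a minimal sketch of what the altitude/azimuth parameterization encodes, the two angles can be mapped to a 3D unit vector pointing along the finger. The specific conventions below (pitch measured up from the screen plane, yaw measured within it) are assumptions for illustration; the abstract only names the two angles.

```python
import math

def finger_direction(pitch_deg: float, yaw_deg: float):
    """Convert touch pitch/yaw into a 3D unit vector along the finger.

    Assumed conventions (illustrative only):
      pitch (altitude): 90 deg = finger perpendicular to the screen, 0 deg = lying flat.
      yaw (azimuth):    rotation of the finger within the screen plane.
    """
    pitch, yaw = math.radians(pitch_deg), math.radians(yaw_deg)
    x = math.cos(pitch) * math.cos(yaw)  # in-plane component along the yaw direction
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)                  # component pointing out of the screen
    return (x, y, z)
```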
Our new algorithm simultaneously estimates finger pitch and yaw across a wide range of poses, from flat to perpendicular. Uniquely, it uses only data provided by commodity touchscreen devices, requiring no additional hardware or sensors. We prototyped our solution on two platforms – a smartphone and a smartwatch – each fully self-contained and operating in real time. We quantified the accuracy of our technique through a user study, and explored the feasibility of our approach through example applications and interactions.
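To make the idea concrete, the toy sketch below estimates yaw from the principal axis of a capacitive touch blob and pitch from its elongation. This is not the published algorithm; it assumes access to a raw capacitive image patch (which commodity touchscreens capture internally) and uses a crude, illustrative mapping from blob shape to angle.

```python
import numpy as np

def estimate_angles(capacitance: np.ndarray):
    """Toy pitch/yaw estimate from a capacitive image patch (illustration only)."""
    ys, xs = np.nonzero(capacitance > 0)
    w = capacitance[ys, xs].astype(float)
    if w.size == 0 or w.sum() == 0:
        return None
    # Weighted centroid of the touch blob.
    cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
    # Second-order central moments describe the blob's shape.
    mu20 = np.average((xs - cx) ** 2, weights=w)
    mu02 = np.average((ys - cy) ** 2, weights=w)
    mu11 = np.average((xs - cx) * (ys - cy), weights=w)
    # Yaw: orientation of the blob's major axis (sign/quadrant would need extra cues).
    yaw_deg = np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))
    # Pitch: a flat finger leaves an elongated blob, a perpendicular finger a round one.
    eigvals = np.linalg.eigvalsh(np.array([[mu20, mu11], [mu11, mu02]]))
    elongation = np.sqrt(max(eigvals) / max(min(eigvals), 1e-6))
    pitch_deg = 90.0 / elongation  # crude monotone mapping, for illustration only
    return pitch_deg, yaw_deg
```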
Xiao, R., Schwarz, J., and Harrison, C. 2015. Estimating 3D Finger Angle on Commodity Touchscreens. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (Madeira, Portugal, November 15–18, 2015). ITS ’15. ACM, New York, NY, 47-50.