Estimating 3D Finger Angle on Commodity Touchscreens
Today’s touch interfaces are primarily driven by the 2D location of touch events. However, there are many other dimensions of touch that can be captured and utilized interactively. By increasing the richness of touch input, users can do more in the same small space, potentially enabling richer applications. In this research, we describe a new method that estimates a finger’s angle relative to the screen. The angular vector is described using two angles – altitude and azimuth – more colloquially referred to as pitch and yaw. Our approach works in tandem with conventional multitouch finger tracking, offering two additional analog degrees of freedom for a single touch point.
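The paper does not spell out the angle convention mathematically, but a standard spherical-coordinate reading of altitude and azimuth can be sketched as follows. This is an illustrative assumption, not the authors' estimation algorithm: it merely shows how a 3D finger-direction vector maps onto the two reported angles, with `z` taken as pointing out of the screen.

```python
import math

def finger_angles(direction):
    """Convert an assumed 3D finger-direction vector (x, y, z) into
    altitude (pitch) and azimuth (yaw) angles, in degrees.

    Convention (an assumption, not from the paper): (x, y) lie in the
    screen plane and z points out of the screen toward the user.
    """
    x, y, z = direction
    # Azimuth (yaw): rotation of the finger's projection in the screen plane.
    azimuth = math.degrees(math.atan2(y, x))
    # Altitude (pitch): elevation of the finger above the screen plane,
    # 0 degrees for a flat finger, 90 degrees for one held perpendicular.
    altitude = math.degrees(math.atan2(z, math.hypot(x, y)))
    return altitude, azimuth

# A finger lying flat along +x: altitude 0, azimuth 0.
print(finger_angles((1.0, 0.0, 0.0)))   # -> (0.0, 0.0)
# A finger perpendicular to the screen: altitude 90.
print(finger_angles((0.0, 0.0, 1.0)))   # -> (90.0, 0.0)
```

These two angles are the "two additional analog degrees of freedom" the abstract describes, reported alongside the conventional 2D touch location.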
Our new algorithm simultaneously estimates finger pitch and yaw across a wide range of poses, from flat to perpendicular. Uniquely, it uses only data provided by commodity touchscreen devices, with no additional hardware or sensors. We prototyped our solution on two platforms – a smartphone and smartwatch – each fully self-contained and operating in real time. We quantified the accuracy of our technique through a user study, and explored the feasibility of our approach through example applications and interactions.
Research Team: Robert Xiao, Julia Schwarz, Chris Harrison
Robert Xiao, Julia Schwarz, and Chris Harrison. 2015. Estimating 3D Finger Angle on Commodity Touchscreens. In Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces (ITS '15). Association for Computing Machinery, New York, NY, USA, 47–50. DOI:https://doi.org/10.1145/2817721.2817737