ActiTouch: Robust Touch Detection for On-Skin AR/VR Interfaces

Contemporary AR/VR systems rely on in-air gestures or handheld controllers for interactivity. This overlooks the skin as a convenient surface for tactile, touch-driven interactions, which are generally more accurate and comfortable than free-space interactions. In response, we developed ActiTouch, a new electrical method that enables precise on-skin touch segmentation by using the body as an RF waveguide. We combine this method with computer vision, enabling a system with both high tracking precision and robust touch detection. Our system requires no cumbersome instrumentation of the fingers or hands; it needs only a single wristband (e.g., a smartwatch) and sensors integrated into an AR/VR headset. We quantify the accuracy of our approach through a user study and demonstrate how it can enable touchscreen-like interactions on the skin.
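The sensor fusion described above could be sketched roughly as follows: an RF-amplitude channel segments touch state (contact raises the received signal, since the body couples transmitter and receiver), while computer vision supplies the fingertip position. All thresholds, class names, and signal values here are illustrative assumptions, not details from the paper.

```python
class TouchSegmenter:
    """Hypothetical fusion of an RF touch channel with a CV-tracked fingertip.

    Assumes rf_amplitude is a normalized received-signal level that rises
    on skin contact; thresholds use hysteresis to suppress flicker.
    """

    def __init__(self, on_thresh=0.6, off_thresh=0.4):
        self.on_thresh = on_thresh    # amplitude above this -> touch begins
        self.off_thresh = off_thresh  # amplitude below this -> touch ends
        self.touching = False

    def update(self, rf_amplitude, fingertip_xy):
        """Return the fingertip position while touching, else None."""
        if not self.touching and rf_amplitude >= self.on_thresh:
            self.touching = True
        elif self.touching and rf_amplitude <= self.off_thresh:
            self.touching = False
        return fingertip_xy if self.touching else None


if __name__ == "__main__":
    seg = TouchSegmenter()
    print(seg.update(0.1, (5, 5)))  # hovering: no touch reported
    print(seg.update(0.7, (5, 6)))  # contact: position passes through
    print(seg.update(0.5, (5, 7)))  # hysteresis keeps the touch alive
    print(seg.update(0.3, (5, 8)))  # release: touch ends
```

The hysteresis band (two thresholds rather than one) is a common design choice for contact sensing, preventing spurious touch-up/touch-down events when the signal sits near a single cutoff.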

Research Team: Yang Zhang, Wolf Kienzle, Yanjun Ma, Shiu S. Ng, Hrvoje Benko, and Chris Harrison


Zhang, Y., Kienzle, W., Ma, Y., Ng, S., Benko, H. and Harrison, C. 2019. ActiTouch: Precise Touch Segmentation for On-Skin VR/AR Interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New Orleans, USA, October 20 - 23, 2019). UIST '19. ACM, New York, NY. 1151–1159.