Around-Body Interaction: Sensing & Interaction Techniques for Proprioception-Enhanced Input with Mobile Devices
The space around the body provides a large interaction volume that can allow for 'big' interactions on 'small' mobile devices. Prior work has primarily focused on distributing information in the space around a user's body. We extend this by demonstrating three new types of around-body interaction: canvas, modal, and context-aware. To illustrate our approach, we implement these techniques in six demonstration applications. We also present a sensing solution that uses only standard smartphone hardware: the phone's front camera and inertial measurement unit (IMU). Our solution allows a person to interact with a mobile device by holding and positioning it anywhere between the normal field of view and the surrounding space around the body. By leveraging a user's proprioceptive sense, around-body interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.
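The abstract's sensing idea (combining front-camera visibility of the user with inertial tilt to tell whether the device is in view or somewhere around the body) can be illustrated with a minimal sketch. This is not the authors' actual algorithm; the zone names, thresholds, and the `SensorFrame`/`classify_zone` helpers are hypothetical, chosen only to show how the two signals might be fused:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    face_visible: bool  # does the front camera currently see the user's face?
    pitch_deg: float    # device tilt from the IMU (0 = held upright facing the user)

def classify_zone(frame: SensorFrame) -> str:
    """Illustrative heuristic: the camera decides in-view vs. out-of-view;
    tilt then disambiguates a coarse around-body region.
    Thresholds are arbitrary placeholders, not values from the paper."""
    if frame.face_visible:
        return "in-view"
    if frame.pitch_deg < -30:
        return "below-waist"
    if frame.pitch_deg > 30:
        return "above-head"
    return "periphery"
```

A modal technique could then map each zone to a different input mode, while a context-aware technique could trigger behavior only when a specific zone is entered.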
Research Team: Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, and Scott E. Hudson
Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, and Scott Hudson. 2014. Around-body interaction: sensing & interaction techniques for proprioception-enhanced input with mobile devices. In Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services (MobileHCI '14). Association for Computing Machinery, New York, NY, USA, 287–290. https://doi.org/10.1145/2628363.2628402