EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets

Smartphone-based virtual reality (VR) headsets have become increasingly popular since the introduction of the Google Cardboard in 2014. These devices adapt a user's existing smartphone into a powerful VR headset, leveraging the phone's sensing and rendering capabilities to enable rich VR experiences without the expense and complexity of a dedicated VR headset. Eye sensing has been shown to be a powerful and useful input modality in VR experiences, ranging from foveated rendering of complex scenes, to social VR applications and gaze-responsive non-player characters (NPCs) in games. However, existing VR eye sensing approaches require special illumination and sensors, or even instrument the eye itself, which is incompatible with a commodity smartphone approach. Furthermore, commercial eye trackers such as FOVE and Tobii eye tracking for the HTC Vive generally cost hundreds to thousands of dollars. Ideally, we want a way to robustly sense a wearer's eye using only a phone's built-in hardware.

In this work, we show how a smartphone inserted into a VR headset can capture a user's left periocular region through its front-facing camera, while the screen doubles as an illumination source. Unlike traditional gaze estimation approaches, illumination from the screen does not appear as a well-defined "eye glint"; instead, it appears as a large reflection on the pupil. The screen also produces a reflection on the inexpensive optics of commodity VR headsets (often plastic lenses of poor optical clarity). Further challenges stem from headset misalignment, camera autofocus issues, and low illumination when screen content is dark.

Nonetheless, we show that these images of the eye can enable four interesting sensing modalities: worn/not-worn detection, blink detection, user identification, and coarse gaze estimation. Our approach must contend with a number of technical challenges, including non-uniformity of the light source, occlusion of the pupil by reflections, and inconsistent sensor placement. We describe our software-only eye sensing implementation, which we call EyeSpyVR, and present the results of a series of studies involving 70 participants. We demonstrate useful eye-sensing capabilities without requiring additional or specialized hardware, allowing advanced features to be enabled with little more than a software update. We envision VR application developers leveraging these techniques to open new interactive possibilities and enable more immersive VR experiences on commodity VR headsets.
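EyeSpyVR's actual classifiers are learned from periocular camera images; the paper does not specify a simple rule-based method. Purely as a toy illustration of the worn/not-worn modality, the hypothetical sketch below classifies a grayscale camera frame by mean brightness, relying on the intuition that a worn headset encloses the camera in a dim cavity lit only by screen reflections, while an unworn headset sees ambient light. The function names and threshold are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch: worn/not-worn detection from mean frame brightness.
# EyeSpyVR's real pipeline uses learned models on periocular images; this
# threshold rule only illustrates the sensing idea, not the actual system.

def mean_brightness(frame):
    """Average intensity of a grayscale frame given as a 2-D list (0-255)."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def is_worn(frame, threshold=40):
    """A worn headset encloses the camera, so the scene is dim (lit only by
    screen reflections); an unworn headset sees brighter ambient light."""
    return mean_brightness(frame) < threshold

# Tiny synthetic frames standing in for camera captures.
dark = [[10] * 4 for _ in range(4)]     # enclosed cavity: headset worn
bright = [[200] * 4 for _ in range(4)]  # ambient room light: not worn
print(is_worn(dark), is_worn(bright))   # → True False
```

A real implementation would have to cope with the confounds the text describes, such as dark screen content lowering brightness even when the headset is worn, which is one reason a learned model is preferable to a fixed threshold.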

Research Team: Karan Ahuja, Rajul Islam, Varun Parashar, Kuntal Dey, Chris Harrison, and Mayank Goel


Ahuja, K., Islam, R., Parashar, V., Dey, K., Harrison, C. and Goel, M. 2018. EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (UbiComp '18) 2, 2, Article 57 (July 2018), 10 pages. ACM, New York, NY.