Minput: Enabling Interaction on Small Mobile Devices with High-Precision, Low-Cost, Multipoint Optical Tracking

The mass production of optical mice has made the highly sophisticated sensors on which they rely very inexpensive. Additionally, advances in electronics and optics have yielded sensors that are both small and extremely precise. A generic optical mouse, costing only a few dollars, can capture and compare surface images several thousand times per second. This high precision enables operation on a variety of surfaces, both conventional (e.g., mouse pads, tables) and ad hoc (e.g., palms, pants, bed covers, papers). By contrast, vision-based interaction techniques tend to heavily tax even modern mobile device processors and batteries. Fortunately, the optical sensors we employ have dedicated, highly efficient processors that handle most of the computation with negligible power consumption.

The central idea behind Minput is simple: place two optical tracking sensors on the back of a very small device. This allows the whole device to be manipulated for input, enabling many interesting interactions with excellent physical affordances. It could allow for the creation of, for example, a matchbook-sized media player that is essentially all screen on its front side. The device could be operated on any convenient surface, e.g., a table or palm. The use of two tracking elements enables not only conventional x/y tracking, but also rotation, providing a more expressive design space. Rotation is calculated from the difference in the velocities of the two sensors.
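As a rough illustration of this idea, the sketch below estimates per-frame translation and rotation from the displacement reports of two sensors. The sensor separation, coordinate layout, and function name are hypothetical, not taken from the paper; it assumes the two sensors lie on the device's x-axis and uses a small-angle approximation for rotation.

```python
import math

# Hypothetical distance between the two optical sensors, in mm.
SENSOR_SEPARATION_MM = 20.0

def device_motion(delta_a, delta_b, separation=SENSOR_SEPARATION_MM):
    """Estimate device translation and rotation for one sampling interval.

    delta_a, delta_b: (dx, dy) displacements reported by sensors A and B.
    The sensors are assumed to sit on the device's x-axis, `separation`
    units apart (A on the left, B on the right).
    """
    (ax, ay), (bx, by) = delta_a, delta_b
    # Translation: the mean of the two displacement vectors.
    tx, ty = (ax + bx) / 2.0, (ay + by) / 2.0
    # Rotation: the difference in motion perpendicular to the baseline,
    # relative to the sensor separation (small-angle rigid-body model).
    theta = math.atan2(by - ay, separation)
    return (tx, ty), theta

# Identical deltas -> pure translation, zero rotation.
# Opposite y-deltas -> pure rotation about the device center.
```
When both sensors report the same displacement, the device translated without rotating; when their displacements differ, the perpendicular component of that difference gives the rotation angle.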

Research Team: Chris Harrison, Scott Hudson

Awards: Best Paper Honorable Mention award at CHI 2010


Chris Harrison and Scott E. Hudson. 2010. Minput: enabling interaction on small mobile devices with high-precision, low-cost, multipoint optical tracking. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10). Association for Computing Machinery, New York, NY, USA, 1661–1664. DOI: https://doi.org/10.1145/1753326.1753574

Additional Media

Other than the paper PDF, all media on this page are shared under a Creative Commons BY 3.0 license.