Wall++: Room-Scale Interactive and Context-Aware Sensing

Walls are everywhere, often making up more than half of indoor surface area, especially in residential and office buildings. In addition to delimiting spaces for both functional and social purposes, they also hide infrastructure, such as wiring and HVAC. However, walls are generally passive structural elements, offering no inherent interactive or computational capabilities (apart from small, attached silos such as thermostats and light switches), and thus present a tempting opportunity for augmentation, especially given their ubiquity.


In this work, we set out to identify methods that could recast walls as smart infrastructure, able to sense interactions and activities happening in a room. Beyond these broad application goals, we also imposed process constraints. In particular, we sought a technical approach that was versatile and easy to apply, requiring no special tools (e.g., CNC machines) or skills (e.g., carpentry, electrical engineering). We also required our approach to be low-cost, so as to be economically feasible at room scale (even a small room, e.g., 2 × 2.5 × 2.5 m, has more than 20 square meters of wall). Finally, we wanted our sensing approach to be minimally obtrusive, and ideally invisible.
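As a quick sanity check on that figure (treating 2 × 2.5 m as the floor plan and 2.5 m as the ceiling height):

$$ A_{\text{walls}} = 2(w + \ell)\,h = 2(2\,\text{m} + 2.5\,\text{m}) \times 2.5\,\text{m} = 22.5\,\text{m}^2 $$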


We quickly identified paint as a particularly attractive approach. Walls are already painted, and the average homeowner has the requisite skills to paint one. While there are special tools for applying paint (e.g., brushes, rollers, painter's tape), these are all commodity supplies, readily available at home improvement stores. As we discuss in greater depth later, a standard latex topcoat can be applied over our electrodes, allowing the technique to operate at wall scale while remaining hidden in plain sight. The method we ultimately selected costs roughly $20 per square meter in materials at retail prices. These properties satisfied all of our process criteria.
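Combining this with the example room above gives a rough, illustrative materials estimate:

$$ 22.5\,\text{m}^2 \times \$20/\text{m}^2 \approx \$450 $$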


To enable user and environmental sensing, we drew upon two large bodies of work in the literature. First, we selected mutual capacitive sensing for close-range interactions. Owing to its widespread use in smartphones and tablets, mutual capacitive sensing is well understood and robust, allowing us to readily adapt it to wall-scale applications. Second, we extended work in airborne electromagnetic (EM) sensing, in which the wall's electrodes act as a large antenna that picks up the characteristic EM noise emitted by electrical appliances. Supporting both modalities required us to develop an electrode pattern that serves both simultaneously. For user sensing, we investigated touch interaction, pose estimation, user identification, and tracking. For environment sensing, we focused on context awareness through appliance recognition and localization.

Collectively, we call our process, materials, patterns, sensor hardware, and processing pipeline Wall++. As we detail in the following pages, optimizing for ease of application and reliability, as well as sensing range and resolution, required iterative experimentation, physical prototyping, simulation modeling, and user studies. We believe the resulting system demonstrates new and interesting HCI capabilities and presents a viable path toward smarter indoor environments.
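To make the two sensing modalities concrete, the sketch below shows the row-column scanning scheme that mutual capacitive sensing uses, applied to a grid of painted electrodes. This is a minimal illustration only: the paper does not publish code, and the hardware calls (drive_row, read_column), grid size, and threshold are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
NUM_ROWS, NUM_COLS = 8, 8  # hypothetical electrode grid size

def drive_row(r: int) -> None:
    """Hypothetical hardware call: excite transmit electrode (row) r
    with an AC signal. A no-op in this sketch."""
    pass

def read_column(c: int) -> float:
    """Hypothetical hardware call: return the demodulated coupling
    strength at receive electrode (column) c. Stubbed with sensor
    noise here so the sketch is runnable."""
    return 100.0 + rng.normal(0.0, 0.1)

def scan_frame() -> np.ndarray:
    """Measure mutual capacitance at every row-column crossing,
    as a touchscreen controller would, but at wall scale."""
    frame = np.zeros((NUM_ROWS, NUM_COLS))
    for r in range(NUM_ROWS):
        drive_row(r)
        for c in range(NUM_COLS):
            frame[r, c] = read_column(c)
    return frame

baseline = scan_frame()  # reference capture of the untouched wall

def detect_touch(threshold: float = 5.0):
    """A hand near a crossing shunts field lines to ground, reducing
    the measured coupling; report the strongest drop, if any."""
    delta = baseline - scan_frame()
    r, c = np.unravel_index(np.argmax(delta), delta.shape)
    return (r, c) if delta[r, c] > threshold else None

print(detect_touch())  # None for an untouched (simulated) wall
```

For the airborne EM modality, appliance recognition can be framed as matching the spectrum of the signal induced on the wall electrodes against previously recorded appliance spectra. The nearest-neighbor matcher below is a deliberate simplification for illustration, not the paper's classifier:

```python
def em_signature(samples: np.ndarray) -> np.ndarray:
    """Normalized magnitude spectrum of the voltage induced on the
    wall electrodes, which act as one large receive antenna."""
    spectrum = np.abs(np.fft.rfft(samples))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def classify_appliance(samples: np.ndarray, signatures: dict) -> str:
    """Nearest-neighbor match against stored appliance spectra, e.g.
    signatures = {"lamp": em_signature(lamp_samples), ...}."""
    probe = em_signature(samples)
    return min(signatures, key=lambda k: np.linalg.norm(probe - signatures[k]))
```

Both sketches assume the same electrode grid, which is why a single electrode pattern serving both modalities was a central design requirement.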


Research Team: Yang Zhang, Chouchang Yang, Scott E. Hudson, Chris Harrison, and Alanson Sample


This research was conducted at Disney Research.

Additional media can be found on Yang Zhang's site.

Citation

Zhang, Y., Yang, C., Hudson, S., Harrison, C. and Sample, A. 2018. Wall++: Room-Scale Interactive and Context-Aware Sensing. In Proceedings of the 36th Annual SIGCHI Conference on Human Factors in Computing Systems (Montreal, Canada, April 21–26, 2018). CHI '18. ACM, New York, NY. Paper 273, 15 pages.