Steerable interfaces for pervasive computing spaces
Abstract
This paper introduces a new class of interactive interfaces that can be moved around to appear on ordinary objects and surfaces anywhere in a space. By dynamically adapting the form, function, and location of an interface to suit the context of the user, such steerable interfaces have the potential to offer radically new and powerful styles of interaction in intelligent pervasive computing spaces. We propose defining characteristics of steerable interfaces and present the first steerable interface system, which combines projection, gesture recognition, user tracking, environment modeling, and geometric reasoning components within a single system architecture. Our work suggests that there is great promise and rich potential for further research on steerable interfaces. © 2003 IEEE.