Today’s mobile computers provide omnipresent access to information, creation, and communication. It is undeniable that they have forever changed the way we work, play and interact. However, mobile interaction is far from solved: diminutive screens and buttons mar the user experience and prevent us from realizing these devices’ full potential.
We explored and prototyped a powerful alternative approach to mobile interaction that uses a body-worn projection/sensing system to capitalize on the tremendous surface area the real world provides. For example, the surface area of one hand alone exceeds that of a typical smartphone. Tables are often an order of magnitude larger than a tablet computer. If we could appropriate these ad hoc surfaces in an on-demand way, we could retain all of the benefits of mobility while simultaneously expanding interactive capability. However, turning everyday surfaces into interactive platforms requires sophisticated hardware and sensing. Further, to be truly mobile, such a system must either fit in the pocket or be wearable.
We present OmniTouch, a novel wearable system that enables graphical, interactive, multitouch input on arbitrary, everyday surfaces. Our shoulder-worn implementation allows users to manipulate interfaces projected onto the environment (e.g., walls, tables), held objects (e.g., notepads, books), and their own bodies (e.g., hands, lap). A key contribution is our depth-driven template matching and clustering approach to multitouch finger tracking. This enables on-the-go interactive capabilities, with no calibration, training or instrumentation of the environment or the user, creating an always-available interface.
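To give a flavor of how depth-driven finger tracking of this kind can work, the following is a minimal sketch, not the OmniTouch implementation: it scans each row of a depth map for narrow, cylinder-like spans that sit closer to the camera than their surroundings, then greedily links spans in adjacent rows into finger candidates. All names and thresholds (depth steps, finger widths, minimum run lengths) are hypothetical placeholders chosen for illustration.

```python
# Hypothetical parameters; real values depend on the depth camera and mounting.
FINGER_WIDTH_MM = (5, 25)      # plausible finger widths, in millimetres
DEPTH_STEP_MM = 20             # depth discontinuity treated as a finger edge
MAX_ROW_GAP = 2                # row gap allowed when linking slices
MIN_SLICES = 5                 # minimum run length to accept a finger

def find_finger_slices(depth_row, mm_per_px):
    """Scan one row of a depth map (millimetre values) for spans that are
    noticeably closer than their neighbors and about as wide as a finger."""
    slices = []
    start = None
    for x in range(1, len(depth_row)):
        step = int(depth_row[x]) - int(depth_row[x - 1])
        if step < -DEPTH_STEP_MM and start is None:
            start = x                       # sharp drop: near edge of a candidate
        elif step > DEPTH_STEP_MM and start is not None:
            width_mm = (x - start) * mm_per_px
            if FINGER_WIDTH_MM[0] <= width_mm <= FINGER_WIDTH_MM[1]:
                slices.append((start, x))   # sharp rise: far edge, width plausible
            start = None
    return slices

def cluster_slices(slices_per_row):
    """Greedily link overlapping slices in nearby rows into finger candidates."""
    fingers = []
    for y, row_slices in enumerate(slices_per_row):
        for (x0, x1) in row_slices:
            for finger in fingers:
                fy, (fx0, fx1) = finger[-1]
                if y - fy <= MAX_ROW_GAP and not (x1 < fx0 or x0 > fx1):
                    finger.append((y, (x0, x1)))
                    break
            else:
                fingers.append([(y, (x0, x1))])
    return [f for f in fingers if len(f) >= MIN_SLICES]

# Usage with one depth frame (an iterable of rows of millimetre values):
#   slices_per_row = [find_finger_slices(row, mm_per_px=1.5) for row in depth_frame]
#   fingers = cluster_slices(slices_per_row)
```

In practice a system like this would also need to decide when a tracked finger is actually touching the surface (e.g., by comparing fingertip depth against the surrounding surface), which the sketch above omits.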
I completed this work while I was at Microsoft Research in Redmond, WA.