Screenless mobile devices achieve maximum mobility, but at the expense of the visual feedback that is generally assumed to be necessary for spatial interaction. With Imaginary Interfaces we re-enable spatial interaction on screenless devices. Users point and draw in the empty space in front of them or on the palm of their hands. While they cannot see the results of their interaction, they do obtain some visual feedback by watching their hands move. Our user studies show that Imaginary Interfaces allow users to create simple drawings, to annotate with them, and to operate interfaces, as long as their layout mimics a physical device they have used before.
While our main goal is to create and explore ultra-mobile devices, Imaginary Interfaces and interfaces designed for the visually impaired have interesting similarities and differences worth exploring. In particular, we plan to explore the value derived from the extra feedback users obtain from watching their hands interact. Exploring this and related questions will help us better understand Imaginary Interfaces, and at the same time it will allow us to discover which aspects of our technology can inform the design of interfaces for the visually impaired.
Wearable computing has often been predicted as the next big thing in computing, and yet users seem reluctant to adapt to its requirements. This research proposes that users can create and adapt their own interfaces using gestures captured by a chest-worn camera pendant. Wearing a pendant is much less obtrusive than, say, a coat or another piece of clothing. Translating hand gestures into GUI commands could have many widespread uses, many of which relate to my area of concern: audio production.
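The translation from recognized gestures to GUI commands can be thought of as a simple dispatch table. The sketch below is purely illustrative: the gesture labels (`swipe_right`, `pinch`, and so on) and the audio-production commands they map to are assumptions, not part of the proposed system, and a real implementation would receive labels from a vision pipeline running on the pendant camera feed.

```python
# Hypothetical sketch: dispatching recognized hand gestures to audio commands.
# All gesture names and command strings here are illustrative placeholders.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class GestureDispatcher:
    """Maps gesture labels to commands and records what was issued."""
    bindings: Dict[str, str] = field(default_factory=lambda: {
        "swipe_right": "transport.play",
        "swipe_left": "transport.stop",
        "pinch": "mixer.volume_down",
        "spread": "mixer.volume_up",
    })
    issued: List[str] = field(default_factory=list)

    def handle(self, gesture: str) -> str:
        # Unbound gestures fall through to a no-op rather than raising,
        # so spurious detections do not disrupt the session.
        command = self.bindings.get(gesture, "noop")
        self.issued.append(command)
        return command


dispatcher = GestureDispatcher()
print(dispatcher.handle("swipe_right"))  # transport.play
print(dispatcher.handle("wave"))         # noop (unbound gesture)
```

Because users are meant to create and adapt their own interfaces, the key design choice is that `bindings` is plain data: remapping a gesture is an entry update, not a code change.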