OmniTouch projection interface makes the world your touchscreen
(THE END OF KEYBOARDS, SCREENS, AND THE MOUSE)
Sometimes you just want to make notes on your forearm. Put that permanent marker down, though, because PhD student Chris Harrison and colleagues at Microsoft Research have created a new system that allows touchscreen interaction on hairy and uneven surfaces. It uses a short-range depth camera instead of the infrared sensor we've seen on similar devices, which allows it to gauge the viewing angle and other characteristics of the surfaces being used -- and it can even handle pinch-to-zoom.
We present OmniTouch, a novel wearable system that enables graphical, interactive, multitouch input on arbitrary, everyday surfaces. Our shoulder-worn implementation allows users to manipulate interfaces projected onto the environment (e.g., walls, tables), held objects (e.g., notepads, books), and their own bodies (e.g., hands, lap). A key contribution is our depth-driven template matching and clustering approach to multitouch finger tracking. This enables on-the-go interactive capabilities, with no calibration, training or instrumentation of the environment or the user, creating an always-available interface.
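The paper doesn't spell out the finger tracker here, but the core idea named in the abstract -- depth-driven template matching -- can be illustrated with a toy sketch: scan each row of the depth map for a dip that is roughly finger-width wide and noticeably closer to the camera than its neighbors. The code below is a minimal, hypothetical illustration in Python, not the authors' implementation; the thresholds, the function name, and the single-row scope are all assumptions for demonstration.

```python
import numpy as np

# Assumed constants for illustration only -- not from the paper.
FINGER_WIDTH_MM = (5, 25)   # plausible finger cross-section width
EDGE_DROP_MM = 20           # how much farther away the neighbors must be
FLATNESS_MM = 5             # max depth variation within the finger run

def find_finger_slices(depth_row, mm_per_px):
    """Scan one row of a depth map (values in mm) for finger-like dips:
    a run of near-constant depth, roughly finger-width wide, whose left
    and right neighbors are noticeably farther from the camera.
    Returns a list of (center_pixel, center_depth_mm) candidates."""
    slices = []
    w_min = int(FINGER_WIDTH_MM[0] / mm_per_px)
    w_max = int(FINGER_WIDTH_MM[1] / mm_per_px)
    n = len(depth_row)
    i = 0
    while i < n:
        # Grow a run of pixels at approximately the same depth.
        j = i
        while j + 1 < n and abs(depth_row[j + 1] - depth_row[i]) < FLATNESS_MM:
            j += 1
        width = j - i + 1
        if w_min <= width <= w_max:
            left = depth_row[i - 1] if i > 0 else np.inf
            right = depth_row[j + 1] if j + 1 < n else np.inf
            center = depth_row[(i + j) // 2]
            # A finger should be closer to the camera than both neighbors.
            if left - center > EDGE_DROP_MM and right - center > EDGE_DROP_MM:
                slices.append(((i + j) // 2, float(center)))
        i = j + 1
    return slices

# Synthetic example: flat background at 800 mm, one finger at 600 mm.
row = np.full(50, 800.0)
row[20:28] = 600.0
print(find_finger_slices(row, mm_per_px=2.0))
```

In a full pipeline, candidate slices from consecutive rows would then be clustered into finger paths, and a fingertip's depth relative to the surface behind it would distinguish hovering from clicking -- which is how a system like this can work with no per-user calibration.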