Typing minus touch
Typing minus touch? A PhD student at the University of Copenhagen's Department of Computer Science has developed a method to compose text on an interactive screen using mid-air gestures and a virtual keyboard. The method, already presented on Discovery News, shows promise for the healthcare sector, where bacteria-laden keyboards are best avoided. Other potential applications abound.
Anders Markussen, a PhD student at the University of Copenhagen's Department of Computer Science (DIKU), has developed a method to enter text on a screen without direct touch by gesturing word spellings mid-air onto a virtual keyboard. Inspiration for the project comes from the Kinect, a motion-sensing input device for Xbox game consoles that eliminates the need for a handheld controller.
Healthcare sector requires a sterile communications platform
The method is well suited to the healthcare sector, where users could avoid bacteria-covered keyboards and tablet screens. Anders Markussen says the application would also be valuable in sterile environments, where personnel must not come into contact with anything while writing notes or looking up information. He continues:
"One can imagine other areas of application – for example, writing across windows. This might be helpful if a store has a contest where customers communicate across a storefront window after the store has closed. There are many possibilities", concludes Anders.
The method's greatest challenge is to reach an acceptable speed. While text entry speed has improved to a reasonable rate in lab tests, Markussen and research colleagues at DIKU have yet to attain a rate that can compete with classic keyboard or touchscreen text entry.
Project awarded at world's largest HCI conference
Markussen's Vulture project was presented with a Best Paper Award at the world's largest Human-Computer Interaction (HCI) conference, CHI 2014, which recently took place in Toronto, Canada. Anders Markussen was met with great applause; his paper was judged among the best of the roughly 2,000 submitted. During the conference, Markussen was interviewed by Discovery Canada. The video below demonstrates the method in action.
Technical description of the Vulture project
Keyboards that allow users to shape letters into words via a virtual keyboard already exist for touch devices, including mobile phones. Until now, however, the technique had never been adapted for mid-air text entry.
"Vulture: A Mid-Air Word-Gesture Keyboard" deploys a new type of keyboard that is activated by users who draw words, mid-air. Vulture adapts touch based word-gesture algorithms to work in mid-air, projects users' movement onto a display, and uses pinch as a word delimiter. An initial 10-session study delivered text-entry rates of 20.6 Words Per Minute (WPM) and found that hand-movement speed was the primary predictor of WPM.
A second study demonstrated that, after training on a few phrases, participants were able to reach 28.1 WPM, 59% of the text-entry rate of direct touch input. Participants' recall of trained gestures in mid-air was low, suggesting that visual feedback was important but also limited performance.
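The core idea behind word-gesture keyboards like the one Vulture adapts can be illustrated with a minimal sketch: the path drawn between pinch-down and pinch-up is resampled and compared against the "ideal" path through each candidate word's key centers, and the closest word wins. This is not the authors' implementation; the key layout, the lexicon, and the distance measure below are simplified assumptions for illustration.

```python
# Minimal sketch of word-gesture matching (illustrative, not Vulture's code):
# a drawn path is scored against each word's ideal path through key centers.
import math

# Hypothetical key-center coordinates for the top two QWERTY rows (unit grid).
KEYS = {
    'q': (0, 0), 'w': (1, 0), 'e': (2, 0), 'r': (3, 0), 't': (4, 0),
    'y': (5, 0), 'u': (6, 0), 'i': (7, 0), 'o': (8, 0), 'p': (9, 0),
    'a': (0.5, 1), 's': (1.5, 1), 'd': (2.5, 1), 'f': (3.5, 1),
    'g': (4.5, 1), 'h': (5.5, 1), 'j': (6.5, 1), 'k': (7.5, 1),
    'l': (8.5, 1),
}

def resample(path, n=32):
    """Resample a polyline to n points spaced equally along its arc length."""
    if len(path) == 1:
        return path * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = path[j], path[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def match(drawn, lexicon):
    """Return the lexicon word whose ideal key-center path lies closest
    (mean point-to-point distance after resampling) to the drawn path."""
    def cost(word):
        a = resample(drawn)
        b = resample([KEYS[c] for c in word])
        return sum(math.hypot(ax - bx, ay - by)
                   for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return min(lexicon, key=cost)

# A gesture roughly tracing t -> h -> e should match "the".
gesture = [(4, 0), (5.5, 1), (2, 0)]
print(match(gesture, ["the", "fish", "quit"]))  # → the
```

In the mid-air setting described above, the pinch gesture would mark where each such path begins and ends, replacing the finger-down and finger-up events of a touch screen.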
The Vulture project is an offshoot of the Wallviz project, a four-year project investigating the use of large interactive screens. Wallviz is financed by the Danish Council for Strategic Research and runs through the end of 2014.
On the basis of the study data, the research group is investigating ways to improve Vulture, including alternative designs for mid-air text entry. The Vulture paper's co-authors are Professor Kasper Hornbæk and Associate Professor Mikkel Rønne Jakobsen.