COMPUTING RESEARCH HIGHLIGHT OF THE WEEK [April 23 - 30, 2010]
Skin As Input For Smart Phones and Mobile Devices
A combination of simple bio-acoustic sensors and some sophisticated machine learning makes it possible for people to use their fingers or forearms - potentially, any part of their bodies - as touchpads to control smart phones or other mobile devices.
The technology, called Skinput, was developed by Chris Harrison, a third-year Ph.D. student in Carnegie Mellon University's Human-Computer Interaction Institute (HCII), along with Desney Tan and Dan Morris of Microsoft Research. Harrison will describe the technology in a paper to be presented on Monday, April 12, at CHI 2010, the Association for Computing Machinery's annual Conference on Human Factors in Computing Systems in Atlanta, Ga.
Skinput could help people take better advantage of the tremendous computing power now available in compact devices that can be easily worn or carried. The diminutive size that makes smart phones, MP3 players and other devices so portable, however, also severely limits the size and utility of the keypads, touchscreens and jog wheels typically used to control them.
"With Skinput, we can use our own skin - the body's largest organ - as an input device," Harrison said. "It's kind of crazy to think we could summon interfaces onto our bodies, but it turns out to make a lot of sense. Our skin is always with us, and makes the ultimate interactive touch surface."
In a prototype developed while Harrison was an intern at Microsoft Research last summer, acoustic sensors are attached to the upper arm. These sensors capture sound generated by such actions as flicking or tapping fingers together, or tapping the forearm. This sound is not transmitted through the air, but by transverse waves through the skin and by longitudinal, or compressive, waves through the bones.
Harrison and his colleagues found that the tap of each fingertip, a tap to one of five locations on the arm, or a tap to one of 10 locations on the forearm produces a unique acoustic signature that machine learning programs could learn to identify. These computer programs, which improve with experience, were able to determine the signature of each type of tap by analyzing 186 different features of the acoustic signals, including frequency and amplitude.
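The pipeline described above - extract spectral features from a tap's vibration signal, then let a trained classifier map those features to a body location - can be sketched in miniature. The following is only an illustrative toy, not the actual Skinput system: the sample rate, the synthetic tap signals, the small band-energy feature vector (the real system used 186 features), and the nearest-centroid classifier are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 5500   # sample rate in Hz (an assumption, not from the article)
N = 256     # samples per tap window (also an assumption)

def synth_tap(center_hz, n=N):
    """Synthetic stand-in for a bio-acoustic tap: a decaying tone plus noise.
    Real taps would come from the piezo vibration sensors on the arm."""
    t = np.arange(n) / FS
    return np.exp(-40 * t) * np.sin(2 * np.pi * center_hz * t) \
        + 0.05 * rng.standard_normal(n)

def features(signal, n_bands=16):
    """Toy feature vector: mean FFT magnitude in a handful of frequency
    bands. Skinput analyzed 186 features; this keeps only a few."""
    mag = np.abs(np.fft.rfft(signal))
    return np.array([b.mean() for b in np.array_split(mag, n_bands)])

# Train a nearest-centroid classifier on synthetic taps from three
# hypothetical locations, each assumed to have a distinct dominant frequency.
locations = {"fingertip": 300.0, "wrist": 600.0, "forearm": 900.0}
centroids = {
    name: np.mean([features(synth_tap(f0)) for _ in range(20)], axis=0)
    for name, f0 in locations.items()
}

def classify(signal):
    """Assign a tap to the location whose training centroid is nearest."""
    feats = features(signal)
    return min(centroids, key=lambda n: np.linalg.norm(feats - centroids[n]))

print(classify(synth_tap(600.0)))
```

The design point the article makes survives even in this toy: the sensor output alone is ambiguous, and it is the learned mapping from acoustic features to locations that makes skin usable as an input surface.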
In a trial involving 20 subjects, the system classified the inputs with 88 percent accuracy overall. Accuracy depended in part on the proximity of the sensors to the input: forearm taps were identified with 96 percent accuracy when the sensors were attached below the elbow, and with 88 percent accuracy when they were above the elbow. Finger flicks were identified with 97 percent accuracy.
"There's nothing super sophisticated about the sensor itself," Harrison said, "but it does require some unusual processing. It's sort of like the computer mouse - the device mechanics themselves aren't revolutionary, but are used in a revolutionary way." The sensor is an array of highly tuned vibration sensors - cantilevered piezo films.
The prototype armband includes both the sensor array and a small projector that can superimpose colored buttons onto the wearer's forearm, which can be used to navigate through menus of commands. Additionally, a keypad can be projected on the palm of the hand. Simple devices, such as MP3 players, might be controlled simply by tapping fingertips, without need of superimposed buttons; in fact, Skinput can take advantage of proprioception - a person's sense of body configuration - for eyes-free interaction.
Though the prototype is of substantial size and designed to fit the upper arm, the sensor array could easily be miniaturized so that it could be worn much like a wristwatch, Harrison said.
Testing indicates the accuracy of Skinput is reduced in heavier, fleshier people and that age and sex might also affect accuracy. Running or jogging also can generate noise and degrade the signals, the researchers report, but the amount of testing was limited and accuracy likely would improve as the machine learning programs receive more training under such conditions.
Skinput is an extension of an earlier invention by Harrison called Scratch Input, which used acoustic microphones to enable users to control cell phones and other devices by tapping or scratching on tables, walls or other surfaces.
Researchers:
Chris Harrison (Carnegie Mellon University)
Desney Tan (Microsoft Research)
Dan Morris (Microsoft Research)
Computing Research Highlight of the Week is a service of the Computing Community Consortium and the Computing Research Association designed to highlight some of the exciting and important recent research results in the computing fields. Each week a new highlight is chosen by CRA and CCC staff and volunteers from submissions from the computing community.