When using illustration or drawing software, it's necessary to first choose a tool, such as a pen, brush or even a piece of charcoal, from a menu. And although the on-screen effect varies for each tool, the illustrator still grips the stylus the same way. Similarly, when using a smartphone, a person typically taps an app to select a function, such as phone or camera. And then the user holds the phone a particular way to carry out that function, whether it's making a call or taking a picture.

But soon it could be all in the grip. Researchers are working on a way to engage the function of an electronic device simply by the way it's held. They call it FlexAura, for "flexible near-surface range sensor." It would allow an illustrator, for example, to hold a stylus like a brush and immediately engage the brush tool in the software. Or if a person held her phone like a camera, the camera app would pop up. 

Shenwei Liu and François Guimbretière at Cornell University presented their prototype at the ACM Symposium on User Interface Software and Technology (UIST) in Boston.


The technology itself is a set of infrared LEDs wrapped around a pen-shaped stylus, which is connected to a computer. The LEDs sit alongside phototransistors that pick up how the infrared light is reflected off a person's hand as they hold the stylus. That allows the software to create a kind of topographic map of the shape of the grip, which indicates how the pen is being held. "The pressure of the grip doesn't matter," Guimbretière, an associate professor of information science at Cornell, told Discovery News. "What matters is the shape of the grip."

When the user grips the "pen" in a different way, the software can read that as input and change tools, as in the drawing program example.
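
As a rough illustration of that tool-switching step, one could imagine each grip producing a vector of reflected-IR intensities that gets matched against stored grip templates. The sensor count, template values and tool names below are invented for illustration; the researchers' actual recognition method may differ.

```python
import math

# Hypothetical 8-sensor grip "signatures": normalized reflected-IR
# intensities for two canonical grips (all values invented).
TEMPLATES = {
    "pen_tool":   [0.9, 0.8, 0.1, 0.0, 0.0, 0.1, 0.7, 0.9],
    "brush_tool": [0.3, 0.4, 0.6, 0.8, 0.8, 0.6, 0.4, 0.3],
}

def classify_grip(reading):
    """Return the tool whose template is nearest (Euclidean) to the reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda name: dist(reading, TEMPLATES[name]))

# A reading close to the brush template selects the brush tool.
print(classify_grip([0.2, 0.5, 0.7, 0.7, 0.9, 0.5, 0.3, 0.2]))  # brush_tool
```

Nearest-template matching is just the simplest possible classifier; the point is that the grip map itself becomes the input signal.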

The next step, he said, is to build a wireless connection, because the prototype's wired link is a bit cumbersome. He added that it might be possible to use a system of capacitive touch sensors to accomplish the same thing. That would lend itself better to devices such as phones, since LEDs tend to be power-hungry and battery life is at a premium. The difficulty is that capacitive touch sensors only tell you where a hand is touching the device, not how far away from the surface it is.
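
The range information that sets the infrared approach apart can be sketched with a toy inverse-square model: reflected intensity falls off with the square of the distance to the hand, so a measured intensity can be inverted into an approximate range, something a contact-only capacitive sensor cannot provide. The reference constants here are invented for illustration, not taken from the FlexAura hardware.

```python
# Invented calibration constants for the toy model.
REF_INTENSITY = 1.0   # intensity measured at the reference distance
REF_DISTANCE = 10.0   # reference distance in millimeters

def estimate_range_mm(intensity):
    """Invert an inverse-square falloff to recover an approximate distance."""
    if intensity <= 0:
        return float("inf")  # no reflection: nothing within sensing range
    return REF_DISTANCE * (REF_INTENSITY / intensity) ** 0.5

print(estimate_range_mm(1.0))   # 10.0 mm (at the reference distance)
print(estimate_range_mm(0.25))  # 20.0 mm (quarter intensity = double distance)
```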

Credit: Shenwei Liu / François Guimbretière

Via: "FlexAura: A Flexible Near-Surface Range Sensor," presented at UIST