LAS VEGAS — Waving your hand over a phone or tablet instead of touching the screen is cool in theory, but often jerky or temperamental in practice. Elliptic Labs is rewriting the rules of touch-free gestures using the sense of sound.
When you turn on gesture control, ultrasonic speakers begin emitting sound waves above 20 kHz, outside the range of human hearing. These waves hit your hand, then bounce back to the listening microphones. From there, software translates those reflected signals into actions.
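To make the echo-ranging idea concrete, here's a minimal Python sketch of time-of-flight distance estimation, the basic principle behind this kind of sensing. The names and numbers are illustrative only, not Elliptic Labs' actual software:

```python
# Minimal time-of-flight sketch: a pulse travels to the hand and back,
# so the one-way distance is half the round trip. Values are illustrative.

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 degrees C

def distance_from_echo(emit_time_s: float, echo_time_s: float) -> float:
    """Estimate hand distance from an ultrasonic pulse's round-trip time."""
    round_trip = echo_time_s - emit_time_s
    return SPEED_OF_SOUND * round_trip / 2.0

# Example: an echo arriving 0.9 ms after emission puts the hand ~15 cm away.
print(distance_from_echo(0.0, 0.0009))  # ~0.154 m
```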
Touchless inputs generally rely on camera optics or infrared to detect gestures, which means you need to be fairly close to the screen, and fairly precise, to trigger an action. Sound waves, however, give you a much larger field of action to work with, and the presence or absence of light makes no difference at all.
Elliptic Labs says it can give device-makers up to 180 degrees of sensitivity, which means you can wag your fingers and hands all around the device from inches away from the screen. There is such a thing as too much sensitivity, though; OEMs can rein in and fine-tune the parameters so users see fewer unintended results, a must for people who talk with their hands.
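Here's a hedged sketch of how that kind of tuning might look: a gesture is only accepted if it falls inside a configured detection range, angle, and speed window. The parameter names are invented for the example and are not a real SDK:

```python
# Hypothetical sensitivity tuning: reject motions outside a configured
# detection window so incidental hand movement doesn't trigger anything.

from dataclasses import dataclass

@dataclass
class GestureConfig:
    max_range_m: float = 0.15      # ignore hands farther than ~6 inches
    max_angle_deg: float = 60.0    # narrow the full 180-degree field
    min_speed_m_s: float = 0.2     # reject slow, incidental drift

def accept_gesture(distance_m: float, angle_deg: float, speed_m_s: float,
                   cfg: GestureConfig) -> bool:
    """Return True only for motions inside the tuned detection window."""
    return (distance_m <= cfg.max_range_m
            and abs(angle_deg) <= cfg.max_angle_deg
            and speed_m_s >= cfg.min_speed_m_s)

# A hand 10 cm away, 30 degrees off-axis, moving briskly: accepted.
print(accept_gesture(0.10, 30.0, 0.5, GestureConfig()))  # True
```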
Even with a more constrained field of motion, sound waves allow for more elaborate gestures. Spinning your hand up or down could control volume, for example, while spinning it to the side could trigger another action entirely.
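As an illustration of how recognized gestures might map to actions like these, here's a hypothetical dispatch table in Python; the gesture labels and handlers are invented for the example, not part of any real gesture API:

```python
# Illustrative gesture-to-action mapping in the spirit of the volume
# example above; every name here is hypothetical.

def volume_up():   print("volume +")
def volume_down(): print("volume -")
def open_camera(): print("launching camera")

ACTIONS = {
    "spin_up": volume_up,
    "spin_down": volume_down,
    "spin_side": open_camera,
}

def handle(gesture: str) -> None:
    # Unrecognized gestures fall through to a no-op.
    ACTIONS.get(gesture, lambda: None)()

handle("spin_up")  # volume +
```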
Another advantage? These integrated ultrasound speakers are low power, which means they’ll sip at your battery stores rather than gulp.
Elliptic Labs unveiled its demo smartphone and tablet at CEATEC in Japan this past October and says it is actively working with device-makers to incorporate the technology in 2014. Since the Samsung Galaxy S4 was the demo smartphone, the forthcoming flagship Samsung Galaxy S5 could very well be a debut device.