Lumus smartglasses to get EyeSight gesture recognition

BARCELONA, Spain — One of the difficulties with wearable computing is controlling devices that don’t have a handy keyboard or touch screen attached. And that’s how gesture-control company EyeSight Mobile won a place in Lumus’ smartglasses.

With the technology, a person can hold out a finger to tap icons or swipe away notifications in the virtual view the Lumus glasses present. Later, EyeSight plans to add the ability to drag items around the display, too.

“You can actually touch the icons in the air with your fingers,” EyeSight Chief Executive Gideon Shmuel told CNET.

The companies revealed the partnership here at the Mobile World Congress. EyeSight expects its gesture recognition software will ship with the glasses, Shmuel said.

EyeSight CEO Gideon Shmuel at Mobile World Congress 2014.
Stephen Shankland/CNET

The battery-powered Lumus glasses mount a transparent 640×480 display onto the lens; the wearer sees information overlaid on the real world, and head tracking changes what’s shown according to the wearer’s orientation.

The glasses have a camera, an OMAP 4 processor, and Android 4.1.2. That’s enough horsepower to run EyeSight’s gesture recognition software, which is smart enough to recognize fingers and hands even against a cluttered, moving backdrop.

Ari Grobman, Lumus’ director of business development, said the glasses are scheduled to ship later this year.

With the EyeSight software, a person wearing the Lumus glasses could do things like browse Facebook, play games, or control navigation instructions shown in a head-up display.

Ari Grobman, Lumus’ director of business development, wearing his company’s smartglasses at Mobile World Congress.
Stephen Shankland/CNET

Shmuel said EyeSight has other wearable-computing partnerships as well, but declined to detail them at this stage.

People might feel a bit silly waving their hands around in front of their glasses, but it’s possible to add a second, downward-pointing camera that would let people control the system with a hand held more discreetly at waist level, Shmuel said. And gesture control is better than voice control, which fails in noisy environments, or touch control, which is mostly limited to taps and linear swipes, he argued.

Of course, Google might disagree; its Google Glass computerized eyewear uses voice and touch controls. But it’s pretty clear that whatever the future of wearable computing holds, the interfaces used to control these devices have yet to settle down into something as ordinary as PC mice and phone touchscreens are today.

Ari Grobman, Lumus’ director of business development, showing his smartglasses’ EyeSight gesture-control interface.
Stephen Shankland/CNET
