BARCELONA, Spain — One of the difficulties with wearable computing is that it can be hard to control devices that don’t have a handy keyboard or touch screen attached. And that’s how gesture control company EyeSight Mobile won a place in Lumus’ smartglasses.
With the technology, a person can hold out a finger to tap on icons or swipe away notifications in the virtual view the Lumus glasses present. Later, EyeSight plans to add the ability to drag items around the display, too.
“You can actually touch the icons in the air with your fingers,” EyeSight Chief Executive Gideon Shmuel told CNET.
The companies revealed the partnership here at the Mobile World Congress. EyeSight expects its gesture recognition software will ship with the glasses, Shmuel said.
The battery-powered Lumus glasses mount a transparent 640×480 display onto the lens; the wearer sees information overlaid on the real world, and built-in head tracking changes what's shown according to the wearer's orientation.
The glasses have a camera and an OMAP 4 processor, and they run Android 4.1.2. That's enough horsepower for EyeSight's gesture recognition software, which is smart enough to recognize fingers and hands even against a cluttered, moving backdrop.
Ari Grobman, Lumus’ director of business development, said the glasses are scheduled to ship later this year.
With the EyeSight software, a person wearing the Lumus glasses could do things like browse Facebook, play games, or control navigation instructions shown in a head-up display.
Shmuel said EyeSight has other wearable-computing partnerships as well, but declined to detail them at this stage.
People might feel a bit silly waving their hands around in front of their glasses, but it's possible to add a second, downward-pointing camera that would let people control the system with a hand held more discreetly at waist level, Shmuel said. Gesture control also beats voice control, which fails in noisy environments, and touch control, which is mostly limited to taps and linear swipes, he argued.
Of course, Google might disagree; its Google Glass computerized eyewear uses voice and touch controls. But it's pretty clear that whatever the future of wearable computing looks like, the interfaces used to control these devices have yet to settle into something as ordinary as PC mice and phone touchscreens are today.