We humans get along pretty well seeing colors ranging from blue to red. But a startup is betting that when it’s time to endow electronic devices with vision, infrared light will find a place too.
Infrared light — just beyond red in the color spectrum but invisible to humans — is already used in some devices, like TV remote controls and the motion sensor for Microsoft’s Kinect video game technology. InVisage plans to sell a new image sensor for infrared cameras that the company believes will dramatically improve computer vision.
Infrared sight could revolutionize the way devices interact with us and their surroundings. Drones and self-driving cars could navigate more precisely. Phones could unlock more reliably by scanning our irises when we glance at them. And when we're immersed in the computer-generated worlds of virtual reality, headsets could keep better track of our hands and arms.
We used wires and punched cards to tell the earliest computers what to do. Keyboards, mice, and touch screens came next, making it possible for ordinary people to use computers. But for the next wave of computing, where devices understand their surroundings and chart their own course through the world, they’ll have to rely on their own senses, including infrared vision.
“A drone doesn’t know how to avoid a tree or power line,” said Jess Lee, chief executive of InVisage. “They can’t do that without some sort of computer vision system.”
The Menlo Park, California-based startup just began selling its first product, the Quantum13 image sensor, which will bring the company's "quantum film" technology to conventional phone cameras starting in the second quarter of 2016. The infrared-sensitive SparkP2 sensor uses the same technological foundation: tiny particles called quantum dots deposited onto an image-sensor chip. In the SparkP2's case, though, the quantum dots are sensitive to infrared light instead of visible light. The company will show off prototypes later this month at the Mobile World Congress tech show, but the sensor should appear in products later this year, Lee said.
Infrared sensors can help computers see without bright light sources that distract or annoy humans. An infrared light source, such as a laser or LED, can bathe a scene in light humans can't perceive. And outdoors, the particular frequency of infrared light InVisage uses is less susceptible to being drowned out by sunlight. The company faces big competitors, including Toshiba, Sony and OmniVision, but it believes it can get ahead with sensors that are smaller and more sensitive.
The devices most likely to need large quantities of infrared image sensors are mobile phones, said IHS analyst Brian O’Rourke. A phone can unlock itself automatically when it recognizes patterns in its owner’s iris, the region around the pupil, but today the technology is limited because you have to get your eye in just the right place, Lee said. InVisage’s technology enables higher-resolution video so a phone can capture sufficient detail in a much wider field of view.
Infrared cameras could also help with home security, devices that monitor elderly people to make sure they're safe, and gesture recognition, said Gartner analyst Dean Freeman.
Gesture recognition could be a hot area if virtual reality catches on. When people are immersed in a computer-generated 3D realm, it’s tough to use a keyboard, but an infrared sensor can track hands and fingers so people can control what’s going on. To do that, or to guide a car or drone, a computer needs to understand how things work in three dimensions.
3D sensing technology takes heavy-duty processing power, which quickly drains the batteries of drones and VR headsets. But Lee thinks his chip's high sensitivity will make 3D sensing practical on those devices. "We're uniquely positioned to do this with the SparkP2 sensor," he said.