From a prismatic head-up display (HUD) to waveguide displays to direct retinal projection, we look at how Google Glass works now and how it could work in the future.
The Google Glass Explorer Edition is now out in the wild, with its 2000 early adopters busily exploring the world of (and with) Google Glass. The basic idea of a HUD is well understood, but how exactly has Google made this display work? And how much more advanced could it become in the years ahead?
At the moment, the Google Glass display replicates a 25-inch screen seen from around 2.5m away. Google has been cagey about the resolution, but based on the recommendations given to app developers, it appears to be something along the lines of 640×360.
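To put those figures in perspective, here is a quick back-of-the-envelope sketch in Python of the field of view and pixel density such a virtual screen would imply. The 25-inch diagonal and 2.5m viewing distance come from Google's own description; the 16:9 aspect ratio is an assumption on our part, and the 640×360 resolution is the developer-guideline estimate mentioned above.

```python
import math

# Back-of-the-envelope figures for the virtual screen Google describes.
# The 16:9 aspect ratio and 640x360 resolution are assumptions/estimates.
DIAGONAL_IN = 25.0
DISTANCE_M = 2.5
ASPECT_W, ASPECT_H = 16, 9
RES_W, RES_H = 640, 360

# Convert the diagonal into a width and height in metres.
diag_m = DIAGONAL_IN * 0.0254
width_m = diag_m * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)
height_m = diag_m * ASPECT_H / math.hypot(ASPECT_W, ASPECT_H)

# Angular size of the virtual screen as seen by the wearer.
fov_h = 2 * math.degrees(math.atan(width_m / 2 / DISTANCE_M))
fov_v = 2 * math.degrees(math.atan(height_m / 2 / DISTANCE_M))

print(f"Horizontal field of view: {fov_h:.1f} degrees")  # ~12.6 degrees
print(f"Vertical field of view:   {fov_v:.1f} degrees")  # ~7.1 degrees
print(f"Pixels per degree:        {RES_W / fov_h:.0f}")  # ~51
```

That works out to roughly 50 pixels per degree, not far off the ~60 pixels per degree usually associated with 20/20 visual acuity, so text at this resolution should look reasonably crisp despite the modest pixel count.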
To make this display happen, Glass uses a projector and a prism. The prism sits on the right side of the headset, slightly above the eye. The projector beams the image into one side of the prism, and the image becomes visible on the perpendicular face.
To use the slightly arcane phrasing of Google’s patent application for Glass, it is a:
… prism of a transparent material configured to make an image projected into a side of the prism visible at a surface of the prism that is at a nonzero angle to the side of the prism.
This transparent prism is what creates the perception of the "hovering display": the sense that the image is overlaid onto the real world. Or, as the patent would have it, "viewable by a wearer of the device in conjunction with an external image viewable through the prism".
In many ways, this is quite a simple head-mounted display technology: just a projection bounced onto a viewing surface. Speaking to Mashable, Vuzix CEO Paul Travers referred to it as an "excellent first step". It's the next step that will take us toward a truly 21st-century solution.
Enter the Virtual Retinal Display, or VRD. With a VRD, the image is drawn directly onto your retina: multiple light sources, usually lasers or LEDs, paint a raster image onto the retina, much as an old cathode ray tube (CRT) display painted one onto a phosphor screen.
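As a rough illustration of what "raster" means here, the beam sweeps across the retina line by line while its brightness is modulated for each pixel. The sketch below is purely illustrative: set_mirror_angles and set_laser_intensity are hypothetical stand-ins for the scanning optics and light-source modulation of a real system.

```python
def draw_frame(frame, set_mirror_angles, set_laser_intensity):
    """Paint one frame as a raster: sweep row by row, left to right,
    modulating the beam's brightness at each pixel position."""
    rows, cols = len(frame), len(frame[0])
    for y in range(rows):                          # one horizontal sweep per scan line
        for x in range(cols):
            set_mirror_angles(x / cols, y / rows)  # steer the beam to this spot on the retina
            set_laser_intensity(frame[y][x])       # set brightness for this pixel
```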
A VRD can bypass a number of eyesight issues. People with corneal damage or defects in the eye's lens could see the display quite normally, as the image doesn't depend on those elements to reach the retina. For the same reason, the image wouldn't be washed out by ambient lighting.
Anthea Muir, senior optometrist at OPSM, thinks that VRD technology will end up competing with other types of head-mounted displays in the near future.
“I believe that one rival product or technology we’ll see will project directly on the retina. I can see the competition even becoming the modern version of the VHS vs. Beta wars, or even Blu-ray vs. HD DVD,” Muir told CNET Australia.
Google holds no long-term commitment to the display technology currently used in Google Glass. Its patent notes that "alternative embodiments" of Glass could have an LCD in the lens itself. Waveguide displays are also mentioned, in which light is fed into the edge of a lens and guided out toward the eye to form the image. Finally, direct retinal projection is listed:
Alternatively, or additionally, a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user’s eyes.
Google Glass might lead the way right now for head-mounted display computing, but there's clearly a lot of room for development before the technology itself becomes a standard. Until then, we can expect Google to evolve the Glass display in a variety of ways, and to see new competitors emerge using some of these alternative technologies to gain the upper hand.