There are all sorts of companies working on glasses that they believe will either replace or augment the smartphone. The question is, exactly how do you type on such a device? Google's answer: project a keyboard onto your hand and type on it using your other hand. That's according to a recent patent application the search company filed which utilizes a laser.
Now I don’t mean to get all geeky on you or anything, but how Borg is this? Gene Roddenberry had it nailed with so many ideas – the pocket-sized communicator, speech recognition, universal translators, tablets – and now this: we’ll all have laser lights beaming from our heads.
Here is the exact wording of the patent application so you can see for yourself how serious the company is:
 A projection keyboard is a virtual keyboard that can be projected onto a surface and components of the keyboard detect finger movements and translate the movements into keystrokes on a device. A projection keyboard unit generally includes a laser to project a visible virtual keyboard onto a surface (e.g., a red diode laser as a light source to project a full size QWERTY layout keyboard, with a size of 295 mm × 95 mm projected at a distance of 60 mm from the projection keyboard unit), and a sensor or camera to sense finger movements. A location or detected co-ordinates of the finger can be used to determine actions or characters to be generated.
 A projection keyboard may also use a second (invisible infrared) beam projected above the virtual keyboard. In this example, as a finger makes a keystroke on the virtual keyboard, the finger breaks the infrared beam and infrared light is reflected back to a camera. Reflected infrared beam may pass through an infrared filter to the camera, and the camera can photograph an angle of incoming infrared light. A sensor may determine where the infrared beam was broken, and detected coordinates can be used to determine actions or characters to be generated.
 A projection keyboard may include use of a micro-controller to receive positional information corresponding to reflected light or light flashes from the sensor, and to interpret events to be communicated through an appropriate interface to external devices. Events may include a key stroke, mouse movement or touchpad control.
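To make the idea concrete, here is a minimal sketch of the coordinate-to-keystroke step the patent describes: once the sensor reports where the finger landed, the unit maps that position to a key in the projected layout. The keyboard dimensions come from the patent; the simplified three-row layout and the function itself are illustrative assumptions, not Google's actual implementation.

```python
# Projected keyboard size cited in the patent application: 295 mm x 95 mm.
KEYBOARD_WIDTH_MM = 295
KEYBOARD_HEIGHT_MM = 95

# Simplified 3-row layout for illustration; a real unit would project
# a full-size QWERTY keyboard with modifier and space keys.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def coordinate_to_key(x_mm, y_mm):
    """Translate a detected fingertip coordinate into a keystroke.

    Returns the character at that position, or None if the finger
    is outside the projected keyboard area.
    """
    if not (0 <= x_mm < KEYBOARD_WIDTH_MM and 0 <= y_mm < KEYBOARD_HEIGHT_MM):
        return None  # finger missed the projected keyboard entirely
    row_index = int(y_mm / (KEYBOARD_HEIGHT_MM / len(ROWS)))
    row = ROWS[row_index]
    col_index = int(x_mm / (KEYBOARD_WIDTH_MM / len(row)))
    return row[col_index]
```

So a tap near the top-left of the projection resolves to "q", a tap near the bottom-right to "m", and anything outside the projected rectangle is ignored.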
Notice that Google has even identified the precise size of the keyboard – it looks like they likely have this thing working in their lab already.
Here is another important part of the patent application:
 When the user moves a finger 304 (e.g., an action finger) through the laser pattern of objects 300, the camera 208 may photograph a discontinuity of laser line curved around the action finger 304. A change in the laser line can be used to determine a location or an approximate location of the action finger 304. For known laser patterns, e.g., a set of 10 icons displayed as buttons, a processor of the on-board computer of the system 200 can detect which button in the laser pattern of images 300 is distorted by the action finger 304 to determine which button is being selected. After determining which of the objects has been selected, the on-board computer of the system 200 may direct the laser projector to alter the selected object, such as by changing a color of the object, for example.
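The button-selection logic above can be sketched roughly as follows: the camera flags pixels where the expected laser line is broken, and the button whose region contains the discontinuity is the one being "pressed." The data structures and function here are my own illustrative assumptions, not the patent's code.

```python
def find_selected_button(buttons, broken_pixels):
    """Determine which projected button an action finger is pressing.

    buttons: dict mapping a button name to its (x_min, y_min, x_max, y_max)
             region within the camera image of the laser pattern.
    broken_pixels: list of (x, y) points where the expected laser line
             was not observed (i.e., the line is distorted by the finger).
    Returns the name of the button containing the most discontinuity,
    or None if the laser pattern is unbroken.
    """
    best_name, best_count = None, 0
    for name, (x0, y0, x1, y1) in buttons.items():
        # Count how many broken-line pixels fall inside this button.
        count = sum(1 for (x, y) in broken_pixels
                    if x0 <= x <= x1 and y0 <= y <= y1)
        if count > best_count:
            best_name, best_count = name, count
    return best_name
```

Once a button is identified this way, the system could redraw it in a different color, as the patent suggests.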
As you might expect – Google has figured out how to “read” the “pressing” of a virtual button via a discontinuity in a photograph of the keyboard.
Here is another interesting tidbit:
 When the hand is not intercepting a laser pattern, and the pattern is instead projected on the floor for example, the red/green/blue values of the laser spots tend toward average red/green/blue values of the pixels in a camera image. In the instance in which the surface is a hand, skin reflects a laser pattern relatively well (in some sense, skin resembles a Lambertian surface, where incoming light is reflected evenly in all directions). However, other surfaces, such as floors, tend to be darker (e.g., office carpet) or specular and offer other angles to reflect the light (e.g., polished tile floors). In addition, surfaces other than a user's hand may be more distant from the camera, and thus, light in the pattern may not reflect as well. Thus, when the hand is not intercepting the laser pattern, pixels corresponding to spots in a projected pattern for these more distant surfaces may tend more toward the average red/green/blue value in the image.
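In other words, laser spots reflecting off nearby skin stay bright and distinct, while spots landing on a distant or dark floor drift toward the image's average color. A toy version of that heuristic, with a made-up threshold value, might look like this:

```python
def spots_on_hand(spot_rgbs, image_avg_rgb, threshold=60):
    """Guess whether the laser pattern is hitting a nearby hand.

    spot_rgbs: list of (r, g, b) values sampled at the laser spots.
    image_avg_rgb: the average (r, g, b) of the whole camera image.
    threshold: illustrative cutoff; a real system would calibrate this.

    Returns True when the spots stand out from the image average
    (skin reflecting the laser well), False when they blend in
    (pattern falling on a distant floor or specular surface).
    """
    def channel_distance(a, b):
        # Mean per-channel difference between two RGB values.
        return sum(abs(x - y) for x, y in zip(a, b)) / 3

    mean_distance = sum(channel_distance(s, image_avg_rgb)
                        for s in spot_rgbs) / len(spot_rgbs)
    return mean_distance > threshold
```

Bright red laser spots against a neutral scene clear the threshold; washed-out spots near the image average do not.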
So if you thought typing on glass was the ultimate advance in technology, and perhaps that Tactus virtual button technology would be the next generation of data input, I guess you'd be wrong. It seems typing on your hand may be the future – and in some cases, even the floor. All I can say is “fascinating.” More.