The GBUI as seen in The Matrix
The GBUI as seen in Minority Report
I have also seen Microsoft's vision of the UI in their Redmond headquarters, and it involves lots of gestures which allow you to take applications and forward them on to others with simple hand movements. The demos included the concept of software understanding business processes and helping you work. So after reading a document, you could just push it off the side of your screen and the system would know to post it on an intranet and also send a link to a specific group of people.
While the intelligent software of the future is being developed, I happened across an API called Touchless which takes the most interesting part of Microsoft's vision and allows you to start experimenting with it. I call this the gesture-based user interface, or GBUI.
A demo of Touchless in action
Using a standard video camera, you run the application and hold up a marker of a unique color and/or shape, then select that marker in the software. It is worth noting the marker can be an actual marker you draw with, or anything else such as your nose, a cell phone, etc. You can also do this with two separate markers if you like. Once this is done, there are some demo applications you can experiment with. One is a Pong game where you can move the paddles up and down virtually as you move the markers. Then there is a map application which lets you flip, rotate and stretch a map by moving your markers. If you are trying this at home or work, note that the behavior of this application differs depending on the number of markers you select. Another application allows you to use the markers to draw.
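Touchless itself is a .NET library, but the core trick it performs, finding the marker in each camera frame by its color, is simple enough to sketch. The snippet below is a minimal, hypothetical illustration in Python with NumPy (not the Touchless API): it thresholds a frame against a target color and returns the centroid of the matching pixels, which is essentially the marker's tracked position.

```python
import numpy as np

def track_marker(frame, target_rgb, tolerance=40):
    """Return the (x, y) centroid of pixels close to target_rgb, or None.

    frame is an (H, W, 3) uint8 RGB image; tolerance is the maximum
    summed per-channel difference for a pixel to count as the marker.
    """
    diff = np.abs(frame.astype(int) - np.array(target_rgb)).sum(axis=2)
    ys, xs = np.nonzero(diff < tolerance)
    if xs.size == 0:
        return None  # marker not visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 black frame with a bright green "marker" blob
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (0, 255, 0)
print(track_marker(frame, (0, 255, 0)))  # centroid near (64.5, 44.5)
```

Run this per frame from a webcam feed and the returned centroid becomes your gesture input; tracking two markers (as the map demo does) just means running it once per target color.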
Tom Keating alerted me to the Touchless API and demo program and I had a lengthy conversation with him and TMCnet's Webmaster Robert Hashemian about the future of this sort of interface. As usual, I was most excited about its potential while Tom and Robert saw more limited roles for this technology. We obviously saw the Wii as an example of this tech in action and realized that entertainment and gaming would be great applications for this 3D interface.
We also figured that very large screens might lend themselves to such a technology - where you could quickly expand and move windows just by waving your hand. Tom also felt a small computer such as a cell phone could take advantage of such gestures. This reminded me of a patent Apple applied for which combines multi-touch, camera input, voice and force to understand the user's intent.
After using this interface for a while, something struck me. It hurts to keep your arms in the air for a long time. Is this a strengthening issue? Does it just take time to adjust to having your arms in front of you all day? I am unsure but what I do know is that this sort of interface would work well with the SearchMe search engine and the WindowShop 3D shopping experience from Amazon.
In addition, it is worth pointing out that you may need a few cameras in stereo to maximize accuracy, and you could theoretically use your hands as a mouse, meaning you could likely take advantage of all the functions of the GBUI while resting your hand on the desk in front of you for most of the day.
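The hands-as-mouse idea amounts to a coordinate mapping: take the marker's position in the camera frame and scale it to the desktop. Here is one hedged sketch of that mapping (the function name and parameters are my own, not part of any API); it mirrors the horizontal axis so moving your hand right moves the cursor right, since a webcam sees you reversed.

```python
def marker_to_cursor(mx, my, cam_w, cam_h, screen_w, screen_h):
    """Map a marker position in camera pixels to screen coordinates."""
    # Mirror horizontally: the camera faces the user, so raw x is flipped.
    x = (1.0 - mx / cam_w) * screen_w
    y = (my / cam_h) * screen_h
    return int(x), int(y)

# A marker at (160, 120) in a 640x480 frame, on a 1920x1080 desktop:
print(marker_to_cursor(160, 120, 640, 480, 1920, 1080))  # (1440, 270)
```

In practice you would also smooth the output (e.g. averaging the last few positions) so small tracking jitter does not make the cursor shake.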
So I am restraining my enthusiasm. I see the "cool" factor of this technology but I am not sure how common it will become. At some point we will see this stuff hit the OS, and when that happens, the consumer can decide whether the mouse and keyboard will rule the future or the GBUI will be the killer tech of the next decade.