Gesture-Based Eye-Tracking, Hand-Sensing Tech Coming but What of Standards?

Microsoft certainly popularized gesture-based computing with the Xbox Kinect, but it seems one of the next areas where this sort of interface will have a big impact is the automobile. Hyundai, in fact, recently unveiled a concept car that allows the driver to turn virtual knobs and choose from a menu using just their eyes.

[Image: Hyundai eye control technology]

[Image: Hyundai virtual knob turning demonstration]

Other areas are ripe for gesture control as well – Samsung is using the technology in its televisions and Leap Motion is using it to control PCs and laptops.

The logical question to ask as we enter this brave new world of gesture-based computing is: what about standards? In other words, will I have to learn a different interface depending on the manufacturer? Or will a company like Apple come out with an interface so good that others copy it? That is what happened in the smartphone world with touch interfaces.

There is a big hole in the market right now, and if it isn't filled by a common set of gestures, we risk having to deal with different systems in different cars, then with yet another set of controls for our home appliances, and yet another for our smart mobile devices.

Google seems well positioned in this area since Android has become so ubiquitous. Samsung, too, has become a juggernaut in just about every market where a product has a plug, and could exert great influence here. Then there is, of course, Microsoft, and possibly, someday soon, Apple.

In fact, this sort of interface will likely be perfected in Cupertino before Apple launches its much-anticipated television. Whether it will arrive too late is a question worth pondering.
