Who Needs to Actually Touch a Screen When You Can Just Point at It?

By Sam Gibbs

This may sound a little far-fetched, but a gesture research company called Eyesight is offering up a purely software-based solution for tracking your delicate little digits. It uses your device's existing camera to track the movement of your hands, and specifically, your fingertips. It's like Kinect condensed down, and supposedly sensitive enough to monitor a single finger from up to 5 metres away.

At the moment, it's in the prototype stage, with demos showing off control of a cursor through the movement of individual phalanges. Apparently it's even able to recognise and differentiate between people in a room, and only track the commander's fingers -- lest you get into a TV remote war of finger pointing.

Microsoft's been working on something similar for a while, with Kinect being part of that effort. Eyesight's promising to enable all this through just a software layer on top of your existing hardware though -- no fancy stereoscopic camera needed. Whether it'll work in the real world, I'm not sure. It would have to be a pretty good camera to track you and your grubby mitts against a busy background, surely.

I guess the real question is, do people really want to be able to point and wave at screens to control them? I mean, what if you wave at a friend and the TV takes that as a command to send the volume through the roof? Or better yet, what if you're driving and some knob-jockey cuts you up, so you flip them the bird (totally understandable), which makes your sat nav cut out just as you're trying to navigate Spaghetti Junction?

It's apparently pre-loaded on the Lenovo IdeaPad Yoga, if you want to try it out for yourself, but I think it might end up being like the eye-tracking tech I tried out at IFA last year. It's cool in theory, not so great in practice. But who knows, it might hit a phone or tablet near you soon, and it might be wonderful, especially if you can turn it off.