The flight deck on an aircraft carrier is like a perfectly choreographed ballet. And to ensure that unmanned autonomous aircraft fit right in, researchers at MIT are developing a system that will let drones recognise and follow gestures from the flight crew.
The hardware and software developed by MIT's Yale Song, Randall Davis, and David Demirdjian works a bit like Microsoft's Kinect—but on a far more advanced level. Your Kinect might be able to recognise when you're heading a football, but I doubt the military would entrust it with landing their next-generation bombing and reconnaissance aircraft.
Not only does their image recognition software have to determine the position and shape of the flight crew's arms, but it also needs to discern gestures made with their hands and fingers. And to make the challenge even more complicated, the carrier's flight deck is always a flurry of activity, so the person making the gestures will almost always be in motion. So far the system has been trained to recognise 24 different gestures, with an accuracy of about 76 percent.
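To give a rough sense of the kind of problem the software solves, here is a toy sketch of gesture classification by matching a window of observed body-pose features against stored templates. The gesture names, feature vectors, and nearest-template approach are all illustrative assumptions; the actual MIT system is trained on labelled video and uses far more sophisticated sequence models.

```python
import math

# Hypothetical gesture templates: each maps a gesture name to a short
# sequence of pose feature vectors (here: two arm angles and a
# hand-open flag). Purely illustrative, not the researchers' data.
TEMPLATES = {
    "come_ahead": [(0.0, 1.0, 1), (0.5, 1.0, 1), (1.0, 1.0, 1)],
    "stop":       [(1.5, 1.5, 0), (1.5, 1.5, 0), (1.5, 1.5, 0)],
}

def distance(seq_a, seq_b):
    """Sum of Euclidean distances between aligned pose vectors."""
    return sum(math.dist(a, b) for a, b in zip(seq_a, seq_b))

def classify(window):
    """Return the template gesture closest to the observed window."""
    return min(TEMPLATES, key=lambda name: distance(window, TEMPLATES[name]))

# A noisy observed sequence close to the "come_ahead" template.
observed = [(0.1, 1.0, 1), (0.6, 0.9, 1), (1.0, 1.1, 1)]
print(classify(observed))  # prints "come_ahead"
```

A real deck-handling system has to do this continuously on a moving, partially occluded person, which is why even 76 percent accuracy is a genuine achievement.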
And while an accuracy of 76 percent is impressive in the lab, in real-world applications there's no room for error when multi-million-pound aircraft are trying to land on a multi-billion-dollar boat. But the researchers feel there's plenty of room for improvement as their system continues to train and its accuracy climbs. [MIT news via Popular Science]
Photo: Associated Press/Richard Vogel