Two university students have won a $10,000 prize after coming up with a clever wearable that will turn sign language motions into spoken language.
According to Phys.org, Navid Azodi and Thomas Pryor came up with a pair of gloves that record hand position and movement and transmit the data wirelessly over Bluetooth to a computer, where algorithms work out what is being said. It's similar to how Siri sends your voice over the internet for Apple's servers to interpret.
The system then recognises hand gestures by running the data through sequential statistical regressions to work out which words the movements are most similar to, in much the same way a neural network classifies its input. If the system finds a match, it reads the word aloud.
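The students' actual code isn't public, so as a rough illustration of the matching idea, here's a minimal sketch in Python: each known gesture is a stored feature vector of (made-up) sensor readings, and an incoming glove reading is matched to the statistically closest template, with a threshold so nonsense input produces no word. All names, numbers, and the distance metric here are assumptions, not details from the project.

```python
# Hypothetical sketch of gesture-to-word matching. Templates and
# readings are invented feature vectors (e.g. flex-sensor values),
# not data from the SignAloud gloves.
import math

TEMPLATES = {
    "hello":  [0.9, 0.8, 0.7, 0.1, 0.2],
    "thanks": [0.2, 0.3, 0.9, 0.8, 0.1],
    "yes":    [0.1, 0.1, 0.2, 0.9, 0.9],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognise(reading, threshold=0.5):
    """Return the closest template's word, or None if nothing is near enough."""
    word, best = min(
        ((w, distance(reading, t)) for w, t in TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return word if best <= threshold else None

# A reading close to the "hello" template matches it;
# an ambiguous mid-range reading matches nothing.
print(recognise([0.85, 0.75, 0.7, 0.15, 0.2]))  # → hello
print(recognise([0.5, 0.5, 0.5, 0.5, 0.5]))     # → None
```

A real system would match sequences of readings over time rather than single frames, which is where the "sequential" part of the regressions comes in.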
"Many of the sign language translation devices already out there are not practical for everyday use. Some use video input, while others have sensors that cover the user's entire arm or body", Pryor is quoted as saying.
"Our gloves are lightweight, compact and worn on the hands, but ergonomic enough to use as an everyday accessory, similar to hearing aids or contact lenses", he added.
Amazingly, the students don't even appear to be specialists in the field. Azodi is studying business administration (though he previously interned at NASA), and Pryor is studying aeronautics and astronautics engineering (so we're guessing he did the coding).
It'll be interesting to see if the idea is picked up commercially: not only could it be a useful tool for deaf people, but it could also have much wider applications in gesture recognition and in helping people with other conditions communicate. [Phys.org via our pals at TechRadar]