Yesterday at the annual Research Faculty Summit, Microsoft showed off a pretty impressive advancement in its AI tech. A project, titled Project Adam, is designed to identify objects in its surroundings, much like Amazon's Fire Phone but without the shopping hooks. The app is still in development but shows promising results.
Microsoft says Adam has been meticulously calibrated by researchers to mimic a human brain, creating a high-performance computer that builds and stores data in a large-scale distributed system that works like our neural processes. Trishul Chilimbi and his team have been developing the app's neural network:
Recent research... focuses on Project Adam and its object classification, culling a massive dataset of 14 million images from the Web and sites such as Flickr, made up of more than 22,000 categories drawn from user-generated tags.
Using 30 times fewer machines than other systems, the team used that data to train a neural network made up of more than two billion connections. This scalable infrastructure is twice as accurate in its object recognition and 50 times faster than other systems.
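At its core, that kind of object classification boils down to scoring an image's features against each candidate category and picking the most likely one. Here's a minimal, purely illustrative sketch of that idea in Python; the feature values, weights and breed labels are made up for demonstration, and bear no resemblance to Adam's actual distributed architecture:

```python
import math

def softmax(scores):
    """Convert raw category scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights, labels):
    """Score the feature vector against each category's weights, return best label."""
    scores = [sum(f * w for f, w in zip(features, row)) for row in weights]
    probs = softmax(scores)
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]

# Hypothetical 3-value feature vector for a dog photo,
# and hypothetical per-breed weight rows.
features = [0.9, 0.1, 0.4]
weights = [
    [2.0, 0.5, 1.0],   # "corgi"
    [0.2, 1.8, 0.3],   # "husky"
    [0.5, 0.4, 1.9],   # "beagle"
]
labels = ["corgi", "husky", "beagle"]

breed, confidence = classify(features, weights, labels)
print(breed)  # → corgi
```

A real system like Adam learns billions of such weights from labelled images rather than hard-coding them, and stacks many layers of this scoring, but the classify-by-highest-score step is the same.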
The app's demonstration included identifying the breeds of three dogs in front of a live audience. Cortana's partner in crime, Adam, correctly analysed and named each dog's specific breed just by looking at it.
While still in its infancy and far from commercial release, Adam looks like it has serious potential. Microsoft researchers are proposing useful applications of Adam such as ascertaining a meal's entire nutritional content from a single photo, correctly diagnosing a skin condition, or identifying edible plants in the wild. Or doing an impression of an Amazon Fire Phone. [Microsoft Next]