This Neural Net Describes the City It Sees in Real Time

By Jamie Condliffe

Take one neural network that describes what it sees in an image. Provide it with a webcam feed from the MacBook it’s running on. Then, wander around a city and see what happens. Here are the results of exactly that experiment.

That is, more or less, what Kyle McDonald did to create this video. Using Andrej Karpathy’s “NeuralTalk” code, modified to run on a live webcam feed, he wandered around the bridge at Damstraat and Oudezijds Voorburgwal in Amsterdam. The footage is a little shaky and weird, since McDonald is simply walking around pointing a laptop at people, but the results are interesting.
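For a sense of how the pieces fit together, here’s a minimal Python sketch of that frame-grab-and-caption loop. It uses OpenCV for the webcam, and the caption_image() function is a hypothetical stand-in for NeuralTalk’s captioning model (the real thing runs a convolutional net plus a recurrent net over each frame); treat it as an illustration, not McDonald’s actual code.

    import cv2  # OpenCV, for webcam capture and display

    def caption_image(frame):
        # Hypothetical stand-in for NeuralTalk's model, which generates
        # a sentence describing the image. Swap in a real captioner here.
        return "a placeholder caption"

    cap = cv2.VideoCapture(0)  # open the laptop's built-in webcam
    try:
        while True:
            ok, frame = cap.read()  # grab the latest frame
            if not ok:
                break
            caption = caption_image(frame)
            # Overlay the generated sentence on the live video
            cv2.putText(frame, caption, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
            cv2.imshow("live captions", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

Since the captioner is the slow part of a setup like this, a real-time version would likely run it on every few frames, or in a background thread, rather than on every frame.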

Perhaps unsurprisingly, it doesn’t always get it right. For instance, it takes a while for the software to decide that the man in shot is eating a hot dog rather than holding a mobile phone, and it occasionally sees things that no sane human would. But the fact that a chunk of code can work out, from pixel colour and brightness alone, what’s being shown in an image, in real time, is frankly amazing.

If you’re interested in learning more about how neural networks are increasingly becoming part of your everyday online experience, why not read our explainer? [Kyle McDonald via Charles Arthur via Verge]
