
This App for Automatically Hiding Your Nude Pics Could Use An Anatomy Lesson

By Melanie Ehrenkranz

There’s a new app that uses machine learning to mine your camera roll for explicit images. The purpose of the app, Nude, is to file these sensitive photos into a for-your-eyes-only folder, and then help you wipe the pics from your camera roll and iCloud. But algorithms are still not very good at picking up on the nuances of human behaviour—and after letting Nude trawl through my camera roll, I’m not convinced this algorithm has ever seen a naked body.

Photo: Nude App/Melanie Ehrenkranz

Full disclosure: I sent this algorithm on a fool’s errand. I don’t have any overtly explicit photos of myself or of past or current partners saved in my camera roll. But when I granted the app access to my photos and watched as it somewhat slowly trawled through my device, I noticed a few images being added to my nude folder. Like these ones.

Photo: Nude App/Melanie Ehrenkranz

It took a little over half an hour for the algorithm to sift through my nearly 2,000 photos. A handful of the images arguably leaned a bit sexual—like a shirtless actor, a topless woman (whose boobs are completely covered by her arms) sucking on a strawberry, and a screenshot of an Instagram explore video of what appears to be a boner beneath some dude’s swimming costume.

Photo: Instagram Explore Screenshot/Melanie Ehrenkranz

But there were also plenty of images that even the most prudish bot wouldn’t assume needed to be filed away in a secret vault. Like a screenshot of Grace Kelly from To Catch a Thief. Or some good doggos. Or a screenshot of a tweet from Michelle Obama. Or a fully-clothed, adorable photo of me as a toddler. Or a tiled image of nine photos of Rami Malek at the beach. Or a doughnut.

The app’s 21-year-old creators, Jessica Chiu and Y.C. Chen, told The Verge that while they trained their algorithm on sites like Pornhub, collecting 30 million images, the service isn’t perfect just yet. We reached out to Nude to ask why some of the aforementioned images would be flagged as sensitive content.

“When it comes to the sensitivity of nude detection, we tried to play it safe,” said Chen in an email to Gizmodo. “There will always be some borderline false positive, and we are leaning towards catch-them-all rather than failing to detect some sensitive content. With that being said, we do recommend all our users update their iPhone to iOS11 before installing our app. CoreML has proven to be the most accurate when running our ML model, unfortunately Apple makes it so that CoreML would only work on iOS 11.” We tested the app on an iPhone 6s running iOS 11.
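Chen’s “catch-them-all” framing is the classic precision–recall tradeoff: lower the classifier’s decision threshold and you flag every genuinely sensitive photo, at the cost of also flagging doughnuts. A minimal illustrative sketch (not Nude’s actual code; the scores and labels below are invented example data):

```python
# Illustrative sketch of the precision/recall tradeoff behind a
# "catch-them-all" nudity filter. This is NOT Nude's implementation;
# the scores and labels are made-up example data.

def precision_recall(scored_photos, threshold):
    """Flag each photo as sensitive when its score >= threshold,
    then compute precision and recall against the true labels."""
    tp = fp = fn = 0
    for score, is_sensitive in scored_photos:
        flagged = score >= threshold
        if flagged and is_sensitive:
            tp += 1          # correctly caught
        elif flagged and not is_sensitive:
            fp += 1          # false positive (the doughnut)
        elif not flagged and is_sensitive:
            fn += 1          # missed sensitive photo
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    recall = tp / (tp + fn) if (tp + fn) else 1.0
    return precision, recall

# (score from the model, whether the photo is actually sensitive)
photos = [(0.95, True), (0.80, True), (0.60, False),
          (0.40, True), (0.20, False)]

# Strict threshold: no false positives, but one sensitive photo slips through.
print(precision_recall(photos, 0.7))

# Lenient "catch-them-all" threshold: every sensitive photo is caught,
# but an innocent image gets filed away too.
print(precision_recall(photos, 0.3))
```

With the strict threshold, precision is perfect but recall drops; with the lenient one, recall is perfect but precision falls—which is exactly the direction Chen says Nude leans.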

There is certainly a need for both a safe place to store sensitive content as well as a streamlined way for people to delete that content. But services like Nude miss the point if they fail to understand that a tweet from the former First Lady doesn’t belong in a folder for dick pics.

