Facebook Still Thinks Uploading Naked Pictures of Yourself is the Best Way to Stop Revenge Porn

By Tom Pritchard

Back in October, Facebook announced its plans to tackle the problem of revenge porn being spread on its platform. Those plans involved asking users to upload naked pictures of themselves, so the platform could assign them a digital fingerprint and curb any attempts to spread the pictures around. That story is back in the news, because the system is now going to be trialled in the UK, US, Australia, and Canada.

The point is that once a photo has been uploaded and fingerprinted, it should theoretically be impossible for anyone to share that particular image on Facebook, Instagram, or Messenger. If someone can't upload an image to any of those three platforms, they can't use them to spread it around and cause the subject any sort of embarrassment or distress. Now that Facebook is expanding its initial trial, it's revealed the process people will need to go through if they want to stop photos from potentially being shared:

People who worry that someone might want to harm them by sharing an intimate image can proactively upload it so we can block anyone else from sharing it on Facebook, Instagram, or Messenger:

  • Anyone who fears an intimate image of them may be shared publicly can contact one of our partners to submit a form
  • After submitting the form, the victim receives an email containing a secure, one-time upload link
  • The victim can use the link to upload images they fear will be shared
  • One of a handful of specifically trained members of our Community Operations Safety Team will review the report and create a unique fingerprint, or hash, that allows us to identify future uploads of the images without keeping copies of them on our servers
  • Once we create these hashes, we notify the victim via email and delete the images from our servers – no later than seven days
  • We store the hashes so any time someone tries to upload an image with the same fingerprint, we can block it from appearing on Facebook, Instagram or Messenger

This is one step to help people who fear an intimate image will be shared without their consent. We look forward to learning from this pilot and further improving our tools for people in devastating situations like these.

That said, Facebook has backtracked on one particular aspect of the system, specifically the promise that those images would be scanned by an AI and not seen by human eyes. Now the company says that the images will be reviewed by a "specially trained representative" who is an actual human being. The photo itself doesn't need to be saved once the hash has been created, so there shouldn't be any sort of hack that exposes a hidden database of intimate photos. Hopefully there are measures in place to prevent the reviewer from saving copies for themselves. You know damn well someone will try, special training or not.

It's a weird system, asking people to deliberately expose themselves to prevent further exposure, but I imagine it's better than the possible alternatives. It still doesn't stop terrible people from sharing those pictures on other sites, well known or not, but that's hardly something Facebook can do anything about. [Facebook via Buzzfeed News]