When Facebook expanded its pilot program to block revenge porn from being posted on its platforms last month, the blowback was swift. The program, which asks users to send in nude or partially nude images so they can be hashed and blocked, was called “creepy” and “controversial.”
But Facebook’s approach builds on the way the tech industry fights the spread of child exploitation images, and it has the potential to protect millions of people who are victims of revenge porn (or nonconsensual intimate imagery). Facebook doesn’t have a great track record on fighting revenge porn—it’s the platform where communities dedicated to nonconsensually sharing nude images, like Marines United, thrived—so it’s especially important for Facebook to step up on this issue.
The problem isn’t that Facebook is trying to accumulate your nudes. The problem is that Facebook has launched a somewhat complex technical solution to a nuanced problem at a time when user trust has been eroded by data mishandling scandals. The thinking goes: If Facebook can’t resist sharing your data and your friends’ data with everyone from Cambridge Analytica to device manufacturers like Samsung, then how can it be trusted to keep your intimate images safe?
But the pilot program Facebook has created to combat revenge porn is a smart, secure tool that allows potential victims to leverage Facebook’s massive scale against their abusers, instead of the other way around. A tool like this is long overdue, and it shouldn’t be delayed because Facebook is undergoing a broader reckoning over its impact on democracy.
“We know people have lost their jobs. They can become severely isolated, disconnected from their families,” Antigone Davis, Facebook’s global head of safety, explains of the consequences victims of nonconsensual image sharing can face. The question confronting her team, she says, was: “Is there a way to get in front of the issue and try to prevent the initial share?”
Facebook’s answer to the revenge porn problem is to hash user-submitted images and then block images that match those hashes from being uploaded to Facebook, Instagram, or Messenger. Microsoft developed a similar approach, branded as PhotoDNA, that is widely used to block the spread of child exploitation images online and has more recently been used to combat the spread of terrorist recruitment imagery.
Here’s how it works: First, people who are worried that an intimate image of them will be shared contact one of Facebook’s partner organisations. So far, Facebook has rolled out the tool in Australia, the United States, the United Kingdom, and Canada, and it has partners in each country: the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence in the U.S., the Revenge Porn Helpline in the U.K., and the YWCA in Canada.
Facebook wants victims to be in touch with these organisations so they can get help with more than just blocking their images from being shared—these organisations can also help people with legal questions and refer them to other resources. “We have the broader context to talk to them about what might be happening,” says Erica Olsen, the director of NNEDV’s Safety Net Project. “In many cases, when someone is threatening to distribute content, that person may very well have been abusive in other ways.”
A user will fill out a form with the partner organisation that includes their email address, and then Facebook will email them a single-use, encrypted link where they can upload the image or images.
The content is then reviewed by a team of five Facebook employees, transformed into a hash—a fingerprint that represents the image but can’t be reversed to reveal the image itself—and deleted within seven days. Facebook stores the hashes in a database, preventing images that match the hashes from being uploaded to its platforms.
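To make the hash-and-block idea concrete, here is a minimal sketch in Python. Facebook hasn’t published its matching algorithm, so this stands in a simple “average hash”—a common perceptual-hashing technique—along with a hypothetical `HashBlocklist` class; the names, threshold, and pixel grids are illustrative assumptions, not Facebook’s actual system.

```python
# Hypothetical sketch of hash-then-block matching. The "average hash"
# here is a stand-in for Facebook's undisclosed matcher; names and
# threshold values are illustrative assumptions.

def average_hash(pixels):
    """Reduce a grayscale pixel grid to a bit tuple: each bit records
    whether that pixel is brighter than the image's mean brightness.
    The original image cannot be reconstructed from these bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

class HashBlocklist:
    """Stores only hashes, so the submitted image itself can be deleted."""
    def __init__(self, threshold=2):
        self.hashes = []
        self.threshold = threshold  # tolerated bit differences

    def add(self, pixels):
        self.hashes.append(average_hash(pixels))

    def is_blocked(self, pixels):
        h = average_hash(pixels)
        return any(hamming(h, known) <= self.threshold
                   for known in self.hashes)

# Usage: a reported image is hashed, then a near-identical re-upload
# (one pixel slightly brightened) still matches and is blocked.
reported = [[10, 200], [30, 180]]
blocklist = HashBlocklist()
blocklist.add(reported)

near_copy = [[12, 200], [30, 180]]
print(blocklist.is_blocked(near_copy))  # True
```

The design point this illustrates is the one Facebook emphasizes: once the hash is stored, the database holds only an irreversible fingerprint, and uploads are compared against it rather than against any retained copy of the image.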
Directing victims to a single-use, encrypted link prevents unnecessary copies of the image from being created—by using this method, Facebook is discouraging potential victims from sending their images via email or another unencrypted communication service. But even with secure image upload, image hashing, and a quick deletion process, victims of online harassment will have to share their intimate images with a Facebook employee. It sounds counterintuitive and even scary.
The alternative, though, is worse—someone you trusted posting those images publicly without your consent. And given the choice between sharing an intimate photo with one person at Facebook or having it non-consensually shared with the world, Facebook’s solution starts to make much more sense. Uploading the image to Facebook will get it blocked on Facebook, Instagram, and Messenger, wiping out some of the biggest image distribution platforms in the world.
“I think we have to recognise there is a group of people who are being threatened with this as a tactic for abuse and humiliation,” says Olsen. “We don’t want to underestimate how terrifying it is for people to have someone threaten to distribute nude or explicit and private images or content. I don’t think that the majority of people who opt into this process are submitting just to submit. I think people will be submitting them because there exists a fear and a reason for that fear—that someone else may try to distribute these photos to cause them harm.”
Facebook’s approach isn’t foolproof. Image hashing systems can be tricked if the image is altered slightly. “The photo matching technologies are getting better and better,” Davis says. “It is not a one-hundred-percent effective solution. It is important to realise that in dealing with bad actors, there will always be a bit of a cat-and-mouse game where people will try to get around the systems that you build.”
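The cat-and-mouse dynamic Davis describes is easy to demonstrate. With an exact (cryptographic) hash, changing even a single byte of a file produces a completely different fingerprint—which is why matchers like PhotoDNA compare perceptual similarity with some tolerance instead of exact file hashes. The byte string below is a placeholder, not real image data:

```python
# Why exact-match hashing is trivially evaded: one appended byte
# yields an entirely different cryptographic fingerprint.
import hashlib

original = b"placeholder image bytes"       # stand-in for a real file
tweaked = original + b"\x00"                # e.g. one junk byte appended

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()

print(h1 == h2)  # False: an exact-match blocklist misses the tweaked copy
```

Perceptual systems close part of this gap by tolerating small differences, but as Davis notes, more aggressive edits—cropping, filtering, re-encoding—can still defeat them, which is why the matching technology keeps evolving.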
In addition to its pilot program to assist victims, Facebook is also taking a close look at the people who post revenge porn and trying to figure out how to discourage them, according to Davis.
“We are doing some work into understanding what you might think of as recidivist behaviour, for example. They may share in one place and be blocked and try to share in another place. We’ll continue to do research to understand not just the content that’s being shared but the behaviour,” Davis says.
As Facebook expands the program into more countries, the company also needs to grapple with which kinds of images are considered intimate in those cultures. In the United States, Facebook initially defined revenge porn as nude images or videos that were meant to remain private. But it later expanded its definition to include other intimate images that were near-nude, such as images in which a person was wearing lingerie. And that definition will keep shifting as the program grows.
“There definitely are some interesting and challenging cultural issues that we will have to sort through as we develop a larger-scale solution. In some parts of the world, it’s not just an intimate image that has nudity that could put a woman at risk,” Davis explains. “Sometimes just an image with a male in a setting alone can put them in danger if people in their community know about it. We really wanted to limit this to what we define as NCII [nonconsensual intimate imagery] to start and then, as we learn more, address those other considerations as well.”