As It Happens

Why Facebook is asking for nude photos in order to combat revenge porn

Facebook is testing a pilot project to halt the spread of revenge porn, and it involves asking users to send the company nudes. The Daily Beast's Joseph Cox explains.
An illustration picture shows a woman looking at the Facebook website. Facebook has launched a pilot program in Australia to tackle so-called revenge porn. (Michael Dalder/Reuters)

Story transcript

Facebook has launched a pilot project to halt the spread of revenge porn on its platform, and it involves asking users to send the company nudes.

The project was launched in partnership with the government of Australia to combat revenge porn, a term used to describe explicit photos shared online without a person's consent.

Facebook says users who are concerned they could be targeted by revenge porn can file a complaint with Australia's eSafety Commissioner, then send their nude images to themselves via Messenger.

These two actions combined will prompt a Facebook employee to review the images and create a "digital fingerprint" of them, which will then be used to prevent those same images from showing up elsewhere on the platform.
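Facebook has not said exactly how the fingerprinting works, but the general approach, hashing an image once and then checking future uploads against the stored hash rather than against the image itself, can be sketched in a few lines of Python. The names below are hypothetical, and production photo-matching systems rely on perceptual hashes that tolerate resizing and recompression, not the exact-match digest used here for brevity.

    # Illustrative sketch only: Facebook has not published its fingerprinting code.
    # A plain SHA-256 digest only catches exact byte-for-byte copies; real systems
    # use perceptual hashes that survive resizing, cropping and recompression.
    import hashlib

    def fingerprint(image_bytes: bytes) -> str:
        """Return a hex digest standing in for the image's 'digital fingerprint'."""
        return hashlib.sha256(image_bytes).hexdigest()

    # Fingerprints of images users have asked to block (hypothetical store).
    blocked_fingerprints = set()

    def register_blocked_image(image_bytes: bytes) -> None:
        # Only the fingerprint is kept; the image itself can be discarded.
        blocked_fingerprints.add(fingerprint(image_bytes))

    def is_blocked(upload_bytes: bytes) -> bool:
        # Checked against every new upload before it is published.
        return fingerprint(upload_bytes) in blocked_fingerprints

The point of the design is in the last two functions: once the fingerprint exists, the original photo is no longer needed for matching.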

"We don't want Facebook to be a place where people fear their intimate images will be shared without their consent. We're constantly working to prevent this kind of abuse and keep this content out of our community," a Facebook spokesperson told As It Happens. 

The Daily Beast's Joseph Cox has been reporting on the new initiative. He spoke to As It Happens host Carol Off.

What did you first think when Facebook said that it might have a way of preventing revenge porn?

It's long overdue. Facebook, in particular, really needs some way to tackle this problem. But it's not just Facebook. Twitter as well, other social networks, and other sorts of online communities. Revenge porn is pervasive on the internet today.

And what do you make of this proposal?

It's the logical approach. At some point you're going to need a human to look at the photos, which is what Facebook is proposing in this pilot. But, of course, that is not without controversy.

These people need to decide whether they would rather take preventative measures to stop their nude photos from becoming public, and whether they're comfortable with a Facebook employee or a contractor seeing those images in the process.

This depends on women themselves sending Facebook nude pictures of themselves, right?

Yeah, that's right. 

The key difference from a similar problem, which is child abuse or child pornography, is that there's already a very large database of child sexual exploitation imagery that the FBI and other law enforcement bodies collect and bring together, so Facebook can then use that database.

There's no such thing for revenge porn. There's no central repository of all the images. Unfortunately, it does look like the only solution at the moment is for men or women to submit the photos they would like flagged in the future.

Why on Earth would anybody trust Facebook to send their nude photos to them?

You're right. I mean, Facebook, and probably social media more broadly, clearly does not have a great reputation or track record in this space when it comes to flagging things like terrorist content or, to a lesser extent, child pornography, because that problem has been largely curbed.

But with this particular pilot it will boil down to whether the individual sees more of a risk of their image being publicly posted on Facebook or sent among their friends, or whether they will, as you say, have to trust Facebook, or more specifically one or two employees at Facebook, to make the right judgement.

The success of Facebook's pilot program depends on whether people trust the social network enough to use it. (Sean Gallup/Getty Images)

Do they have any idea, those women and men, what Facebook might do with those images? How secure is that information, those pictures?

Facebook, at least in my conversations with the company, says they're only storing the images for a period of time. They didn't specify how long that is, but considering this is a pilot, it certainly won't be indefinite and it won't be too long.

[Editor's Note: Facebook told As It Happens it deletes the original image as soon as the user removes it from Messenger.]

The idea is that once the system is up and running and they've confirmed it functions as intended, they won't need to store the actual images themselves; they can just store that special fingerprint, which will identify the photos without keeping the original image.

But doesn't everybody tell us the last thing you should ever do is send pictures of yourself, compromising pictures of yourself, on the internet?

In this case you're sending the message yourself on Facebook Messenger, and from that a small group of Facebook employees will see it. But that is obviously still a risk. It may be one that people aren't willing to take.

As we've seen recently, when a Twitter contractor shut off President Donald Trump's Twitter account, or in the past, when Uber employees abused the visibility their app gives them into their customers, there are always people behind the tech and behind the companies that power our connectivity across the world.

This interview has been edited for length and clarity.