Facebook has implemented a new photo-matching technology to ensure people can’t re-share images previously reported and tagged as revenge porn — intimate photos of people shared without their consent. That means if someone tries to share a photo that Facebook has previously taken down, that person will see a pop-up saying the photo violates Facebook’s policies and that Facebook will not allow the person to share that particular photo on Facebook, Messenger or Instagram.
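Facebook has not published the details of its matcher, but photo-matching systems like this are typically built on perceptual hashing: a compact fingerprint derived from how an image looks, so that re-encoded or lightly edited copies still match. The sketch below uses a simple "average hash" as an illustrative assumption, not Facebook's actual implementation, and operates on a small grayscale pixel grid to stay self-contained.

```python
def average_hash(pixels):
    """Compute a simple average hash of a grayscale image.

    `pixels` is a 2D list of brightness values (0-255), assumed already
    downscaled to a small fixed size (e.g. 8x8). Each bit records whether
    a pixel is brighter than the image's mean, so the hash survives
    re-encoding and minor edits that change the raw bytes but not the
    image's overall appearance.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_reported(candidate_hash, reported_hashes, threshold=5):
    """Return True if the candidate is within `threshold` bits of any
    previously reported image's hash -- i.e., the re-share would be
    blocked in a system like the one described above."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in reported_hashes)
```

In a scheme like this, a brightness-shifted copy of a reported image still produces the same (or a nearby) hash, so the re-share attempt is caught even though the file's bytes differ.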
“We’ve focused in on this because of the unique harm that this kind of sharing has on its victims,” Facebook Global Head of Safety Antigone Davis told me. “In the newsroom post we refer to a specific piece of research around the unique harm this has for victims. I think that’s where the focus was for this moving forward.”
The research Davis is referring to, US Victims of Non-Consensual Intimate Images, found that 93% of people affected by the sharing of non-consensual intimate images report "significant emotional distress" and 82% report significant difficulties in other aspects of their lives.
Although Facebook has enabled people to report images for some time, the reporting language around revenge porn is now clearer and "very specific to these types of intimate images," Davis said. In "many" cases, Facebook will also deactivate the account of the person who posted the revenge porn.
Facebook has also partnered with a handful of organizations, like the Cyber Civil Rights Initiative and the Revenge Porn Helpline, to offer support to people who are victims of revenge porn.
Last year, Facebook Director of Engineering for Applied Machine Learning Joaquin Candela told TechCrunch that the platform was using AI to detect and report offensive photos, but it seems that in instances of revenge porn, humans are still needed.
“At this moment, we’re not using AI to go through this particular content,” Davis said. “There is significant context that’s required for reviewing non-consensual sharing.”