Facebook says it now uses machine learning to proactively detect revenge porn

Facebook has revealed its latest efforts to curb the spread of so-called “revenge porn” across the social network.
Revenge porn, the sharing of intimate photos or videos without the subject's consent, has become far easier in the age of camera phones, messaging apps, and always-on connectivity. While many countries now have laws specifically addressing revenge porn, legislation doesn't stop the practice from happening, and by the time an intimate image has spread, legal recourse will likely come too late for the victim.
Like other technology companies, Facebook has introduced a number of tools over the years designed to tackle revenge porn. Back in 2017, Facebook launched a new Report button to make it easier to report intimate content shared on Facebook, Messenger, or Instagram. At the time, it also said it would use image recognition technology to ensure that a photo detected as revenge porn could not be reshared in the future.
Later, Facebook allowed its users to take a more proactive approach by submitting a digital copy of the image to help Facebook automatically block any future attempts to share it publicly.
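Matching a previously submitted image against new uploads is typically done with a perceptual hash, which stays stable under recompression or minor edits. The sketch below is a minimal "average hash" in plain Python, purely illustrative; Facebook has not disclosed its actual hashing scheme, and the function names and 8×8 input size are assumptions for the example.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (64 ints, 0-255, row-major).

    Each bit of the 64-bit result is 1 if that pixel is brighter
    than the image's mean brightness.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits between two hashes; small = likely same image."""
    return bin(h1 ^ h2).count("1")

# A recompressed copy of an image yields a hash close to the original's,
# while an unrelated image lands far away.
original = [10] * 32 + [200] * 32
recompressed = [12] * 32 + [198] * 32   # slight pixel drift survives hashing
unrelated = [10, 200] * 32

assert hamming_distance(average_hash(original), average_hash(recompressed)) <= 4
assert hamming_distance(average_hash(original), average_hash(unrelated)) > 10
```

A service storing only these hashes, rather than the images themselves, can block future upload attempts whose hash falls within a small Hamming distance of a reported one.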
Burden
With its latest endeavor, Facebook is now striving to remove the burden from the victim altogether by automatically detecting “near nude” images or videos that are shared without permission across Facebook or Instagram.
“This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” noted Antigone Davis, Facebook’s global head of safety, in a blog post.
Moving forward, Facebook’s machine learning smarts will flag visual content it thinks may constitute revenge porn, and a human employee will review the content.
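The described flow, a model flags content and a human makes the final call, can be sketched as a simple triage step. Everything here is hypothetical: the threshold value, function names, and queue structure are illustrative assumptions, not details Facebook has published.

```python
# Assumed cutoff for routing to human review; not a real Facebook value.
REVIEW_THRESHOLD = 0.8

def triage(content_id, model_score, review_queue):
    """Queue flagged content for a human moderator instead of
    removing it automatically; the reviewer makes the final decision."""
    if model_score >= REVIEW_THRESHOLD:
        review_queue.append(content_id)
        return "queued_for_review"
    return "no_action"

queue = []
assert triage("post-1", 0.93, queue) == "queued_for_review"
assert triage("post-2", 0.12, queue) == "no_action"
assert queue == ["post-1"]
```

Keeping a human in the loop for borderline content is what allows the appeals process Davis describes: the automated score only decides whether a reviewer looks, never whether content is removed.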
“If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission,” Davis added. “We offer an appeals process if someone believes we’ve made a mistake.”
In addition to the new auto-detection technology, Facebook is also launching a new support hub called Not Without My Consent, which offers related tools and resources for revenge porn victims.
Privacy
It is impossible for Facebook to manually vet every single piece of content shared on its platform, which is why artificial intelligence (AI) has played an increasingly large part in its moderation efforts. Over the years, however, Facebook has courted controversy over the way its algorithms decide what content is pushed to the masses and what is blocked. Case in point: it removed the Pulitzer Prize-winning "napalm girl" photograph for featuring nudity and banned the writer who shared it. Around the same time, Facebook lost a bid to prevent a lawsuit from a 14-year-old girl whose nude photos appeared on the social network.
While these cases reveal the imperfections of Facebook's automated moderation, they also show that Facebook is fighting an uphill battle to salvage its reputation after a year embroiled in privacy scandals. And despite CEO Mark Zuckerberg's recent pledge that the company would pursue a "privacy first" approach in the future, many people remain skeptical.