March 15 (UPI) -- Social networking company Facebook said Friday it's rolling out a new artificial intelligence tool that can detect and remove images and videos posted online without users' consent.
The company said the tool is intended to prevent "revenge" postings that target specific people, such as an ex-girlfriend or spouse. The tool replaces the manual method previously available, in which users had to report instances to Facebook themselves.
The system detects "near nude" content, which is then passed to a human moderator for review. One motivation behind the tool is so-called "revenge porn," in which someone publicly posts sexual images of an ex.
"Finding these images goes beyond detecting nudity on our platforms," Facebook Global Head of Safety Antigone Davis said in a statement. "We can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.
"This means we can find this content before anyone reports it."
Davis said people targeted in "revenge" schemes are often reluctant to report it for fear of retribution -- or they're simply unaware it's been posted.
Facebook didn't elaborate on how the tool detects "non-consensual intimate images," but it also announced a new topic hub called Not With My Consent and promised users it will act swiftly in "revenge" cases.
"In recent years we've used photo-matching technology to keep them from being re-shared. To find this content more quickly and better support victims, we're announcing new detection technology and an online resource hub to help people respond when this abuse occurs," the company said.