After a social media campaign led by feminist groups pressured it to revisit its policies and procedures on harmful and hateful content, Facebook admitted that it had failed to deal effectively with hate speech.
In a Tuesday blog post, the social media site said: "We need to do better -- and we will."
The post continued: "In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate. In some cases, content is not being removed as quickly as we want. In other cases, content that should be removed has not been or has been evaluated using outdated criteria. We have been working over the past several months to improve our systems to respond to reports of violations, but the guidelines used by these systems have failed to capture all the content that violates our standards."
Facebook addressed the issue after a coalition of feminist groups, working together as "Women, Action, & the Media," wrote an open letter to the social networking site calling for "swift, comprehensive and effective action addressing the representation of rape and domestic violence on Facebook."
The advocacy group called out Facebook for failing to censor pages like "Fly Kicking Sluts in the Uterus," "Kicking your Girlfriend in the Fanny because she won’t make you a Sandwich," "Violently Raping Your Friend Just for Laughs," and "Raping your Girlfriend."
"Women, Action, & the Media" praised Facebook's response to their complaints, saying in a statement that the group had been invited to help the site address hate speech.
Facebook advertisers such as Dove and Zappos also faced pressure for supporting a site that didn't actively deal with hate speech against women, according to The New York Times.
“So, Dove, you’re willing to make money off of us, but not willing to lift a finger to let Facebook know violence against women isn’t acceptable?” one commenter wrote on the company's page.
To improve the way it monitors inappropriate content, Facebook pledged to review its Community Standards, update training for site monitors, increase creator accountability, and work with activist groups "to assure expedited treatment of content they believe violate our standards."