1 of 2 | While Facebook and its parent company Meta have pledged to monitor or remove white supremacist content from the platform, research published Wednesday suggests the company has not only failed to do so but continues to make money from these groups' presence. File Photo by Sascha Steinbach/EPA-EFE
Aug. 10 (UPI) -- According to a report published Wednesday by the Tech Transparency Project, white supremacist groups are still active on Facebook and the social media platform is making money from their presence.
The TTP investigation found more than 80 white supremacist groups -- some of them labeled by Facebook itself as dangerous organizations -- still on the platform.
After conducting Facebook searches for 226 white supremacist groups identified by the Southern Poverty Law Center and the Anti-Defamation League, TTP found more than a third -- 37% -- still had a Facebook presence.
TTP found the white supremacist groups were associated with 119 Facebook pages.
Of those, 69 triggered Facebook's Related Pages feature, which recommends additional content to users. The result, TTP said, is that the feature "frequently directed" users to other forms of extremist content.
TTP said in a statement that its investigation findings "underscore Facebook's inability, or unwillingness, to remove white supremacists from its platform despite the obvious dangers they pose to U.S. society."
"Despite numerous warnings about its role in promoting extremism, Facebook has failed to effectively address the presence of white supremacist organizations on its platform," the TTP statement said. "To make matters worse, the company is often monetizing searches for these hate groups, profiting off them through advertising."
Facebook said after meeting with civil rights groups in 2019 that "white nationalist" and "white separatist" groups would be banned.
TTP said in a statement Wednesday that its investigation found Facebook searches for groups with "Ku Klux Klan" in their name generated ads for Black churches, a potentially dangerous highlighting of Black institutions to users searching for white supremacist content.
The TTP report concluded that, "In some cases, Facebook is automatically generating Pages for white supremacist organizations, and it's amplifying hateful content through its algorithmic Related Pages feature."
The report said that while Facebook says publicly it bans white supremacist organizations, "these findings make clear that the company is more focused on profit than it is on removing hate groups and hateful ideologies that it has promised to purge from its platform."