A woman holds a "thumbs down" sign outside the U.S. Capitol on Thursday during a Free Press Action rally demanding that Facebook ban 'Team Trump' advertisements in accordance with its June decision to bar former President Donald Trump. Facebook whistleblower Frances Haugen accused Facebook of picking profits over protections. Photo by Bonnie Cash/UPI
Oct. 4 (UPI) -- Former Facebook data scientist Frances Haugen told CBS' 60 Minutes in a segment that aired Sunday that the social media giant repeatedly put profits before the public and its users, and said she has turned over internal documents to federal regulators.
Haugen, who previously worked at Google and Pinterest, said in the segment that Facebook's own research showed the platform posed potential harm to teenage girls and amplified hate and misinformation.
She said the company also ended safety protections for the 2020 elections before the Jan. 6 attack on the U.S. Capitol. Haugen is expected to testify before a Senate committee this week.
She said that after the attack on the Capitol, Facebook employees complained on an internal messaging board.
"Facebook makes more money when you consume more content," Haugen said. "People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume."
Her attorneys have filed at least eight complaints against Facebook with the Securities and Exchange Commission, alleging the company misled investors.
"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook," Haugen said. "And Facebook, over and over again, chose to optimize for its own interests, like making more money."
Facebook responded to CBS News with a lengthy statement from Lena Pietsch, the social media company's director of policy communications.
"Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place," Pietsch said.
"We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true," Pietsch said.
Pietsch said Facebook has developed a strong track record of using its own research and research from third-party experts to identify challenges with policing hate speech, misinformation and other issues.
Facebook said it was not to blame for the Jan. 6 attack and that safety measures taken during the 2020 election were adjusted after the election concluded. The company said it had also banned "tens of thousands" of QAnon and other conspiracy theory sites before Jan. 6.