Former Facebook employee Frances Haugen testifies Tuesday during a Senate Committee on Commerce, Science, and Transportation hearing titled "Protecting Kids Online: Testimony from a Facebook Whistleblower" on Capitol Hill in Washington, D.C. Pool Photo by Drew Angerer/UPI
Oct. 5 (UPI) -- Facebook whistleblower Frances Haugen testified before the Senate on Tuesday that the social media company has long known about misinformation and hate speech on its platform, as well as the platform's negative impact on young users.
At the hearing, Haugen explained to a Senate Commerce Committee panel how she believes Facebook's Instagram platform negatively affects children.
"I am here today because I believe that Facebook's products harm children, stoke division and weaken our democracy," Haugen said during opening remarks.
"The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary change because they have put their astronomical profits before people," she said. "Congressional action is needed. They won't solve this crisis without your help."
Haugen said the social media site's algorithms can quickly steer children from safe content like healthy recipes to content about eating disorders. She called on lawmakers to demand more transparency into the company's algorithms and internal metrics to guide how to regulate the company.
Haugen took aim at Section 230 of the Communications Decency Act, which protects platforms from legal liability for content posted by their users. She suggested exempting platform decisions about algorithms from those protections, which would expose the company to lawsuits regarding how content is ranked on users' feeds.
"Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or a re-share," Haugen said. "Those clicks and comments and re-shares aren't necessarily for your benefit, but because they know other people will produce more content if they get the likes and comments and re-shares."
Monika Bickert, vice president of content policy at Facebook, said it was "not true" that the platform's algorithms are designed to push inflammatory content.
"We do the opposite, in fact, and if you look in our transparency center, you can actually see that we demote, meaning we reduce the visibility of engagement bait, click bait, and why would we do that? One big reason is for the long-term health of our services, we want people to have a good experience," Bickert told CNN.
During her testimony, Haugen said the company's systems for catching offending content, such as hate speech, catch "a very tiny minority of offending content." Because of the company's "deep focus on scale," she said, Facebook is unlikely to ever catch more than 10% to 20% of offending content.
Haugen also said the Facebook platform is "definitely" being used by "authoritarian or terrorist-based leaders" around the world.
"My team directly worked on tracking Chinese participation on the platform, surveilling, say Uighur populations, in places around the world. You could actually find the Chinese based on them doing these kinds of things," she said. "We also saw active participation of, say, the Iran government doing espionage on other state actors."
Despite the national security threat, Haugen said she did not believe Facebook was adequately prepared to monitor and combat this behavior.
"Facebook's consistent understaffing of the counterespionage information operations and counter terrorism teams is a national security issue, and I'm speaking to other parts of Congress about that ... I have strong national security concerns about how Facebook operates today."
Haugen identified herself as a whistleblower on CBS' 60 Minutes on Sunday, saying Facebook has prioritized profits over public safety and was aware of research that showed the negative impact of some policies on young users.
A former data scientist for Facebook, Haugen is pushing Congress for new rules that address the concerns she's raised.
"When we realized tobacco companies were hiding the harms it caused, the government took action," she said in her opening statement.
"When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids. I implore you to do the same here."
Haugen said she has filed at least eight complaints with the Securities and Exchange Commission claiming that Facebook is hiding key research from investors and the public.
Facebook has pushed back against Haugen's accusations, saying it does not encourage "bad content" and continually works to root out harmful information.
"We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority," Lena Pietsch, Facebook director of policy communications, told CBS News on Sunday.
"If any research had identified an exact solution to these complex challenges, the tech industry, governments and society would have solved them a long time ago," Pietsch added. "We have a strong track record of using our research -- as well as external research and close collaboration with experts and organizations -- to inform changes to our apps."
Sen. Richard Blumenthal, D-Conn., on Tuesday urged Facebook founder and CEO Mark Zuckerberg to testify before the committee in response to the allegations.
"Mark Zuckerberg ought to be looking at himself in the mirror today, and yet, rather than taking responsibility and showing leadership, Mr. Zuckerberg is going sailing," Blumenthal said. "No apologies, no admission, no action, nothing to see here. Mark Zuckerberg, you need to come before this committee you need to explain to Francis Haugen, to us, to the world and to the parents of America what you were doing and why you did it."
Zuckerberg on Tuesday night denied the allegations made in Haugen's testimony, sharing in a Facebook post a statement that was also sent to company employees.
"Now that today's testimony is over, I wanted to reflect on the public debate we're in. I'm sure many of you have found the recent coverage hard to read because it just doesn't reflect the company we know," he wrote. "We care deeply about issues like safety, well-being and mental health. It's difficult to see coverage that misrepresents our work and our motives."
Zuckerberg said that the accusations that Facebook prioritizes profit over safety and well-being are "just not true."
"The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads and advertisers consistently tell us they don't want their ads next to harmful or angry content," he wrote. "And I don't know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction."
He also highlighted services such as Messenger Kids that are designed to protect children.
"Of everything published, I'm particularly focused on the questions raised about our work with kids," wrote Zuckerberg. "I've spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it's very important to me that everything we build is safe and good for kids."