Calling Facebook’s past decisions “disastrous,” a former product manager for the company who leaked internal research showing the harm its platforms can cause urged lawmakers Tuesday to take action to regulate the social media giant.

What You Need To Know

  • A former product manager for Facebook who leaked internal research showing the harm its platforms can cause urged lawmakers Tuesday to take action to regulate the social media giant

  • Frances Haugen, who worked in Facebook’s civic integrity unit, testified before the Senate Subcommittee on Consumer Protection, Product Safety and Data Security 

  • Haugen testified that "Facebook's products harm children, stoke division and weaken our democracy" and accused the company of putting "their astronomical profits before people"

  • Late Tuesday, Facebook CEO Mark Zuckerberg said that many of Haugen's claims "don't make any sense" and insisted the company does not prioritize profits over safety and well-being

Frances Haugen, who worked in Facebook’s civic integrity unit, testified before the Senate Subcommittee on Consumer Protection, Product Safety and Data Security in a hearing that was largely focused on protecting children online but veered into other areas, including political violence.

Haugen revealed in a “60 Minutes” interview that aired Sunday that she is the whistleblower who secretly copied tens of thousands of pages of internal research documents before leaving her job. She shared the documents with federal regulators and The Wall Street Journal, which published a series of explosive articles.

The research found, in part, that Facebook has been aware that its Instagram photo-sharing app has caused mental-health and body-image problems for some young people, as well as eating disorders and suicidal thoughts. The analyses also found that Facebook’s algorithm recommends hateful or harmful content to users. Meanwhile, the company continued to publicly downplay the negative impacts of its platforms.

“I'm here today because I believe Facebook's products harm children, stoke division and weaken our democracy,” Haugen said in her opening remarks. “The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people.”

Haugen said she came to recognize during her time with the company just how opaque its operations are to the outside world. Unless it becomes more transparent, Congress will have a difficult time passing necessary regulations and conducting oversight, she said. 

“Facebook wants to trick you into thinking that privacy protections or changes to Section 230 alone will be sufficient,” Haugen said. “While important, these will not get to the core of the issue, which is that no one truly understands the destructive choices made by Facebook except Facebook. We can afford nothing less than full transparency. As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change.

"Facebook has not earned our blind faith," she added.

Section 230 is the provision in the 1996 Communications Decency Act that gives online platforms legal immunity from liability for content posted on the internet.

Haugen recommended that a congressional oversight body be created and that Section 230 be reformed so that tech companies aren’t protected for content promoted by their algorithms.

“User-generated content is something that companies have less control over. They have 100% control over their algorithms,” she said. “And Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety.”

Haugen took aim at Facebook’s use of “engagement-based ranking,” which she said gives greater weight in feeds to content with high levels of user interaction. The algorithm favors posts that are more likely to elicit a strong emotional reaction from people, including content featuring anorexia or fanning the flames of ethnic violence in Ethiopia, Haugen said.

She said the website BuzzFeed once wrote a letter to Facebook complaining that a change to the algorithm led to the content it was most ashamed of performing best on Facebook. Haugen also said politicians have been known to take positions unpopular with their constituents because doing so will spread wider on the platform.

According to The Wall Street Journal, Facebook’s research found that 40% of Instagram users who reported feeling “unattractive” said the feeling began on the app. About a quarter of teens said Instagram spurred feelings of not being good enough.

And 13% of British users and 6% of American users who reported having suicidal thoughts traced the desire to kill themselves to Instagram.

Haugen also said Facebook’s research has shown that Instagram has become another avenue for bullying. She noted that years ago high school students could escape verbal abuse from classmates in the hours they were away from school, but today, because of Instagram, “the last thing they see before they go to bed at night is someone being cruel to them, or the first thing they see in the morning is someone being cruel to them.”

Meanwhile, Instagram has been planning to roll out a version of the app for children under 13 years of age. Facing public pressure, it announced last week it has paused work on “Instagram Kids” in order to consult with parents, experts, policymakers and regulators. 

Haugen said she would be “sincerely surprised” if the company pulls the plug on Instagram Kids.

“Facebook understands that if they want to continue to grow, they have to find new users, they have to make sure that the next generation is just as engaged in Instagram as the current one,” she said. “And the way they'll do that is by making sure that children establish habits before they have good self-regulation.”

“By hooking kids?” asked Sen. Brian Schatz, D-Hawaii.

“By hooking kids,” Haugen answered.

Facebook has defended the plans for Instagram Kids by saying children under 13 are lying about their age in order to gain access to social media platforms and that the new version of the app would be safer because it would be tailored to them and include parental controls.

Haugen said the moment she realized she needed to speak out was when Facebook disbanded its civic integrity team shortly after the 2020 presidential election, dividing up its responsibilities among other departments.

“It really felt like a betrayal of the promises that Facebook had made to people who had sacrificed a great deal to keep the election safe,” she said.

She also said Facebook put safeguards in place before the election to prevent misinformation, foreign interference and violence but then lifted them shortly after the election. 

“Facebook changed those safety defaults in the runup to the election because they knew they were dangerous, and because they wanted that growth back, they wanted the acceleration on the platform back after the election, they returned to their original defaults,” Haugen said. “And the fact that they had to break the glass on Jan. 6 (after the Capitol riot) and turn them back on, I think that's deeply problematic.”

Haugen also blamed many of Facebook’s problems on departments being too understaffed to tackle all the problems they face.

Tuesday’s hearing was the latest in a series aimed at proposing new internet regulations, including on social media.

Sen. Roger Wicker, R-Miss., said the revelations about Facebook “show how urgent it is for Congress to act against powerful tech companies on behalf of children and the broader public.

“They are possessive of immense, immense power,” he said. “Their product is addictive.”

Sen. Amy Klobuchar, D-Minn., told Haugen that she might be the catalyst for action, but the senator also noted that Congress has consistently failed to reform federal privacy laws. 

“Zilch in any major way,” she said. “Why? Because there are lobbyists around every single corner of this building that have been hired by the tech industry.”

Some senators took aim at Facebook founder Mark Zuckerberg. Sen. Richard Blumenthal, D-Conn., the subcommittee’s chairman, accused Zuckerberg of choosing to go sailing over answering questions, referencing a video the Facebook CEO posted online Sunday. 

“His new modus operandi? No apologies, no admission, no action, nothing to see here,” Blumenthal said. 

Sen. Ed Markey, D-Mass., had a message for Zuckerberg: “Your time of invading our privacy, promoting toxic content and preying on children and teens is over. Congress will be taking action.”

Late Tuesday, Zuckerberg posted on Facebook that many of Haugen's claims "don't make any sense."

"If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?" he wrote. "If we didn't care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space -- even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we're doing?

"At the heart of these accusations is this idea that we prioritize profit over safety and well-being," he added. "That's just not true."

In a statement to Spectrum News, Lena Pietsch, Facebook’s director of policy communications, attempted to discredit Haugen but agreed that new, uniform internet rules are needed.

“Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question,” Pietsch said. “We don’t agree with her characterization of the many issues she testified about. 

“Despite all this, we agree on one thing; it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

In a separate hearing before the same subcommittee last week, Antigone Davis, Facebook’s head of global safety, said the company strongly disagrees with how The Wall Street Journal’s reports characterized its research. She repeatedly cited the company researchers’ finding that 11 of 12 teen girls who said they were struggling with issues such as loneliness, sadness, anxiety and eating disorders reported that Instagram helped them more than harmed them.

“At Facebook, we take the privacy, safety and well-being of all those who use our platform very seriously, especially the youngest people on our services,” Davis said. “We work tirelessly to put in the right policies, products and precautions so they have a safe and positive experience.”

Haugen said Davis’ defense is further proof that Facebook needs to stop “focusing on all the good” and “admit they have responsibilities to also remedy.”

She said she believes Facebook’s problems are solvable. She added that the company needs to first declare “moral bankruptcy” by admitting it made mistakes and then ask for help from Congress as it charts a new path forward.

“A safer, free speech respecting more enjoyable social media is possible. But there's one thing that I hope everyone takes away from these disclosures: It is that Facebook can change, but it's clearly not going to do so on its own. My fear is that without action, divisive and extremist behaviors we see today are only the beginning,” she said, referencing the conflict in Ethiopia and the coup in Myanmar.

Note: This article was updated with Mark Zuckerberg's comments.