In testimony Tuesday before a Senate hearing, former Facebook employee Frances Haugen argued that Congress needs to require more transparency from the social networking giant.
“I believe that Facebook’s products harm children, stoke division, weaken our democracy and much more,” she said to a Commerce subcommittee focused on consumer protections.
Haugen, 37, a former product manager on Facebook’s civic misinformation team, said that effective regulation of Facebook would need to start with transparency, including allowing “full access to data for research not directed by Facebook.”
- Former Facebook employee Frances Haugen is testifying before a Senate subcommittee about what she says are problems at the social media company that “harm children” and “stoke division.”
- The hearing comes one day after a Facebook global outage, which has not been connected to Haugen coming forward to share internal documents from the company.
- Haugen said Facebook implemented safeguards leading up to the 2020 election but then turned them off afterward. But the Jan. 6 riot at the U.S. Capitol led Facebook to “break the glass” and turn them on again.
On Monday, Facebook had a massive outage that knocked out service to the social media giant’s platforms for users around the world.
Haugen said in her opening statement: “I don’t know why it went down. I know that for more than five hours Facebook wasn’t used to deepen divides, destabilize democracies and make young girls and women feel bad about their bodies.”
Haugen argued that allowing outside entities to see Facebook’s data would help combat many of the problems the social platform creates. Access for external researchers would allow regulators to build “sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more,” she said.
“As long as Facebook is operating in the dark, it is accountable to no one,” she said. Haugen previously worked at Google, Pinterest and Yelp.
Haugen also compared Facebook to Big Tobacco and to pharmaceutical companies that manufacture opioids.
“When we realized tobacco companies were hiding the harms it caused, the government took action,” she said. “When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids. I implore you to do the same here.”
Tuesday’s hearing, and another one last week in which senators questioned Facebook’s head of safety, Antigone Davis, were called after The Wall Street Journal reported on leaked internal research that appeared to identify Instagram’s negative effects on children and teens’ mental health.
“The recent revelations about Facebook’s mental health effects on children, and its plan to target younger audiences are indeed disturbing,” Sen. Roger Wicker, R-Miss., said at the start of the hearing on Tuesday.
Facebook said the Wall Street Journal mischaracterized the findings, according to a blog post released 12 days after the article was published.
Prior to the hearing last week, Facebook said it would pause development of a version of Instagram aimed at children following mounting criticism from child safety groups and lawmakers.
Haugen also alluded to the actions the platform took surrounding Jan. 6, the day of the riot at the U.S. Capitol.
“Facebook has been emphasizing a false choice. They said safeguards that were put in place [ahead of the 2020 election] implicated free speech. But the choices that were happening on the platform were about how reactive and twitchy was the platform. How viral was the platform,” Haugen said.
She added that Facebook changed the safety defaults in the run-up to the election “because they knew they were dangerous and then returned them to their original defaults. They had to break the glass on January 6 and turn them back on and I think that’s deeply problematic.”