A former Facebook employee told members of Congress on Tuesday that the company knows its platform spreads false information and content that is harmful to children, but refuses to make changes that could hurt its income.
Speaking before the Senate Subcommittee on Consumer Protection, former Facebook data scientist Frances Haugen told lawmakers that new regulations are needed to force Facebook to improve its platforms.
However, she refrained from calling for the company to be broken up, arguing that doing so would not fix the existing problems and would instead leave Facebook as a “Frankenstein” that continues to cause harm around the world while a spun-off Instagram takes most of the advertising dollars.
Efforts to pass new social media regulations have failed in the past, but senators said Tuesday that new revelations around Facebook show the time for inaction is over.
Here are some of the highlights from Tuesday’s hearing:
FACEBOOK KNOWS THAT IT CAUSES HARM TO VULNERABLE PEOPLE
Haugen said Facebook knows that vulnerable people are affected by its systems, from children who are susceptible to feeling bad about their bodies because of Instagram to adults who are more exposed to misinformation after being widowed, divorced or experiencing other forms of isolation such as moving to a new city.
The platform is designed to exploit negative emotions to keep people on the social network, she noted.
“They are aware of the side effects of the decisions they have made around amplification. They know that algorithmic, engagement-based ranking keeps people on their sites longer. You have longer sessions, you come back more often, and that makes them more money,” she explained.
THE WHISTLEBLOWER TOUCHED A NERVE
During the hearing, Tennessee Sen. Marsha Blackburn, the panel’s top Republican, said she had just received a text message from Facebook spokesperson Andy Stone stating that Haugen did not work on child safety or Instagram, did not research those issues, and has no direct knowledge of the matter from her work at Facebook.
Haugen acknowledged on several occasions that she did not work directly on those issues, but said she based her testimony on internal documents she had obtained and on her own experience.
But Facebook’s statement emphasized her limited involvement and relatively short tenure with the company, calling her experience and credibility into question. The tactic did not sit well with everyone.
Facebook’s approach “shows that they don’t have a good answer to all of these problems from the people criticizing it,” said Gautam Hans, an expert in technology law and free speech at Vanderbilt University.
SMALL CHANGES COULD MAKE A BIG DIFFERENCE
According to Haugen, making changes that reduce the spread of misinformation and other harmful content would not require completely reinventing social media. One of the simplest changes would be to display posts in chronological order rather than letting algorithms predict what people want to see based on how much engagement, good or bad, a post might attract.
Another would be requiring an extra click before users can share content, a step that, she noted, Facebook knows can dramatically reduce misinformation and hate speech.
“A lot of the changes I’m talking about are not going to stop Facebook from being a profitable company, it just won’t be a ridiculously profitable company like it is today,” she said.
She added that Facebook won’t make those changes on its own if they could slow growth, despite the company’s own research showing that people use the platform less when exposed to more toxic content.
“You’d think a kinder, friendlier, more collaborative Facebook could have more users within the next five years, so it’s in everyone’s interest,” she said.
A LOOK INSIDE THE COMPANY
Haugen described Facebook’s corporate culture as so machine-like and metric-driven that it was difficult to rein in known harms when addressing them could hurt growth and revenue.
She noted that the company’s famously “flat” organizational philosophy, with few levels of management and an open-plan California headquarters that gathers nearly all staff in one huge room, was an impediment to the kind of leadership needed to put an end to bad ideas.
She said the company was not intended to be a destructive platform, but noted that CEO Mark Zuckerberg holds considerable power because he controls more than 50% of the voting shares, and that letting metrics drive decisions was itself a decision on his part.
“Ultimately, the responsibility lies with Mark,” Haugen said.
BIPARTISAN OUTRAGE
Democrats and Republicans on the panel said Tuesday’s hearing showed the need for new regulations to change how Facebook treats its users and amplifies content. Such efforts have failed in Washington before, but several senators said Haugen’s testimony could be the trigger for change.
“Our differences are very minor, or they appear very minor in light of the revelations we have heard, so I hope we can move forward,” said Sen. Richard Blumenthal, the panel’s chairman.
However, Senator Amy Klobuchar acknowledged that Facebook and other technology companies have a lot of power in the nation’s capital, a power that has already blocked reforms on previous occasions.
“There are lobbyists around every corner of this building that have been hired by the tech industry,” Klobuchar noted. “Facebook and the other tech companies are spending a lot of money in this city and people are listening to them.”