Facebook faced mounting pressure on Friday after a new whistleblower accused it of knowingly hosting hate speech and illegal activity, even as leaked documents shed further light on how the company failed to heed internal concerns over election misinformation.
Allegations by the new whistleblower, who spoke to the Washington Post, were reportedly contained in a complaint to the Securities and Exchange Commission, the US agency that regulates publicly traded companies to protect investors.
In the complaint, the former employee detailed how Facebook officials frequently declined to enforce safety rules for fear of angering Donald Trump and his allies or jeopardizing the company’s huge growth. In one alleged incident, Tucker Bounds, a Facebook communications official, dismissed concerns about the platform’s role in 2016 election manipulation…
The claims echo those of the whistleblower Frances Haugen, a former Facebook product manager who has said the company repeatedly prioritizes profit over public safety. Haugen’s recent damning testimony before the US Congress, and her forthcoming testimony before the UK parliament, have prompted a major PR crisis for the social network, which is said to be readying plans for a rebrand.
The whistleblower claims came on the same day that news outlets, including the New York Times, the Washington Post and NBC, published reports based on internal documents shared by Haugen. The documents offer a deeper look into the spread of misinformation and conspiracy theories on the platform, particularly related to the 2020 US presidential election.
The documents show that Facebook employees repeatedly flagged concerns before and after the election, when Donald Trump was trying to overturn Joe Biden’s victory with false claims of fraud. According to the New York Times, a company data scientist told coworkers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent. But even as workers flagged these issues and urged action, the company failed or struggled to address the problems, the Times reported.
The internal documents also show Facebook researchers have found the platform’s recommendation tools repeatedly pushed users to extremist groups, prompting internal warnings that some managers and executives ignored, NBC News reported.
In one striking internal study, a Facebook researcher created a fake profile for “Carol Smith”, a conservative female user whose interests included Fox News and Donald Trump. The experiment showed that within two days, Facebook’s algorithm was recommending “Carol” join groups dedicated to QAnon, a baseless internet conspiracy theory.
The reports come as Facebook faces pressure from lawmakers on various fronts – including pending legislation in Congress, a lawsuit filed by state attorneys general, and a Federal Trade Commission lawsuit filed under the agency’s new chairwoman, Lina Khan.
Facebook watchdogs say the latest whistleblower accounts of wrongdoing underscore the need to regulate the platform.
“It’s time for Congress and the Biden administration to investigate a Facebook business model that profits from spreading the most extreme hate and disinformation,” said Jessica J González, co-CEO of the civil rights organization Free Press Action. “It’s time for immediate action to hold the company accountable for the many harms it’s inflicted on our democracy.”
Responding to the Post about the whistleblower’s claims, Bounds said: “Being asked about a purported one-on-one conversation four years ago with a faceless person, with no other sourcing than the empty accusation itself, is a first for me.”
Erin McPike, a Facebook spokeswoman, also criticized the Post’s reporting, saying in a statement to the news organization that it set “a dangerous precedent to hang an entire story on a single source making a wide range of claims without any apparent corroboration”.
“This is beneath the Washington Post, which during the last five years would only report stories after deep reporting with corroborating sources,” she told the Guardian in a statement.
But the reports align with what others have shared about the company. In her testimony, Haugen said Facebook at one point tweaked its algorithm to improve safety and decrease inflammatory content but abandoned the changes after the election, a decision she tied directly to the 6 January riot at the Capitol. Facebook also disbanded its civic integrity team after the election.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me,” she said in her testimony on 5 October.
Referring to the algorithm change, Haugen added: “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, and [Facebook] will make less money.”
Haugen’s own complaints to the SEC alleged that Facebook leadership avoided disclosing such issues in filings available to investors; the agency is tasked with scrutinizing whether public companies should disclose such information.
The Guardian
Kari Paul in San Francisco and Dani Anguiano in Los Angeles