On Sunday’s 60 Minutes, the Facebook whistleblower revealed herself as Frances Haugen.
Who is Frances Haugen?
Haugen was a product manager on the civic misinformation team at Facebook until she quit the company earlier this year.
Haugen explained her decision: “I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’ve seen before.”
She had previously worked for Google and Pinterest.
Later, Haugen provided pages of internal research and documents to the Wall Street Journal for its Facebook investigation.
Last Friday, WSJ released a series of articles and podcasts titled The Facebook Files. The newspaper claims: “Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”
The investigation unveiled a lot of worrisome practices.
For example, Facebook claims that its rules apply to everyone – but is that true?
Currently, 71% of US citizens use the social media site, but only a small percentage of them get VIP privileges. The people behind high-profile accounts enjoy different benefits, including the freedom to post whatever they want, without facing consequences.
But that’s just the tip of the iceberg.
Mental Health Issues and Life-Threatening Cases
It turns out that the social media giant is perfectly aware of how toxic one of its other platforms – Instagram – is. In fact, it has been studying the impact of the photo-sharing app on teen girls for years. Publicly, however, the company has denied any such harm.
According to the documents, Facebook is actively working to attract preteens. The current minimum age is 13, but employees are researching ways to appeal to younger users.
In addition, its efforts to improve the interactions between users actually backfired. The platform’s algorithm is capable of promoting objectionable or harmful content, making people angrier. Apparently, Facebook is working to resolve the issue.
It’s not, however, doing enough about human trafficking, pornography, abuse, drugs, and similar problems. People in developing countries are using the app to lure women into abusive jobs, sell organs, incite violence against ethnic minorities, and more. And while Facebook employees have been flagging problematic cases, the company’s response has been weak, at best.
Mark Zuckerberg might have intended to use his platform to promote the COVID-19 vaccines.
The algorithm, however, failed him. Facebook instead helped sow doubt about the virus and the vaccines. This, in turn, uncovered an uncomfortable truth – even the chief executive couldn’t control the way information spread on the app.
A spokesperson claimed that Facebook has processes for dealing with difficult challenges. The company even announced that vaccine hesitancy in the US has declined by roughly 50% since January.
Frances Haugen says she quit because Facebook continuously chooses what’s good for the company over what’s good for the public.
During the US presidential elections last year, Haugen was part of Facebook’s Civic Integrity project, whose goal was to eliminate as much misinformation as possible. Once the elections were over, however, the unit was dissolved, and according to Haugen, the platform is not fighting misinformation on other topics.
As she told 60 Minutes: “One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”
According to her, the employees are aware that “if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
Besides appearing on 60 Minutes, Haugen testified before a Senate subcommittee at a hearing on Instagram’s impact on the mental health of young people.