Tech giant Facebook Inc., recently renamed Meta, has long known about the detrimental effects of its platforms on people – including teen girls, women, minority groups, and entire societies and democracies. Despite this knowledge, the company has done very little to improve the safety of its platforms. Facebook and Instagram, both owned by Meta, continue to prioritize profit over people.
That is the finding of an investigation by The Wall Street Journal into leaked internal documents known as The Facebook Files. Whistleblower and ex-employee Frances Haugen leaked these files earlier this year. The Journal's investigative series shows that Facebook knows about the problems with its platforms, yet continues to fail to fix them.
The Facebook Files report a number of disturbing truths from inside the company.
1. Treating users unequally by “shielding” celebrities from accountability
Although Facebook says that its rules apply to everyone, internal files reveal that “Facebook routinely makes exceptions for powerful actors”. These exceptions continue to be made because Facebook values engagement and time spent on the platform (in other words, profit) over the safety of its users.
This is a privilege that many have abused. Powerful users have posted hate speech, incitement to violence and much more, and little action has been taken.
Users with large audiences may become part of the XCheck program, previously called shielding or The Whitelist. It includes celebrities, politicians and other high-profile users who have become important to the company’s revenue streams. Membership gives them the power to post almost anything on Facebook and Instagram – without the accountability imposed on other users.
Following a rape accusation, high-profile football player Neymar live-streamed a video of himself in which he revealed the woman’s name and broadcast revenge porn to his more than 100 million followers. Although the post clearly violated Facebook’s policies, it continued to go viral on the company’s platforms, reaching more than 50 million views before it was taken down. Instead of banning the account, as it would have done for a smaller user, Facebook simply deleted the post. Neymar continues to have a powerful presence on Facebook and Instagram.
Although Facebook knew that this program undermined its efforts toward fairness and legitimacy, it continues to put profit before people.
2. Knowingly harming teen girls’ mental health and doing nothing
“We made body image issues worse for 1 in 3 teen girls”. Internal documents show that Facebook’s own research uncovered the harmful effects of Instagram on teen girls’ body image. Despite this knowledge, the company kept the research from the public.
Facebook tried to address this issue with something called Project Daisy, an effort to decrease social comparison by removing the like count on posts. Results showed that the project did not improve mental health on the Instagram platform.
Although Project Daisy had no statistically significant impact on teens’ mental health, the rollout continued. Why? Because it was good PR.
Facebook has shown that its reputation, and keeping users on the platform, matters more than the mental wellbeing of its young users.
3. Allowing criminal networks and human trafficking to conduct business
The Facebook Files, released by Frances Haugen, show that the company knew its platforms were being used by criminal networks – including human traffickers. Sex trafficking and domestic servitude have been organized and conducted entirely on Facebook’s platforms. Haugen also says that Facebook has been used to incite violence against ethnic groups.
Yet, time and time again, Facebook fails to make its platforms safe for users. Young women and girls remain at high risk of trafficking – a criminal activity that has moved online over the past two years, largely because of the Covid-19 pandemic. The Facebook platform makes it easier for criminals to conduct their business, and the UN is calling on social media companies to act.
The only time Facebook took meaningful action was in 2019, when Apple learned about human trafficking on Facebook and threatened to remove the Facebook and Instagram apps from the iPhone App Store. That would have been detrimental to Facebook’s business.
Facebook took down posts, accounts and devices, launched an automated system to detect human trafficking, and pledged to go further. Even so, human trafficking continues to take place on its platforms.
When making Facebook safer conflicts with the business priorities of the company, it continues to put profit before people.
4. Creating an “Outrage Algorithm” that spreads toxic content
Facebook constantly changes the algorithm that decides what you see when you open its apps. In early 2018, Facebook made a big change to that algorithm.
“It’s not enough to just connect people, we need to make sure that those connections are positive. It’s not enough to just give people a voice. We need to make sure people aren’t using it to harm other people or spread misinformation.” – Mark Zuckerberg, founder and CEO of Facebook, at a Congressional hearing in 2018.
Zuckerberg told the public that the big algorithm change was an effort to make Facebook a better place – to improve the platform for humanity and for the mental health of its users. This did not turn out to be true.
The Wall Street Journal discovered that the change was actually made in response to a steep decline in user engagement – a crisis for the business.
The change was announced as a way to increase meaningful social interaction, and by Facebook’s own metrics it was a success. However, it failed at what the company said it would do: increasing user wellbeing. Studies showed that people were less happy with what they saw in their newsfeeds.
The change led to an increased virality of sensational posts and “the very worst kind of content”.
Internal Facebook research showed that the algorithm change was actually pushing misinformation and toxic content such as nudity, violence and hate speech. It also showed that the more a post was reshared, the more likely it was to be false.
Fixes were suggested by Facebook’s own integrity team. However, they were implemented only in places like Myanmar and Ethiopia, and only on certain topics.
The Wall Street Journal’s investigation digs much deeper into the issue of Facebook’s algorithm, and continues to show how the company is not doing enough to combat the spread of toxic content to the billions of users on its platforms.
5. Experimenting with AI while hate speech continues to spread
Although Facebook’s artificial intelligence (AI) has been failing to take down content that spreads hate, the company’s executives continue to praise it publicly.
In 2019, hate speech was the company’s most expensive problem, and Facebook decided to combat it by investing more heavily in its AI – despite the AI’s failure at finding and removing posts containing hate speech. Internal Facebook research estimated that the company’s AI is able to take down only around 2–5% of hate speech.
There is much work to be done before an AI can do what the company hopes it will. This is what a Facebook employee in the Integrity Division wrote to a colleague before leaving the team:
“AI will not save us. The implicit vision guiding much of integrity is a world where human discourse is overseen by perfect, fair, omniscient robots owned by Mark Zuckerberg. This is clearly a dystopia. It is also so deeply ingrained, we hardly notice it anymore.”
Staying true to what we believe in
Girls’ Globe cannot condone the behavior of this monopolistic company, which knowingly prioritizes profit over the safety and wellbeing of people in our world. We are a media platform that amplifies voices for gender equality, human rights and social justice, and we will take no part in supporting a company that knowingly does the opposite.
Facebook and Instagram help us stay in touch with loved ones, connect with our neighbors and mobilize changemakers. They help raise money and solve collective problems.
Given the number of people on these platforms, Facebook and Instagram may seem like the most important social media platforms to use. However, the Wall Street Journal’s Facebook Files investigation makes clear that we have little say over who sees the content we post – and that this may change whenever the company sees a risk to its business.
Facebook isn’t the only problem in our globally connected world – similar issues most likely exist on other platforms too. Yet it is a private-sector company that continues to monopolize our online spaces and fails to do its best to keep us safe.
Taking action to not use Facebook and Instagram is not an easy decision.
In the following months we will consult with our members and partners on how best to move forward.
We know that to create change, we must stay true to what we believe in. That is a world where everyone is treated equally and one that is free from discrimination and violence.
Right now, Facebook – or Meta – is not taking action to help create the world we want to see.