
Blowing the Whistle on Facebook

Wondering what all the fuss is around the Facebook Papers? Get the lowdown here.

A large trove of recently leaked documents from Meta/Facebook promises to keep the social platform in the news, and in hot water, for some time to come. Other recent “Paper” investigations (think Panama and Paradise) revealed fraud, tax evasion, and all manner of wrongdoing. It’s not clear that exposing Facebook’s internal musings will lead to the same kind of fallout, but whatever happens, this is unlikely to end well for Facebook.

Here’s what’s happening and what’s likely to come as we learn more.

What’s going on and who is behind it?

In September, The Wall Street Journal began publishing a series of unflattering stories it called the Facebook Files. The primary claim in these stories is that the social media giant has long been aware of the harm, and the potential for harm, inherent in the design choices of its platform.

In early October a former Facebook employee, Frances Haugen, appeared on CBS’s “60 Minutes,” revealing herself to be the inside source for the Wall Street Journal articles. During her interview, she put a fine point on her message, saying that Facebook’s own internal research shows that its tools “amplify hate, misinformation, and political unrest, but the company hides what it knows.” Before leaving Facebook, she copied tens of thousands of pages of internal research and other communications, such as strategy presentations and employee discussion board conversations.

Through her lawyers at Whistleblower Aid, Haugen has filed complaints with the U.S. Securities and Exchange Commission (SEC), the federal agency that enforces securities laws. The gist of the complaints is that Facebook has been misleading investors, given the gap between what it knows from its own research and what it presents to the public. She has also testified before the U.S. Congress and the U.K. Parliament and shared with them her collection of documents, redacted to hide the personal information of employees and users.

Haugen and her attorneys subsequently released the documents to selected news outlets, which are now analyzing them. CNN reported on October 26 that 17 organizations have access to the documents. The news groups we know of are The Associated Press, CNBC, CNN, Gizmodo, The Guardian, The Information, The New York Post, The New York Times, Politico, The Wall Street Journal, The Washington Post, and WIRED. More documents are expected to come out as Whistleblower Aid continues its redaction efforts.

What does this mean for Facebook?

We don’t know yet whether the SEC intends to open an investigation into Facebook over these charges. The burden will be on the SEC to show clearly that Facebook knowingly or recklessly misled investors, which will be a hard case to make. However, even without prosecution from the SEC, Facebook could have problems with investors. The New York Times reports that these revelations “worry investors like Julie Goodridge, a portfolio manager for NorthStar Asset Management. She, along with the New York State Comptroller’s Office and other investment funds, filed a motion for the next shareholder meeting calling for the removal of Mr. Zuckerberg’s power as majority voting shareholder.”

There are also signs that Facebook investors in general may not be pleased. While this may not be a lasting effect, it’s worth noting that Facebook’s share price has dropped 15% since The Wall Street Journal began its reporting, while the S&P 500 index has trended mostly upward over the same period.

It’s also possible that Haugen’s testimony to lawmakers in Britain and the U.S. will prompt them to regulate more stringently than they might have otherwise. The timing could be particularly bad from Facebook’s perspective, since the U.K. is currently debating its Online Safety Bill. In the U.S., this information comes out in an environment where there have already been calls to break up Facebook. Lawmakers might also use Facebook’s conduct to strengthen the case for changing Section 230 of the Communications Decency Act, which protects platforms like Facebook from civil lawsuits.

What is Facebook accused of exactly?

Besides the charges of misleading investors, Facebook is facing a slew of image-tarnishing revelations. Reporters continue to analyze the documents, so there may be more on the horizon, but the following is a sample of the information that has come out so far.

Facebook’s own researchers and internal studies repeatedly showed that Facebook’s features and design choices function against the public’s best interests:

  • Instagram harms teenagers, especially teenage girls, exacerbating eating disorders and increasing suicidal feelings.
  • The more a piece of content attracts comments of outrage and division, the more likely Facebook’s algorithms are to prioritize it in users’ feeds.
  • As people show interest in a topic, Facebook’s recommender algorithms will suggest more extreme versions of similar content.
  • Facebook’s incentives force even traditional media to be more polarizing, producing “darker, more divisive content.” Facebook’s 2018 change to create more “meaningful social interactions” had the opposite effect and devastated revenue for media companies.
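The ranking behavior described above can be illustrated with a toy sketch. To be clear, this is not Facebook's actual algorithm; the weights and post fields below are invented purely to show how a feed that rewards comments and reshares more heavily than likes can end up promoting the most outrage-provoking content.

```python
# Illustrative sketch of engagement-weighted feed ranking.
# All weights and field names are hypothetical, chosen only to
# demonstrate the dynamic described in the leaked research.

def engagement_score(post):
    """Score a post by weighted engagement signals (invented weights)."""
    return (1 * post["likes"]
            + 5 * post["comments"]    # comments weighted heavily...
            + 30 * post["reshares"])  # ...reshares even more

def rank_feed(posts):
    """Order posts by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

calm = {"id": "calm", "likes": 200, "comments": 10, "reshares": 2}
outrage = {"id": "outrage", "likes": 50, "comments": 120, "reshares": 40}

ranked = rank_feed([calm, outrage])
# The post drawing outrage-driven comments and reshares (score 1850)
# outranks the better-liked but quieter post (score 310).
```

Under this kind of scoring, publishers quickly learn that divisive content is what the feed rewards, which is the incentive problem the internal research describes.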

Other revelations include:

  • Facebook knew that extremist groups have been using their platform to polarize American voters and failed to take significant action.
  • Facebook acted on only 3 to 5 percent of hate speech and less than 1 percent of speech advocating violence, while at the same time claiming that its AI uncovers the vast majority of bad content.
  • Mark Zuckerberg’s public comments and congressional testimony have often been at odds with what the documents show.
  • According to the civic integrity team, Facebook has been used to fan ethnic violence in some countries.

What does Facebook say?

It should be said that some of the information you’ll hear about what Facebook knew about potential harms comes from internal employee discussions and is not necessarily official information. Facebook’s internal social network, which resembles the public one, is a place where many employees engage in conversations on a wide range of topics, expressing various points of view and individual opinions. (Note that the items listed above come from internal research studies and presentations rather than from informal employee discussions.)

Officially, Facebook says that it has already made changes to address many of the concerns revealed in these documents. In response to the original Wall Street Journal articles, Facebook said that the stories were deliberate mischaracterizations that attributed false motives to Facebook’s leadership. The company maintains that it is not responsible for the organic political divisions in the country or for the current state of the media, and that it has no commercial or moral incentive to create negative experiences for its users. It points out that it has to make difficult decisions that balance varying interests and says it has long advocated for Congress to pass updated regulations setting appropriate guidelines. Moreover, it says it makes extensive disclosures in its SEC filings about these challenges, giving investors the information they need to make informed decisions.

Was it legal for Haugen to take and publicize these documents?

Facebook can certainly challenge the legality of taking these documents, but several whistleblower protections allow employees to reveal information for the purpose of exposing wrongdoing. These protections also supersede any nondisclosure agreement that might have been in effect, provided the information released is relevant to the allegations of wrongdoing.

While much of what’s been revealed was already known or suspected by Facebook watchers, the Facebook Papers provide documentary evidence that can’t easily be explained or apologized away. We may see more evidence of real or perceived misdeeds as journalists continue to review the documents. The headlines are bound to fade as news cycles move on, but rest assured that researchers, journalists, and lawmakers will continue to discuss the implications of the information they’re finding. There seems little doubt that Facebook will have to make changes to the way it operates. Still to be seen is whether those changes will be self-imposed or imposed on Facebook by various governments around the world.

Does Haugen Have an Agenda?

There is no evidence to suggest that Haugen is insincere in her desire to expose what she considers wrongdoing at Facebook. She has testified before legislators in the U.S., the U.K., and the EU, voicing her opinion that strong regulation will be required to keep companies like Facebook in check. Speaking in Belgium, she expressed her hope that the EU’s Digital Services Act could serve as a global gold standard adopted by the rest of the world.

Updated Nov. 15, 2021 11:06:02 PST: Added the question “Does Haugen Have an Agenda?”

Updated Nov. 11, 2021 17:37:28 PST: The Information was added to the list of publications having access to the Facebook Papers.
