
Blowing the Whistle on Facebook

Wondering what all the fuss is around the Facebook Papers? Get the lowdown here.

A large trove of leaked internal documents from Meta/Facebook promises to keep the social platform in the news, and in hot water, for some time to come. Other recent “Papers” investigations (think Panama and Paradise) revealed fraud, tax evasion, and all manner of wrongdoing. It’s not clear that exposing Facebook’s internal deliberations will lead to the same kind of fallout, but however this plays out, it is not likely to end well for Facebook.

Here’s what’s happening and what’s likely to come as we learn more.

What’s going on and who is behind it?

In September, The Wall Street Journal began publishing a series of unflattering stories it called the Facebook Files. The primary claim in these stories is that the social media giant has for some time been aware of the harm, and the potential for harm, inherent in its platform’s design choices.

In early October, a former Facebook employee, Frances Haugen, appeared on CBS’s “60 Minutes” and revealed herself to be the inside source for the Wall Street Journal articles. During her interview, she put a fine point on her message, saying that Facebook’s own internal research shows that its tools “amplify hate, misinformation, and political unrest, but the company hides what it knows.” Before leaving Facebook, she copied tens of thousands of pages of internal research and other communications, including strategy presentations and employee discussion board conversations.

Through her lawyers at Whistleblower Aid, Haugen has filed complaints with the U.S. Securities and Exchange Commission (SEC), the government body that enforces securities laws. The gist of the complaints is that Facebook has been misleading investors, given the gap between what its own research shows and what it presents to the public. She has also testified before the U.S. Congress and the U.K. Parliament and shared her collection of documents, redacted to hide the personal information of employees and users, with them.

Haugen and her attorneys subsequently released the documents to selected news outlets, which are now analyzing them. CNN reported on October 26 that 17 organizations have access to the documents. The news groups we know of are The Associated Press, CNBC, CNN, Gizmodo, The Guardian, The Information, The New York Post, The New York Times, Politico, The Wall Street Journal, The Washington Post, and WIRED. More documents are expected to come out as Whistleblower Aid continues its redaction efforts.

What does this mean for Facebook?

We don’t yet know whether the SEC intends to open an investigation into Facebook over these charges. The burden will be on the SEC to show clearly that Facebook willfully or recklessly misled investors, which will be a hard case to make. Even without SEC prosecution, however, Facebook could have problems with investors. The New York Times reports that these revelations “worry investors like Julie Goodridge, a portfolio manager for NorthStar Asset Management. She, along with the New York State Comptroller’s Office and other investment funds, filed a motion for the next shareholder meeting calling for the removal of Mr. Zuckerberg’s power as majority voting shareholder.”

There are also signs that Facebook investors more broadly may not be pleased. While the effect may not last, it’s worth noting that Facebook’s share price has dropped 15% since The Wall Street Journal began its reporting, while the S&P index has trended mostly upward over the same period.

It’s also possible that Haugen’s testimony in Britain and the U.S. will push lawmakers to regulate more stringently than they otherwise might have. The timing is particularly bad from Facebook’s perspective, since the U.K. is currently debating the Online Harms Bill. In the U.S., this information comes out in an environment where there have already been calls to break up Facebook. Lawmakers might also use Facebook’s conduct to strengthen their case for changing Section 230 of the Communications Decency Act, which protects platforms like Facebook from civil lawsuits.

What is Facebook accused of exactly?

Besides the charge of misleading investors, Facebook is facing a slew of image-tarnishing revelations. Reporters continue to analyze the documents, so there may be more on the horizon, but the following is a sample of what has come out so far.

Facebook’s own researchers and internal studies repeatedly showed that Facebook’s features and design choices function against the public’s best interests:

  • Instagram harms teenagers, especially teenage girls, exacerbating eating disorders and increasing suicidal feelings in teens.
  • The more a piece of content attracts comments of outrage and division, the more likely Facebook’s algorithms are to prioritize it in users’ feeds.
  • As people show interest in a topic, Facebook’s recommender algorithms will suggest more extreme versions of similar content.
  • Facebook’s incentives push even traditional media outlets toward more polarizing, “darker, more divisive content.” Facebook’s 2018 algorithm change, intended to create more “meaningful social interactions,” had the opposite effect and devastated revenue for media companies.

Other revelations include:

  • Facebook knew that extremist groups were using its platform to polarize American voters and failed to take significant action.
  • Facebook acted on only 3 to 5 percent of hate speech and less than 1 percent of speech advocating violence, while at the same time claiming that its AI uncovers the vast majority of bad content.
  • Mark Zuckerberg’s public comments and congressional testimony have often been at odds with what the documents show.
  • According to the civic integrity team, Facebook has been used to fan ethnic violence in some countries.

What does Facebook say?

It should be said that some of what you’ll hear about Facebook’s knowledge of potential harms comes from internal employee discussions and is not necessarily official information. Facebook’s internal social network, which resembles the public one, is a place where many employees discuss a wide range of topics and express individual points of view. (Note that the items listed above come from internal research studies or presentations rather than from informal employee discussions.)

Officially, Facebook says that they have already made changes to address many of the concerns revealed in these documents. In response to the original Wall Street Journal articles, Facebook said the stories were deliberate mischaracterizations that attributed false motives to Facebook’s leadership. They maintain that the company is not responsible for the country’s organic political divisions or for the current state of the media, and that they have no commercial or moral incentive to create negative experiences for their users. They point out that the company has to make difficult decisions balancing competing interests and has long advocated for Congress to pass updated regulations setting appropriate guidelines. Moreover, they say they make extensive disclosures in their SEC filings about these challenges, giving investors the information they need to make informed decisions.

Was it legal for Haugen to take and publicize these documents?

Facebook can certainly challenge the legality of taking these documents, but several whistleblower protections allow employees to reveal information for the purpose of exposing wrongdoing. These protections also supersede any nondisclosure agreement that might have been in effect, provided the information released is relevant to the allegations of wrongdoing.

While much of what’s been revealed was already known or suspected by Facebook watchers, the Facebook Papers provide documentary evidence that can’t easily be explained or apologized away. We may see more evidence of real or perceived misdeeds as journalists continue to review the documents. The headlines are bound to fade as news cycles move on, but rest assured that researchers, journalists, and lawmakers will keep discussing the implications of what they’re finding. There seems little doubt that Facebook will have to change the way they operate. Still to be seen is whether those changes will be self-imposed or forced on Facebook by various governments around the world.

Does Haugen Have an Agenda?

So far, nothing suggests that Haugen is anything but sincere in her desire to expose what she considers wrongdoing at Facebook. She has testified before legislators in the U.S., the U.K., and the EU, voicing her opinion that strong regulations will be required to keep companies like Facebook in check. In Belgium, she expressed her hope that the EU’s Digital Services Act could serve as a global gold standard for the rest of the world to adopt.

Updated Nov. 15, 2021 11:06:02 PST: Added the question “Does Haugen Have an Agenda?”

Updated Nov. 11, 2021 17:37:28 PST: The Information was added to the list of publications having access to the Facebook Papers.
