
Blowing the Whistle on Facebook

Wondering what all the fuss around the Facebook Papers is about? Get the lowdown here.

A large trove of recently leaked documents from Meta/Facebook promises to keep the social platform in the news, and in hot water, for some time to come. Other recent “Paper” investigations (think Panama and Paradise) revealed fraud, tax evasion, and all manner of wrongdoing; it’s not clear that exposing Facebook’s own internal deliberations will lead to the same kind of fallout. Whatever happens, though, this is unlikely to end well for Facebook.

Here’s what’s happening and what’s likely to come as we learn more.

What’s going on and who is behind it?

In September, The Wall Street Journal began publishing a series of unflattering stories it called the Facebook Files. The primary claim in these stories is that the social media giant has for some time been aware of the harm, and the potential for harm, inherent in its platform’s design choices.

In early October, a former Facebook employee, Frances Haugen, appeared on CBS’s “60 Minutes” and revealed herself to be the inside source for the Wall Street Journal articles. During her interview, she put a fine point on her message, saying that Facebook’s own internal research shows that its tools “amplify hate, misinformation, and political unrest, but the company hides what it knows.” Before leaving Facebook, she copied tens of thousands of pages of internal research and other communications, such as strategy presentations and employee discussion board conversations.

Through her lawyers at Whistleblower Aid, Haugen has filed complaints with the U.S. Securities and Exchange Commission (SEC), the government body that enforces securities laws. The gist of the complaints is that Facebook has misled investors by presenting a public picture at odds with what its own research shows. She has also testified before the U.S. Congress and the U.K. Parliament and shared with them her collection of documents, redacted to hide the personal information of employees and users.

Haugen and her attorneys subsequently released the documents to selected news outlets, which are analyzing them. CNN reported on October 26 that 17 organizations now have access to the documents. The news groups we know of are The Associated Press, CNBC, CNN, Gizmodo, The Guardian, The Information, The New York Post, The New York Times, Politico, The Wall Street Journal, The Washington Post, and WIRED. More documents are expected to come out as Whistleblower Aid continues its redaction efforts.

What does this mean for Facebook?

We don’t know yet whether the SEC intends to open an investigation into Facebook over these charges. The burden will be on the SEC to show clearly that Facebook knowingly or recklessly misled investors, which will be a hard case to make. Even without action from the SEC, however, Facebook could have problems with investors. The New York Times reports that these revelations “worry investors like Julie Goodridge, a portfolio manager for NorthStar Asset Management. She, along with the New York State Comptroller’s Office and other investment funds, filed a motion for the next shareholder meeting calling for the removal of Mr. Zuckerberg’s power as majority voting shareholder.”

There are also signs that Facebook investors in general may not be pleased. The effect may not last, but it’s worth noting that Facebook’s share price has dropped 15% since The Wall Street Journal started its reporting, while the S&P 500 has trended mostly upward over the same period.

It’s also possible that Haugen’s testimony in Britain and the U.S. will push lawmakers to regulate more stringently than they otherwise might have. This could be particularly bad timing from Facebook’s perspective, since the U.K. is currently debating the Online Harms Bill. In the U.S., the information arrives in an environment where there have already been calls to break up Facebook. Lawmakers might also use Facebook’s conduct to strengthen the case for changing Section 230 of the Communications Decency Act, which shields platforms like Facebook from liability for content posted by their users.

What is Facebook accused of exactly?

Besides the charge of misleading investors, Facebook is facing a slew of image-tarnishing revelations. Reporters continue to analyze the documents, so there may be more on the horizon, but the following is a sample of what has come out so far.

Facebook’s own researchers and internal studies repeatedly showed that Facebook’s features and design choices function against the public’s best interests:

  • Instagram harms teenagers, especially teenage girls, exacerbating eating disorders and increasing suicidal feelings in teens.
  • The more a piece of content attracts comments of outrage and division, the more likely Facebook’s algorithms are to prioritize it in users’ feeds (the sketch after this list illustrates why engagement-weighted ranking behaves this way).
  • As people show interest in a topic, Facebook’s recommender algorithms will suggest more extreme versions of similar content.
  • Facebook’s incentives push even traditional media to be more polarizing, producing “darker, more divisive content.” Facebook’s 2018 change intended to create more “meaningful social interactions” had the opposite effect and devastated revenue for media companies.
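To see why engagement-weighted ranking tends to favor divisive material, here is a simplified, hypothetical sketch: if comments and reshares count for far more than likes, a post that provokes argument will outrank a calmer one regardless of its quality. The weights, field names, and example posts below are invented for illustration and do not reflect Facebook’s actual ranking formula.

```python
# A simplified, hypothetical sketch of engagement-weighted feed ranking.
# Weights and example data are invented; this is NOT Facebook's formula.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    likes: int
    comments: int   # divisive posts tend to provoke many comments
    reshares: int


def engagement_score(post: Post) -> float:
    # Comments and reshares are weighted far more heavily than likes,
    # so content that provokes argument rises to the top of the feed.
    return 1.0 * post.likes + 15.0 * post.comments + 30.0 * post.reshares


posts = [
    Post("Calm local news update", likes=500, comments=20, reshares=10),
    Post("Outrage-bait political post", likes=200, comments=400, reshares=150),
]

# Rank the feed: the divisive post wins despite getting fewer likes.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>8.0f}  {post.title}")
```

Even this toy model shows how optimizing purely for engagement can reward whichever content generates the most conflict.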

Other revelations include:

  • Facebook knew that extremist groups were using its platform to polarize American voters and failed to take significant action.
  • Facebook acted on only 3 to 5 percent of hate speech and less than 1 percent of speech advocating violence, while claiming that its AI has been uncovering the vast majority of bad content.
  • Mark Zuckerberg’s public comments and congressional testimony have often been at odds with what the documents show.
  • According to the civic integrity team, Facebook has been used to fan ethnic violence in some countries.

What does Facebook say?

It should be said that some of what you’ll hear about Facebook’s knowledge of potential harms comes from internal employee discussions and is not necessarily official information. Facebook’s internal social network, which resembles the public one, is a place where many employees converse on a wide range of topics, expressing various points of view and individual opinions. (Note that the items listed above come from internal research studies or presentations rather than informal employee discussions.)

Officially, Facebook says they have already made changes to address many of the concerns revealed in these documents. In response to the original Wall Street Journal articles, Facebook said the stories were deliberate mischaracterizations that attributed false motives to the company’s leadership. They maintain that the company is not responsible for the organic political divisions in the country, nor for the current state of the media, and that they have no commercial or moral incentive to create negative experiences for their users. They point out that the company has to make difficult decisions balancing competing interests, and they have long advocated for Congress to pass updated regulations setting appropriate guidelines. Moreover, they say they make extensive disclosures in their SEC filings about these challenges, giving investors the information they need to make informed decisions.

Was it legal for Haugen to take and publicize these documents?

Facebook can certainly challenge the legality of taking these documents, but several whistleblower protections allow employees to reveal information for the purpose of exposing wrongdoing. Those protections also supersede any nondisclosure agreement that might have been in effect, provided the information released is relevant to the allegations of wrongdoing.

While much of what’s been revealed was already known or suspected by Facebook watchers, the Facebook Papers provide documentary evidence that can’t easily be explained or apologized away. We may see more evidence of real or perceived misdeeds as journalists continue to review the documents. The headlines are bound to fade as news cycles move on, but rest assured that researchers, journalists, and lawmakers will continue to discuss the implications of what they’re finding. There seems little doubt that Facebook will have to change the way they operate. Still to be seen is whether those changes will come from within or be forced on Facebook by governments around the world.

Does Haugen Have an Agenda?

There is no evidence to suggest that Haugen is anything but sincere in her desire to expose what she considers wrongdoing at Facebook. She has testified to legislatures in the U.S., the U.K., and the EU, voicing her opinion that strong regulation will be required to keep companies like Facebook in check. In Belgium, she expressed her hope that the EU’s Digital Services Act could serve as a global gold standard for the rest of the world to adopt.

Updated Nov. 15, 2021 11:06:02 PST: Added the question “Does Haugen Have an Agenda?”

Updated Nov. 11, 2021 17:37:28 PST: The Information was added to the list of publications having access to the Facebook Papers.
