
Lie of the Year: Insurrection Denials Claim Top Spot

“The efforts to downplay and deny what happened are an attempt to brazenly recast reality itself.”

After twelve months of hard work debunking hundreds of false and misleading claims, the good folks at the Poynter Institute’s PolitiFact take a moment out of their normal schedule to award the Lie of the Year, and for 2021 that dubious honor goes to “the collective attempts to downplay and deny the Jan. 6 insurrection, the most serious attack on representative democracy in modern times.” As much as this one stands out as a doozy, we imagine PolitiFact had a hard time choosing just one to highlight in a year full of misinformation about COVID-19, vaccinations, elections, climate change, and the U.S. pullout from Afghanistan. They cite two important factors that put this one at the top. The first is the historical significance of the event, which truly shocked the world. The second is of particular interest to those of us in the online Trust and Safety space: the degree to which the “efforts to downplay and deny what happened are an attempt to brazenly recast reality itself.” Such an attempt seems unimaginable given that the event was so widely broadcast and recorded in so many first-person accounts. But that’s what separates your workaday casual lies from truly worthy award winners.

This year’s choice is somewhat unusual in that it’s not a single lie but a collection of falsehoods, albeit consistent in theme and motivation. It also stands out because the distortions continue. They began just seven days into the year and are far from over, as more information emerges from the ongoing hearings of the House Select Committee to Investigate the January 6th Attack. Even as the evidence mounts, and with criminal charges filed against more than 700 people, several voices continue trying to alter reality with claims that the assault was a peaceful political protest without violence, or that the government itself was responsible.

Republican House members (from left: Paul Gosar, Marjorie Taylor Greene, Matt Gaetz, and Louie Gohmert) hold a news conference on their concern for the January 6 defendants (Washington, Dec. 7, 2021). All four are among those cited by PolitiFact for denying or downplaying the reality of the Jan. 6 attack on the Capitol. (Sarah Silbiger/UPI/Shutterstock)

The PolitiFact article is a great read for understanding the actual events, the falsehoods that followed, and the reasoning behind this year’s pick. The danger of lies that deny or downplay horrific events is what makes the work of combating disinformation so important. We at Checkstep are reminded that it was only last year, and the lies of 2020, that prompted our own efforts to counter the damaging effects of online disinformation. What has really struck us this year is how significant real-world harm can result from online activity: outright deception about voting and the mischaracterization of political opponents can motivate many people to respond with violence; disinformation about the coronavirus and its treatments can prolong a devastating pandemic, with serious health and economic damage; and decades of energy and climate misinformation can hamper efforts to maintain a healthy environment — to name just a few examples.

We’re gearing up for the long haul, as we see a clear playbook for muddying the waters on important public and social issues. The counter-defense requires many of us to be involved, and it will take time, but we need to build the structures and systems that allow quality information to dominate. There will still be voices that peddle misinformation and hate, but as we make progress, those will hopefully retreat to the fringes and become less effective weapons. It’s important that we all continue the work of helping reality push back against those voices.
