
Misinformation could decide the US Presidential Election

In 1993, the United Nations declared May 3 World Press Freedom Day, recognizing that a free press is critical to a healthy, functioning democracy. According to the UN, media freedom and individual access to information are key to empowering people with control over their own lives. The global population largely seems to agree: the Pew Research Center reports that majorities across Europe, North America, and Latin America increasingly rate freedom of the press as very important. At the same time, the Pew Center has been tracking the declining business of US newspapers, documenting significant reductions in the number of employees, a figure that has dropped by half since 2008. Most people now get their news from online sources and, importantly for this discussion, social media sites.

Two-thirds of Americans (67%) get at least some of their news on social media. For young adults, social media is the most common way to get political news. That’s not necessarily a bad thing, but those who mainly get their news that way are less likely to have their facts correct and are more likely to be exposed to false claims.

News on social media presents a different landscape compared to traditional sources. User-generated content with freewheeling discussion is rife with pseudo-facts or outright misinformation.

As I write this, the United States is only days away from a presidential election. This year in particular illustrates just how high the stakes are for quality information online. The Transition Integrity Project recently conducted a series of ‘war games’, using seasoned experts as stand-ins for the real candidates’ teams and asking them to imagine how they would respond to various election outcome scenarios. If their modeling holds true, this could be a particularly chaotic and perhaps dangerous time for the country. Only one of the situations they considered leads to an orderly transfer of power following the election — a landslide victory by one of the candidates.

Professor Edward B. Foley at Ohio State University has also been thinking about what happens after election day. He writes in the Loyola University Chicago Law Journal that without a clear victory, the results are very likely to be disputed and almost certainly won’t be certified until days after election night. He explains that what happens between the election and the seating of the next president largely depends on how the population reads the situation, and that could be the deciding factor.

His article describes a hypothetical but plausible scenario in which, following the election, neither candidate concedes defeat, leading each side to launch bitter legal and legislative cases to be decided in its favor. How these disputes are resolved will likely depend on the claims the parties make, claims that may or may not be grounded in factual information. If resolution falls to one or more state legislatures or Congress, in other words to elected officials, people’s perceptions of the situation will matter.

Unless and until we are in the midst of the situation itself, we can only speculate about the kinds of allegations that might be raised in an effort to cast doubt on overtime votes counted during the canvass. Presumably, provisional ballots would be attacked as ineligible for counting, as would any absentee ballots not previously counted, because when a candidate is ahead and attempting to preserve a lead, the goal is to shut down the counting process as much as possible.

A trailing candidate can try to cast doubt on the legitimacy of the election. One effective way to do that is to fill media and online outlets with pseudo-facts or outright false information. In any scenario with a close count, misinformation and deliberate muddying of the waters are likely to be wielded as weapons in post-election fighting. The fact is that elections are run by people, and people make mistakes, especially in chaotic times like the ones we’re living in now. Honest errors with minimal actual impact are too easily contorted into cries of a rigged election. It’s not necessary to fully convince anyone of dubious claims; it’s enough to sow sufficient uncertainty that judges, representatives, and citizens are not sure what to believe. This might sound like an unlikely and dark point of view, but we’ve already seen how dangerous misinformation can be.

Almost since the novel coronavirus first hit the news, there has been a troubling amount of misinformation. To be fair, at the beginning of an outbreak it’s hard to know what’s correct when so much is genuinely unknown. But that doesn’t explain the abundance of mis- and disinformation, some of it seemingly designed to cause harm. Social media sites, where so many people now get their news, have proven to be highly effective misinformation networks. The Avaaz organization published a report on the significant threat posed by Facebook’s algorithm in spreading bad information. The actors behind the spread of global health misinformation are largely not political; they do it for the money. However, 39% of misinformation-spreading websites indicate some political affiliation. The outcome of the US presidential election offers both profit and very strong political motivation to launch misinformation campaigns.

The WHO has been monitoring both the pandemic and its associated ‘infodemic’, which it defines as “an over-abundance of information — some accurate and some not — that makes it hard for people to find trustworthy sources and reliable guidance when they need it.” There is little doubt that the world will recover from the pandemic. With luck and a strong democratic process, the United States may not even face its own infodemic crisis related to the election outcome. But if it does, it can be managed and its harms minimized. Detailed, timely, quality information is the best antidote, and many people and organizations will be working hard to make sure it’s widely available.

At Checkstep, we’re keenly aware of the need for accurate information to inform online discussions. We’re developing technology that helps promote a productive interchange of ideas online. With the world facing fraught political situations, a global pandemic, and increasing climate effects, today’s most important issues are literally matters of life and death. Our technology is designed to understand the origin, the impact, and the beneficiaries of any piece of information produced on the internet. We all have a responsibility to understand the full context of the issues we as a population are tasked with deciding.
