The longest 20 days and 20 nights: how can Trust & Safety Leaders best prepare for the US elections?


Trust and Safety leaders during the US elections: are you tired of election coverage and frenzied political discussion yet? It’s only 20 days until the US votes to elect either Kamala Harris or Donald Trump into the White House, and being a Trust and Safety professional has never been harder. Whether or not your site has anything to do with politics, it’s almost certainly coming up in comments, posts, reviews, or (worse) deepfake images or videos.

While most online companies would rather avoid politics altogether, anyone engaging a community of customers is going to get roped into ‘policing’ political content at some point. Be prepared to make tough decisions in the final weeks leading up to the election: make sure you have clear policies, that you’ve thought them through, and that you can stick to them when the going gets tough. The last thing you want is to be seen as removing customers’ content arbitrarily or with a political agenda.

Trust and Safety during US elections: Business Administration meets Philosophy

Getting your policy right is critical to handling political content on your site fairly and consistently. With polls showing that more and more Americans get their news from social media, it is more crucial now than ever that companies’ policies are unbiased and enforced uniformly and justly for everyone. These platforms serve as a virtual public square, amplifying both official and unofficial sources of information. Protecting users from misinformation and political interference is not just a nice-to-have but a must-have, given the role online companies play in shaping the world’s political future.

As the election countdown clock winds down to zero and our next US President is announced, online companies will face significant content moderation hurdles. As we have seen, misinformation is rampant: “they’re eating the dogs, they’re eating the cats”, “you can legally abort a child after it is born”, the JD Vance couch joke that turned into a trending story. Companies have important content moderation decisions to make. This is where strong policies and guidance help take personal belief out of content moderation decisions, so the right decision can be made even if the moderator does not personally agree with it.

So what? If you want to believe misinformation, that is on you, not anyone else

Where does our responsibility begin, and more importantly, where does it end? Many have argued that it is not the role of online companies to “censor” what is being said: open the floodgates, let everything through, and let people decide for themselves what is true and what is misinformation. History has shown us that this is too great a responsibility to place on individuals. With tools ranging from sophisticated deepfake and AI-generation software that can and does fool human review, to simple ones for recording podcasts, posting images, and creating quick videos, online companies must step in and take their content moderation responsibilities seriously.

When you create a platform to give people a voice, that also comes with a responsibility to ensure the platform does not give voice to those spreading misinformation, hate, or harmful content. Words have consequences, some intended and some not. During this election, we have seen unchecked comments made during the Presidential debate lead to harmful offline consequences, for example causing children to be evacuated from their schools multiple days in a row over threats of violence.

Applying the Trust and Safety recipe to prevent disaster during the US elections

Working with our customers, we’ve seen a range of common behaviours that need a Trust & Safety lens:

  • Political grandstanding in comments, reviews, or profiles where it’s not relevant
  • Debate and discussion turning into fights and threats
  • Viral misinformation spread for shock value and rage-baiting
  • Deepfakes and AI-generated content
  • Bots, rather than your community, spamming or amplifying content

What can be done? Online platforms can implement strategies such as well-developed, clear policies; third-party fact-checking; investment in both AI and human moderation; real-time content moderation; and a robust appeals process.
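To make the first of those strategies concrete, here is a minimal sketch of how a clear policy can be written down as data so that every automated check and human reviewer applies the same rules in the same way. The policy labels, actions, and the `decide` helper are hypothetical examples for illustration, not a prescribed schema or Checkstep's implementation.

```python
from dataclasses import dataclass

@dataclass
class PolicyRule:
    label: str          # category assigned by AI or a human reviewer
    action: str         # enforcement action for this category
    appealable: bool    # whether the user can contest the decision

# Illustrative policy only; a real one needs policy and legal review.
ELECTION_POLICY = {
    "political_spam": PolicyRule("political_spam", "remove", appealable=True),
    "misinformation": PolicyRule("misinformation", "label_and_downrank", appealable=True),
    "threat_of_violence": PolicyRule("threat_of_violence", "remove_and_escalate", appealable=False),
    "political_opinion": PolicyRule("political_opinion", "allow", appealable=False),
}

def decide(label: str) -> PolicyRule:
    """Look up the enforcement action for a label; unknown labels go to human review."""
    return ELECTION_POLICY.get(label, PolicyRule(label, "send_to_human_review", appealable=True))

print(decide("misinformation").action)     # label_and_downrank
print(decide("brand_new_trend").action)    # send_to_human_review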

Having a full Trust and Safety toolkit (built around your policies) is a great starting point to ensure that you’re ready to enforce your policy fairly and consistently. For Checkstep customers, we’ve seen that blending human moderation with AI scanning and automation unlocks opportunities to identify and (where necessary) remove or restrict the most harmful content in the political firestorm. Following the news and reviewing community-reported content on your platform can surface fast-moving conversations about misinformation, deepfakes, or other dangerous content. With keywords or (more effectively) with LLM labels, you can easily pick out content that references emerging trends and review it with Checkstep’s platform.
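As a rough illustration of the keyword approach mentioned above (an LLM label would play the same role, but is more robust to paraphrasing), the sketch below flags posts that reference an emerging trend so they can be queued for review. The patterns and sample posts are placeholders, not Checkstep's actual API or data.

```python
import re

# Placeholder patterns for phrases tied to an emerging misinformation trend.
TREND_PATTERNS = [
    r"\beating the (dogs|cats)\b",
    r"\bafter (it|the baby) is born\b",
]

def references_trend(text: str) -> bool:
    """Cheap first pass: does the post mention a tracked phrase?
    An LLM label would also catch paraphrases that regexes like these miss."""
    return any(re.search(p, text, flags=re.IGNORECASE) for p in TREND_PATTERNS)

posts = [
    "They're eating the dogs, they're eating the cats!",
    "Reminder: polls close at 8pm in most states.",
]
review_queue = [p for p in posts if references_trend(p)]  # route to human review, not auto-removal
print(review_queue)
```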

It’s not just about finding harmful content and protecting your community. You also need tools that help you ensure you’re enforcing your policies without bias, given the highly charged political climate. Checkstep gives you the ability to regularly run QA with secondary (or tertiary!) moderator reviews to identify areas where your AI or your moderators are overly restrictive or under-enforcing. Find the areas where your moderators don’t agree with decisions and use them to inspect your operation and your policy!
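As a back-of-the-envelope sketch of that QA loop (the sample decisions below are made up, and this is not Checkstep's actual workflow), you can compare each original decision with a second reviewer's call and see where disagreement clusters by policy area:

```python
from collections import Counter

# Hypothetical QA sample: (policy_area, original_decision, second_reviewer_decision)
qa_sample = [
    ("misinformation", "remove", "remove"),
    ("misinformation", "remove", "allow"),
    ("political_spam", "allow", "allow"),
    ("political_spam", "allow", "remove"),
    ("threats", "remove", "remove"),
]

totals, disagreements = Counter(), Counter()
for area, first, second in qa_sample:
    totals[area] += 1
    if first != second:
        disagreements[area] += 1

# High-disagreement areas point at policy wording or training that needs work.
for area, n in totals.items():
    print(f"{area}: {disagreements[area] / n:.0%} disagreement across {n} sampled decisions")
```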

Conclusion

None of this is easy, and online platforms have to do the best they can to combat and prevent this harm. Balancing users’ right to freedom of speech against the need to protect them from harm, without sliding into censorship, is a delicate dance of policy implementation and social responsibility. Online companies must get content moderation right, not just as a technical issue but as a matter of safeguarding democracy itself.

What you can control as a Trust & Safety Leader: 

  • Share this link with your colleagues to make sure everyone knows where their polling place is,
  • Fill out the form below or click here to audit your current moderation system with us, and make sure you have no policy or technology gaps.

