Expert’s Corner with Community Building Expert Todd Nilson

Checkstep interviews an expert in online community building

Todd Nilson leads transformational technology projects for major brands and organizations. He specializes in online communities, digital workplaces, social listening analysis, competitive intelligence, game thinking, employer branding, and virtual collaboration.

Todd has managed teams and engagements with national and global consultancy firms specialized in online communities and the social psychology of building trust and engagement in digital channels.

In addition to his consulting work as the president of Clocktower Advisors, he volunteers his time to local and regional entrepreneurial programs including The New North and Two Rivers Main Street. He is serving his second term as president of the Two Rivers Business Association, which represents over 60 local businesses in Manitowoc County, and he is an active Rotarian.

1. How did you first become interested in online communities? Can you tell us a little about how your work supports them?

I first became active in some online discussion forums related to gaming back around 2000, or a little before that. At the time, I was working as a recruiter in the technology sector, but even then I was starting to get excited by the collaboration and connection opportunities inherent in forums, early social media, and wikis. I quickly jumped on board.

Today, I work as a digital strategist specialized in building online communities. My company, Clocktower Advisors, helps organizations ranging from startups to the Fortune 100, from nonprofits to publicly traded companies, to plan, build, launch, and grow safe and thriving spaces for people to connect and do meaningful work online. You can find free resources on my website. I also run a LinkedIn Live streaming show about community building three days a week and publish a weekly newsletter about online communities.

The work begins with helping the organization's leaders articulate their goals for starting a community. At the same time, there's work to do to validate whether the people you'd like to engage will be interested in the kind of online community you propose to create. Do they care enough about the purpose to get involved and stay involved? Do competing communities already exist?

These are just some of the upfront questions I try to address before aligning expectations with leaders about time-to-success, ensuring an appropriate level of funding, securing operational support in the form of a community manager and related teammates, and more. Communities online are hard to get right. But taking your time in the beginning to really think about what your organization wants and what your members want will help tilt the chances of success in your favor.

2. What do you think is the difference between enterprise-based communities and social communities? What motivates people to participate in different communities?

If I had to say something about the history of online communities, I'd say with a fair degree of confidence that social communities were the first on the scene. Communities based on an interest, a hobby, a skill, or a belief tend to appear on their own. In the early days, these were email discussion lists and forums. You'll still find them on these legacy community systems today, but now there are even more ways they express themselves: in private community platforms, mobile apps, Facebook Groups, subreddits, WhatsApp groups, and more than I could probably name. The management of these communities is loose and organic (erratic, even!), a living system, and many of them don't catch on, or they do and quickly overwhelm the volunteers trying to run them.

Some of the most powerful, active, and interesting online communities are social groups. The motivation to participate in them is almost entirely intrinsic, and the payoff is status, a feeling of belonging, and the ability to share one's creativity.

Enterprise-based communities are immediately different because there's money behind them. They're often built around a product, offering support from peers and the company itself, celebrating a brand (and the lifestyle it connotes), and encouraging ideas for new products or services. While there's nothing stopping a large brand from setting up open source forum software or a bulletin board, most early community building software doesn't convey the polished look and feel that more sophisticated brands need. Enterprise communities tend to be more configurable, resemble commercial social media platforms in terms of user experience, and often need to connect with other technology run by the enterprise, like a learning management system, ideation functionality, or help desk ticketing. Because of the complexity and costs involved, enterprise-based communities tend to be more focused, but they are simultaneously a bigger gamble!

A big pitfall I see with enterprise-based communities is failing to validate, and clearly communicate to members, why they should take part. The complexity and cost of enterprise online communities can easily become a distraction from that work. Enterprises need to pay special attention to the motivation of members and resist the urge to offer meaningless extrinsic rewards as a way to encourage participation. Instead, focus on solving problems, celebrating customers whose lives are enriched or transformed by your product or service, and encouraging creativity and connection.

3. What kinds of challenges do platforms face to establish and maintain healthy communities?

There are a few big challenges to maintaining a healthy community that I bump into time and again:

  • Not having at least one dedicated community manager is the biggest reason communities struggle
  • Not being transparent enough with members about the purpose of the community and how their information will be used, resulting in distrust and abandonment of the community
  • Focusing too much on the platform and not enough on building trust, reciprocity, and friendships can stall communities
  • Executive leadership expectations not aligning with the time, budget, or operational effort involved with running the community
  • Leadership not understanding the value of the insights gleaned from community interaction or other tangible factors like the SEO value of member-generated content
  • The community management team not speaking the language of leadership, resulting in mismatched expectations and, eventually, reduced funding

4. Do you think there's a gap that makes it difficult for smaller communities who actively want to moderate content on their platforms?

Absolutely! Smaller or underfunded communities without a dedicated (paid) community manager can become a burden on their most active members, especially if those members have to moderate posts or answer basic questions again and again. While I'm seeing a renaissance of new community platforms with many integrations with useful applications, the one thing I find myself wanting all the time is a way to automate some of the more rote activities a community manager performs.
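To make that concrete, here's a minimal sketch (in Python) of the kind of rote task that could be automated: matching a new post against a small FAQ so a canned answer can be suggested instead of a volunteer typing the same reply yet again. The FAQ entries, function name, and similarity threshold here are all illustrative assumptions, not features of any particular community platform.

```python
# Hypothetical sketch: suggest a canned FAQ answer for repetitive questions
# so a lone community manager (or volunteer) doesn't retype the same reply.
from difflib import SequenceMatcher

FAQ = {
    "how do i reset my password": "You can reset your password under Settings > Account.",
    "where is the code of conduct": "The code of conduct is pinned at the top of the Welcome forum.",
}

def suggest_faq_reply(post_text: str, threshold: float = 0.6) -> str | None:
    """Return a canned answer if the post closely matches a known question."""
    normalized = post_text.lower().strip("?!. ")
    best_answer, best_score = None, 0.0
    for question, answer in FAQ.items():
        score = SequenceMatcher(None, normalized, question).ratio()
        if score > best_score:
            best_answer, best_score = answer, score
    # Only auto-suggest on a strong match; anything weaker stays with a human.
    return best_answer if best_score >= threshold else None

print(suggest_faq_reply("How do I reset my password?"))  # -> the canned answer
```

Even something this simple, wired into a posting workflow, would take a surprising amount of repetitive work off a volunteer's plate.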

5. Of course, everyone supports free speech online, but do you think Section 230 reforms make sense, especially in light of platforms that might hide hate speech and disinformation campaigns behind free speech claims?

I think we need to be careful since Section 230 has been largely responsible for enabling the sorts of user-generated content hosting necessary to allow online communities to exist in the first place. But I think I understand where you’re going: that some groups wrapping themselves in language about free speech are touting hate, division, misinformation, and disinformation.

I’m not sure that we can legislate our way out of the problem without potentially making things worse. How do you strengthen protections for unpopular or underrepresented groups whilst tightly prohibiting destructive hate groups that damage public health and safety? That’s not a legal question that I feel remotely qualified to answer but it’s one that needs to be addressed!

But I do think that the solution dwells somewhere in the land of technology enhancements for moderation. The sheer human effort of flagging, monitoring, evaluating, and responding to user-generated content online simply isn’t tenable. We need tools that help us to organize, triage, and respond more efficiently to these challenges. Social media platforms in particular are hurting in a big way because of these negative posters, trolls, and bad actors and, even worse, their algorithms still seem to reward the most incendiary posts with greater visibility. That’s gotta stop.

For the smaller, more specialized online communities, I think the solution is similar. Having a good code of conduct, clear and documented repercussions for bad behavior, and automation for flagging, deleting, or surfacing gray-area posts to a lone community manager is a way forward. The community managers who run private online communities are facing a big mental health issue from the sheer weight of their platforms' moderation needs, on a different scale from (but in a similar vein to) the big social media players.
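As a rough illustration of that flagging, deleting, and surfacing flow, here's a short Python sketch. The thresholds and mock scores are illustrative assumptions; in practice the score would come from whatever classifier or moderation API a platform uses.

```python
# Sketch of the triage flow described above: clear violations are removed
# automatically, gray-area posts go to the community manager's review queue,
# and everything else is approved. Thresholds and scores are assumptions.
from dataclasses import dataclass, field

REMOVE_THRESHOLD = 0.9  # near-certain violations: remove and log
REVIEW_THRESHOLD = 0.5  # gray area: surface to a human

@dataclass
class ModerationQueues:
    removed: list[str] = field(default_factory=list)
    needs_review: list[str] = field(default_factory=list)
    approved: list[str] = field(default_factory=list)

def triage(post: str, violation_score: float, queues: ModerationQueues) -> None:
    """Route a post using a 0-1 'likely violation' score from a classifier."""
    if violation_score >= REMOVE_THRESHOLD:
        queues.removed.append(post)       # delete automatically, keep for audit
    elif violation_score >= REVIEW_THRESHOLD:
        queues.needs_review.append(post)  # only these reach the lone manager
    else:
        queues.approved.append(post)      # publish normally

# Example with mock scores standing in for a real model's output:
queues = ModerationQueues()
for text, score in [("Great meetup!", 0.05), ("Borderline insult", 0.65), ("Targeted abuse", 0.97)]:
    triage(text, score, queues)
print(len(queues.needs_review))  # -> 1 post surfaced for human review
```

The middle band is the key: automation absorbs the obvious cases so the manager's limited attention goes only to genuine judgment calls.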
