Expert’s Corner with Community Building Expert Todd Nilson

Checkstep interviews expert in online community building

Todd Nilson leads transformational technology projects for major brands and organizations. He specializes in online communities, digital workplaces, social listening analysis, competitive intelligence, game thinking, employer branding, and virtual collaboration.

Todd has managed teams and engagements with national and global consultancy firms specialized in online communities and the social psychology of building trust and engagement in digital channels.

In addition to his consulting work as the president of Clocktower Advisors, he volunteers his time with local and regional entrepreneurial programs, including The New North and Two Rivers Main Street. He is serving his second term as president of the Two Rivers Business Association, representing over 60 local businesses in Manitowoc County, and is an active Rotarian.

1. How did you first become interested in online communities? Can you tell us a little about how your work supports them?

I first became active in some online discussion forums related to gaming back around 2000 or a little before that. At the time, I was working as a recruiter in the technology sector but I was even then starting to get excited by the collaboration and connection opportunities inherent in forums, early social media, and wikis. I quickly jumped on board.

Today, I work as a digital strategist specialized in building online communities. My company, Clocktower Advisors, helps organizations ranging from startups to the Fortune 100, from nonprofits to publicly traded companies, to plan, build, launch, and grow safe and thriving spaces for people to connect and do meaningful work online. You can find free resources on my website. I also run a LinkedIn Live streaming show about community building three days a week and publish a weekly newsletter about online communities.

The work begins with helping the organization’s leader to articulate their goals for starting up a community. At the same time, there’s work to do to validate whether the people you’d like to engage will be interested in the kind of online community you propose to create. Do they care enough about the purpose to get involved and stay involved? Do competing communities already exist?

These are just some of the upfront questions I try to address before aligning the expectations with leaders about the time-to-success, ensuring an appropriate level of funding, operational support in the form of a community manager and related teammates, and more. Communities online are hard to get right. But taking your time in the beginning to really think about what your organization wants and what your members want will help tilt the chances for success in your favor.

2. What do you think is the difference between enterprise-based communities and social communities? What motivates people to participate in different communities?

If I had to say something about the history of online communities, I'd say with a fair degree of confidence that social communities were the first thing on the scene. Communities based on an interest, a hobby, a skill, or a belief tend to appear on their own. In the early days, these were email discussion lists and forums. You'll still find them on these legacy community systems today, but now there are even more ways they express themselves: in private community platforms, mobile apps, Facebook Groups, subreddits, WhatsApp groups, and more than I could probably name. The management of these communities is loose and organic (erratic, even!), a living system, and many of them don't catch on, or they do and quickly overwhelm the volunteers who are trying to run them.

Some of the most powerful, active, and interesting online communities are social groups. The motivation to participate in them is almost entirely intrinsic and the payoff for belonging is status, a feeling of belonging, and the ability to share one’s creativity.

Enterprise-based communities, often based on a product and offering support from peers and the company itself, celebration of a brand (and the lifestyle it connotes), and encouragement of ideas for new products or services, are immediately different because there's money behind them. While there's nothing stopping a large brand from setting up open-source forum software or a bulletin board, most of the early versions of community-building software don't convey the right visual look and feel needed by more sophisticated brands. Enterprise communities tend to be more configurable, resemble commercial social media platforms in terms of user experience, and often need to connect with other technology run by the enterprise, like a learning management system, ideation functionality, or help desk ticketing. Because of the complexity and costs involved, enterprise-based communities tend to be more focused, but they are simultaneously a bigger gamble!

A big pitfall I see with enterprise-based communities is failing to validate and clearly communicate to members why they should take part. It's natural for the complexity and cost of enterprise online communities to become a distraction from actually running them. But enterprises need to pay special attention to the motivation of members and resist the urge to offer meaningless extrinsic rewards as a way to encourage participation. Instead, focus on solving problems, celebrating customers whose lives are enriched or transformed by your product or service, and encouraging creativity and connection.

3. What kinds of challenges do platforms face to establish and maintain healthy communities?

There are a few big challenges to maintaining a healthy community that I bump into time and again:

  • Not having at least one dedicated community manager is the biggest reason communities struggle
  • Not being transparent enough with members about the purpose of the community and how their information will be used, resulting in distrust and abandonment of the community
  • Focusing too much on the platform and not enough on building trust, reciprocity, and friendships can stall communities
  • Executive leadership expectations not aligning with the time, budget, or operational effort involved with running the community
  • Leadership not understanding the value of the insights gleaned from community interaction or other tangible factors like the SEO value of member-generated content
  • The community management team not speaking the language of leadership, resulting in mismatched expectations and, eventually, reduced funding

4. Do you think there’s a gap that makes it difficult for smaller communities to want to actively moderate content on their platforms?

Absolutely! Smaller or underfunded communities without a dedicated (paid) community manager can become a burden on their most active members, especially if those members have to moderate posts or answer basic questions again and again. While I’m seeing a renaissance of new community platforms out there with many integrations with useful applications, the one thing I find myself wanting all the time is a means to automate some of the more rote activities performed by a community manager.

5. Of course, everyone supports free speech online, but do you think Section 230 reforms make sense especially in light of some platforms that might hide hate speech and disinformation campaigns behind free speech claims?

I think we need to be careful since Section 230 has been largely responsible for enabling the sorts of user-generated content hosting necessary to allow online communities to exist in the first place. But I think I understand where you’re going: that some groups wrapping themselves in language about free speech are touting hate, division, misinformation, and disinformation.

I’m not sure that we can legislate our way out of the problem without potentially making things worse. How do you strengthen protections for unpopular or underrepresented groups whilst tightly prohibiting destructive hate groups that damage public health and safety? That’s not a legal question that I feel remotely qualified to answer but it’s one that needs to be addressed!

But I do think that the solution dwells somewhere in the land of technology enhancements for moderation. The sheer human effort of flagging, monitoring, evaluating, and responding to user-generated content online simply isn’t tenable. We need tools that help us to organize, triage, and respond more efficiently to these challenges. Social media platforms in particular are hurting in a big way because of these negative posters, trolls, and bad actors and, even worse, their algorithms still seem to reward the most incendiary posts with greater visibility. That’s gotta stop.

For the smaller, more specialized online communities, I still think it’s a similar solution. Having a good code of conduct, clear and documented repercussions for bad behavior, and automation for flagging, deleting, or surfacing gray-area posts to a lone community manager is a way forward. The community managers who run private online communities are facing a serious mental health burden from the sheer weight of their platforms’ moderation needs, on a different scale from (but in a similar vein to) the big social media players.
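
To make that last point concrete, here is a minimal sketch of what such triage automation might look like. The keyword-based scorer, thresholds, and function names are illustrative assumptions rather than any particular platform’s API; in practice you would plug in whatever moderation model or service your community platform supports.

```python
# Minimal sketch of triage automation for a lone community manager.
# The scorer, thresholds, and names below are hypothetical placeholders,
# not a real platform's API.
from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    PUBLISH = "publish"  # clearly fine: no human attention needed
    REVIEW = "review"    # gray area: surface to the community manager
    REMOVE = "remove"    # clearly violating: remove automatically


@dataclass
class Post:
    author: str
    text: str


def score_toxicity(post: Post) -> float:
    """Toy stand-in for a real moderation model or API returning a 0.0-1.0 score."""
    flagged_terms = {"spam", "scam", "hate"}  # illustrative only
    hits = sum(1 for word in post.text.lower().split() if word in flagged_terms)
    return min(1.0, hits / 3)


def triage(post: Post, remove_above: float = 0.9, review_above: float = 0.6) -> Action:
    """Route a post so only gray-area cases land in the manager's queue."""
    score = score_toxicity(post)
    if score >= remove_above:
        return Action.REMOVE
    if score >= review_above:
        return Action.REVIEW
    return Action.PUBLISH


if __name__ == "__main__":
    print(triage(Post(author="alice", text="Great tips, thanks for sharing!")))
```

The point of the sketch is the routing, not the scoring: whatever classifier sits behind it, sending only the uncertain middle band to a human keeps the moderation workload manageable for a small team.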
