
Expert’s Corner with Community Building Expert Todd Nilson

Checkstep interviews an expert in online community building

Todd Nilson leads transformational technology projects for major brands and organizations. He specializes in online communities, digital workplaces, social listening analysis, competitive intelligence, game thinking, employer branding, and virtual collaboration.

Todd has managed teams and engagements with national and global consultancy firms specializing in online communities and the social psychology of building trust and engagement in digital channels.

In addition to his consulting work as the president of Clocktower Advisors, he volunteers his time with local and regional entrepreneurial programs, including The New North and Two Rivers Main Street. He is serving his second term as president of the Two Rivers Business Association, representing over 60 local businesses in Manitowoc County, and is an active Rotarian.

1. How did you first become interested in online communities? Can you tell us a little about how your work supports them?

I first became active in some online discussion forums related to gaming back around 2000, or a little before that. At the time, I was working as a recruiter in the technology sector, but even then I was getting excited about the collaboration and connection opportunities inherent in forums, early social media, and wikis. I quickly jumped on board.

Today, I work as a digital strategist specializing in building online communities. My company, Clocktower Advisors, helps organizations ranging from startups to the Fortune 100, from nonprofits to publicly traded companies, to plan, build, launch, and grow safe and thriving spaces for people to connect and do meaningful work online. You can find free resources on my website. I also run a LinkedIn Live streaming show about community building three days a week and publish a weekly newsletter about online communities.

The work begins with helping the organization’s leader to articulate their goals for starting up a community. At the same time, there’s work to do to validate whether the people you’d like to engage will be interested in the kind of online community you propose to create. Do they care enough about the purpose to get involved and stay involved? Do competing communities already exist?

These are just some of the upfront questions I try to address before aligning expectations with leaders about time-to-success, ensuring an appropriate level of funding, securing operational support in the form of a community manager and related teammates, and more. Communities online are hard to get right. But taking your time in the beginning to really think about what your organization wants and what your members want will help tilt the chances of success in your favor.

2. What do you think is the difference between enterprise-based communities and social communities? What motivates people to participate in different communities?

If I had to say something about the history of online communities, I'd say with a fair degree of confidence that social communities were the first on the scene. Communities based on an interest, a hobby, a skill, or a belief tend to appear on their own. In the early days, these were email discussion lists and forums. You'll still find them on these legacy community systems today, but now there are even more ways they express themselves: in private community platforms, mobile apps, Facebook Groups, subreddits, WhatsApp groups, and more than I could probably name. The management of these communities is loose and organic (erratic, even!), a living system, and many of them don't catch on, or they do and quickly overwhelm the volunteers who are trying to run them.

Some of the most powerful, active, and interesting online communities are social groups. The motivation to participate in them is almost entirely intrinsic, and the payoff is status, a feeling of belonging, and the ability to share one's creativity.

Enterprise-based communities, often built around a product and offering support from peers and the company itself, celebration of a brand (and the lifestyle it connotes), and encouragement of ideas for new products or services, are immediately different because there's money behind them. While there's nothing stopping a large brand from setting up open source forum software or a bulletin board, most of the early versions of community building software don't convey the right visual look and feel needed by more sophisticated brands. Enterprise communities tend to be more configurable, resemble commercial social media platforms in terms of user experience, and often need to connect with other technology run by the enterprise, like a learning management system, ideation functionality, or help desk ticketing. Because of the complexity and costs involved, enterprise-based communities tend to be more focused, but they are simultaneously a bigger gamble!

A big pitfall I see with enterprise-based communities is failing to validate and clearly communicate to members why they should take part. It's natural for the complexity and cost of enterprise online communities to become a distraction from running them well. But enterprises need to pay special attention to the motivation of members and resist the urge to offer meaningless extrinsic rewards as a way to encourage participation. Instead, focus on solving problems, celebrating customers whose lives are enriched or transformed by your product or service, and encouraging creativity and connection.

3. What kinds of challenges do platforms face to establish and maintain healthy communities?

There are a few big challenges to maintaining a healthy community that I bump into time and again:

  • Not having at least one dedicated community manager is the biggest reason communities struggle
  • Not being transparent enough with members about the purpose of the community and how their information will be used, resulting in distrust and abandonment of the community
  • Focusing too much on the platform and not enough on building trust, reciprocity, and friendships can stall communities
  • Executive leadership expectations not aligning with the time, budget, or operational effort involved with running the community
  • Leadership not understanding the value of the insights gleaned from community interaction or other tangible factors like the SEO value of member-generated content
  • The community management team not speaking the language of leadership, resulting in mismatched expectations and, eventually, reduced funding

4. Do you think there’s a gap that makes it difficult for smaller communities to actively moderate content on their platforms?

Absolutely! Smaller communities, or underfunded communities without a dedicated (paid) community manager, can become burdensome for your most active members, especially if they have to moderate posts or answer basic questions again and again. While I’m seeing a renaissance of new community platforms out there with many integrations with useful applications, the one I find myself wanting all the time is a means to automate some of the more rote activities performed by a community manager, as in the sketch below.
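To make that concrete, here is a minimal sketch (in Python) of automating one such rote task: matching new posts against a small FAQ so regulars aren't answering the same basic questions again and again. Everything in it is hypothetical; the KNOWN_ANSWERS table, the similarity threshold, and the URLs are illustrative stand-ins, and a real community platform would deliver posts through its own API or webhooks.

from difflib import SequenceMatcher

# Hypothetical FAQ: canonical question -> link to the existing answer.
KNOWN_ANSWERS = {
    "how do i reset my password": "https://example.com/faq/password-reset",
    "where can i find the code of conduct": "https://example.com/faq/code-of-conduct",
}

def suggest_answer(post_text, threshold=0.75):
    """Return a link to an existing answer if the post resembles a known question."""
    normalized = post_text.lower().strip("?! .")
    best_link, best_score = None, 0.0
    for question, link in KNOWN_ANSWERS.items():
        score = SequenceMatcher(None, normalized, question).ratio()
        if score > best_score:
            best_link, best_score = link, score
    return best_link if best_score >= threshold else None

link = suggest_answer("How do I reset my password?")
if link:
    print("Auto-reply: this question has been answered here:", link)

A fuzzy match like this won't catch every rephrasing, but even a crude first pass can deflect a surprising share of repeat questions away from volunteers.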

5. Of course, everyone supports free speech online, but do you think Section 230 reforms make sense especially in light of some platforms that might hide hate speech and disinformation campaigns behind free speech claims?

I think we need to be careful since Section 230 has been largely responsible for enabling the sorts of user-generated content hosting necessary to allow online communities to exist in the first place. But I think I understand where you’re going: that some groups wrapping themselves in language about free speech are touting hate, division, misinformation, and disinformation.

I’m not sure that we can legislate our way out of the problem without potentially making things worse. How do you strengthen protections for unpopular or underrepresented groups whilst tightly prohibiting destructive hate groups that damage public health and safety? That’s not a legal question that I feel remotely qualified to answer but it’s one that needs to be addressed!

But I do think that the solution dwells somewhere in the land of technology enhancements for moderation. The sheer human effort of flagging, monitoring, evaluating, and responding to user-generated content online simply isn’t tenable. We need tools that help us to organize, triage, and respond more efficiently to these challenges. Social media platforms in particular are hurting in a big way because of these negative posters, trolls, and bad actors and, even worse, their algorithms still seem to reward the most incendiary posts with greater visibility. That’s gotta stop.

For the smaller, more specialized online communities, I still think it’s a similar solution. Having a good code of conduct, clear and documented repercussions for bad behavior, and automation for flagging, deleting, or surfacing gray-area posts to a lone community manager is a way forward. The community managers who run private online communities are facing a big mental health issue from the sheer weight of the moderation needs of their platforms, at a different scale (but in a similar vein) from the big social media players.
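As a sketch of what that flag/delete/surface split could look like in practice: suppose some classifier returns a 0-to-1 severity score for each post (a service like Checkstep is one example, but the score here is just an input, not a real API call). The thresholds and names below are hypothetical; the point is that only the ambiguous middle band ever reaches the lone community manager.

from dataclasses import dataclass

@dataclass
class Decision:
    action: str   # "publish", "remove", or "review"
    score: float

def triage(score, publish_below=0.2, remove_above=0.9):
    """Auto-publish clear cases, auto-remove clear violations,
    and surface the gray area to a human community manager."""
    if score >= remove_above:
        return Decision("remove", score)    # documented repercussions apply
    if score <= publish_below:
        return Decision("publish", score)
    return Decision("review", score)        # lands in the manager's queue

for text, score in [("great meetup!", 0.05), ("borderline rant", 0.55), ("obvious slur", 0.97)]:
    print(text, "->", triage(score).action)

Tuning those two thresholds is how a team trades moderator workload against the risk of a bad call: widen the review band and the queue grows; narrow it and more automated mistakes slip through.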
