
Are You Doing Moderation All Wrong?

Posted by Rachel Cook on April 13, 2016

These days, the comments section has become a standard feature on nearly every publisher website. But do Internet comments really add value, or are they just a necessary evil?


For better or for worse, online discussions have changed the way publishers interact with their audiences. In the past, publishers would produce content and readers would consume it. End of story. But with the rise of comments sections, personal blogs, and open forums, that one-way system of communication is long gone. Today, any publisher can interact with their audience, establish meaningful connections, and keep passive readers engaged longer. These changes translate to real monetary value for publishers, who can generate more revenue from highly engaged audiences.

Unfortunately, the open and diverse nature of the Internet—the very qualities that make it ideal for lively discussions—also provides opportunities for delinquents to wreak havoc on online discussions. With no guidelines for behavior, the average comments section can leave even the most experienced users drowning in a sea of trolling comments and unsavory memes.

What’s really at stake?

For many publishers, the possibility of exposing their readers and their work to Internet bullies is not worth the risk; others see it as a necessary evil. So, the question remains: is having a comments section worse than not having one?

It’s a dilemma that nearly every online publisher has wrestled with. Their choices are to either:

  1. Include a comments section and risk it being overtaken by Internet trolls.
  2. Don’t include one, but know that this comes at a price: the loss of meaningful exchanges and valuable reader engagement.

When science meets moderation

Seems pretty dismal, doesn’t it? But there’s a third option we haven’t covered yet:

  3. Include a comments section. Make it great. Don’t worry about trolls.

That last option seems too good to be true, but the founders at The Coral Project think it’s possible, and we have to agree. They recognize the value of engaging online discussions, but they understand the reality of trolling and bullying that happens online. In partnership with The Mozilla Foundation, The New York Times, and The Washington Post, The Coral Project has developed open source tools to help publishers have great conversations with their readers.

Perhaps more exciting than the technology itself is their scientific approach to analyzing and solving problems related to online moderation. They apply an academic rigor to studying topics like toxic conversations online, identifying trolls, and determining patterns of quality conversations.

Nipping the problem in the bud

The reality is that moderation in its current form is a hard thing to get right. While commenting platforms can help publishers manage online discussions, these tools aren’t nearly as effective when there’s no strategy in place. Unfortunately, that means that moderation, like most difficult tasks, often gets pushed aside as an afterthought, something to return to when there’s a better plan for management.

I once had the brilliant idea of starting an herb garden in my backyard. Within a few short weeks the mint I had planted completely demolished the other herbs and was threatening to take over the entire yard. I foolishly let it continue growing until it became unmanageable, pushing its way through the fence into my neighbor’s yard. I knew I needed to do something about it, but didn’t know where to start. I halfheartedly hacked at pieces of it, spending hours tackling the unwieldy plant, with nothing to show for it aside from dirty fingernails and a vaguely minty aura. Eventually, I couldn’t handle it myself and I ended up hiring someone else to clear the whole garden so I could start from scratch.

I knew it would be a challenge, so I kept putting it off until I reached a breaking point. But that didn’t make things any easier. In fact, it made it harder and more expensive to manage once I actually decided to act.

Online moderation can pose the same challenges for publishers, who might not always feel in control of their comments section. Often, it can seem easier to just put the comments on the backburner and leave them for another day when there might be a better solution. With this approach, however, there’s no way to win. Either you spend your days trying to suppress something that’s not working, or you acquiesce and let it take over completely.

Getting to the root of the problem

My end goal with the herb garden wasn’t to destroy the mint plant (even though it was frustrating, aggressive, and annoyingly determined). More than preventing that plant from besting me, I was trying to create a space for the rest of the plants to grow so that I could have something useful that would hopefully make my cooking a little more exciting.

That’s exactly what The Coral Project seeks to do in its approach to comment moderation. They believe that by shifting from a “stop the trolls” mentality to a “build an awesome community” mentality, publishers will attract better commenters and foster better conversations.

A better way to deal with trolls

At Disqus, we’ve spent a long time thinking about these issues. We’ve worked with millions of publishers who have helped us better understand how to build great discussions on the open and diverse Internet.

In the early days, we focused a lot on comment-level moderation by developing tools for deleting and editing comments. We quickly realized that while these tools were necessary for moderators to be successful, they would never be at the core of a larger moderation strategy. On the surface, online discussions look like they’re made up of comments. But look just a little deeper and it becomes clear that it’s not the individual comments that matter, but the commenters. It’s the people behind the comments who fuel online culture and add value to communities.

As a result, we’ve taken a more commenter-centric approach in our moderation software. We’ve introduced things like user reputation, user profiles, and trusted & banned user lists to help publishers moderate commenter-level behavior rather than getting bogged down by individual comments.
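To make the commenter-centric idea concrete, here is a minimal sketch of how moderation decisions can key off the author's standing rather than the individual comment. The class names, thresholds, and reputation arithmetic are hypothetical illustrations, not Disqus's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class Commenter:
    name: str
    reputation: float = 0.0  # rises with approved comments, falls with removals


class ModerationQueue:
    """Commenter-level triage: trusted users skip the queue,
    banned users are rejected, everyone else awaits review."""

    def __init__(self, trust_threshold: float = 5.0):
        self.trusted: set[str] = set()
        self.banned: set[str] = set()
        self.trust_threshold = trust_threshold

    def triage(self, commenter: Commenter) -> str:
        if commenter.name in self.banned:
            return "reject"
        if commenter.name in self.trusted or commenter.reputation >= self.trust_threshold:
            return "publish"  # established commenters post without delay
        return "pending"      # new or low-reputation commenters are pre-moderated

    def record_outcome(self, commenter: Commenter, approved: bool) -> None:
        # Reputation accrues across a commenter's history, so one good
        # actor's comments stop needing per-comment review over time.
        commenter.reputation += 1.0 if approved else -2.0


queue = ModerationQueue()
alice = Commenter("alice", reputation=6.0)
print(queue.triage(alice))  # publish: reputation is above the trust threshold
```

The point of the sketch is the shape of the decision, not the numbers: the triage function never inspects comment text at all, only who wrote it.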

But that’s just the beginning. These tools are the result of conversations we’ve had with thousands of publishers, but we know there’s still room to grow. So let us know: Where do you think the future of online community engagement is headed? We’re listening.
