We recently turned story comments back on at Philly.com. We turned off comments, for the most part, late last year because, well, our comments system sucked. It encouraged anonymity, it didn't have confirmed user registration, there were no usernames displayed—it was pretty lousy. The new system solves all those problems and is far better.
Managing comments and other forms of online community isn't rocket science, but it always seems to flummox newspapers. Thousands of online sites have been running thriving, friendly, well-behaved communities for years, but newspapers seem allergic to the idea of letting readers have their say online—and they almost invariably do it badly when they do allow reader interaction.
Anytime a newspaper has problems with comments, it doesn't take long to figure out why: It happens because the site managers allowed anonymity, or they didn't think to employ a profanity filter, or they didn't put "report abuse" buttons on the comments to let readers self-police the feature. Fail to do any one of these and you get chaos. Online community managers have known this for years. Newspapers are still learning.
It's also possible for newspaper sites to go too far, and require editorial approval of every single comment (New York Times, I'm looking at you). That's just nuts—it's an enormous resource hog and a horrible reader experience (because comments aren't posted in real time, stifling the conversation). More importantly, 99.9 percent of comments are fine if you have the proper protections in place. Sifting through all the acceptable comments to find the offensive ones is like looking for the proverbial needle in a haystack. It's just not worth it.
Somehow, I think all of the problems that newspaper sites have with implementing comments are a metaphor for the old-school newsroom thinking that's crippled the transition to a more conversational style of journalism. Or maybe it reflects a deep-seated psychological issue. Uncomfortable with the idea of letting readers participate—which challenges journalists' long-held position as the arbiters of news and information—newspaper sites unconsciously sabotage their reader interaction by failing to put simple protections in place that would make it easy to manage and create a good experience for readers and journalists alike.
The trick in running a successful comments system is to get to a point where all you have to manage is the tiny fractional percentage of contributors who are going to abuse the system (a small number always will; it's a given). You do that by putting protections into place that support the vast majority of commenters who want to behave decently, and that minimize the number of possible mischief-makers.
You have to trust the readers—something that's hard for a lot of journalists to do, I'm afraid. And you have to move away from the Victorian newsroom attitude that the world will end if something offensive appears on the site, however briefly. It's going to happen. Deal with it. Don't shut down reader interaction because you're uncomfortable with it.
At Philly.com, our new comment system employs a number of best practices to ensure that readers can interact in a pleasant, friendly environment, and that we can concentrate our efforts on managing the handful of unavoidable miscreants. These include:
• Required registration, with a confirmable e-mail address. This is fundamental—it allows you to manage individual contributors, see what they're doing, and delete them if they misbehave. Pure anonymity is a recipe for disaster. We allow readers to be anonymous online (see next entry), but we know who they are behind the scenes, with simple registration that requires an e-mail address, ZIP code, age and gender. New registrants must respond to a confirmation e-mail before they can begin commenting—a step that also slows down drive-by commenters who just want to make trouble.
• Unique usernames. This is another protection against anonymity. We suggest that readers use their real names, and some do; mostly, though, the suggestion is a subliminal effort to make it clear that posters' identities matter to us. It also lets readers become familiar with individual commenters and their personalities. (Unfortunately, if you stop anonymity completely by requiring real names, you effectively shut down comments, because most people don't want to be so publicly identified. There's a delicate balance here.)
• Profanity filter. You have to block people from using profanity and racial or ethnic slurs. Be sure to account for creative spelling. Filtering comments and rejecting them if they use a banned word is one of the simplest ways to control comments. It's amazing how many sites miss that.
• "Report abuse" buttons. Very important—this enlists the community in the effort to police what's happening in comments or on discussion boards. If something offensive appears, you'll hear about it immediately from the community, and can deal with it. This greatly reduces the need for ongoing moderation of the comments, especially if you're willing to get comfortable with something offensive appearing briefly before it can be pulled down.
• Clear, upbeat language about behavior. The Philly.com comments form says, "Philly.com comments are intended to be civil, friendly conversations. Please treat other participants with respect and in a way that you would want to be treated. You are responsible for what you say." Yeah, just words. But they set a tone and clear expectations for behavior.
• Selectivity about what stories get comments. Personally, I'd prefer that all stories have comments. And we'll get there eventually. But for a variety of reasons, we're turning on comments on a story-by-story basis. We're avoiding stories that might be racially charged, for instance. For now, the Philadelphia Inquirer newsroom is being very selective about what stories it attaches comments to; by contrast, our other paper, the Philadelphia Daily News, is adding comments to just about every one of its stories. We have that flexibility. Over time, I believe, we'll create a strong commenting community and culture that will make it clear how people should behave when responding to stories or blogs, and at that point, putting comments on every story will be easy.
• And no, we're not formally moderating comments. We keep an eye on them, but not in a systematic way. We trust the "report abuse" function to let us know if there are problems. So far, the problems have been minimal, and that's not a unique experience—the Sacramento Bee recently turned on unmoderated comments (without some of the protections we have), and lo and behold, the paper discovered that the world didn't end. "The sky has not fallen. The First Amendment remains intact. The raucous ruckus of anonymous Internet debate gets a little louder," reports Bee Public Editor Armando Acuna. "Surprisingly, few comments have been flagged." That's been our experience, too. We tested an early version of the new comments system on one of our more controversial and popular blogs, PhillyGossip, and got just a couple of abuse reports over several weeks. And that was before we turned on the full registration and username system.
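The filtering and self-policing mechanics described in the list above can be sketched roughly in code. This is a minimal illustration, not Philly.com's actual implementation; the banned-word list, the "creative spelling" character map, and the report threshold are all invented for the example.

```python
import re

# Hypothetical banned-word list; a real site would maintain a much larger,
# editorially curated list including racial and ethnic slurs.
BANNED_WORDS = {"badword", "slur"}

# Map common "creative spellings" back to plain letters before matching,
# so that "b@dw0rd" is caught as "badword".
LEET_MAP = str.maketrans({"@": "a", "0": "o", "1": "i", "3": "e",
                          "$": "s", "5": "s", "7": "t"})

def contains_banned_word(comment: str) -> bool:
    """Reject a comment if any normalized word is on the banned list."""
    normalized = comment.lower().translate(LEET_MAP)
    # Strip punctuation from each word so trailing "!" or "." can't evade
    # the filter.
    tokens = (re.sub(r"[^a-z]", "", t) for t in normalized.split())
    return any(t in BANNED_WORDS for t in tokens)

# "Report abuse" self-policing: automatically hide a comment once enough
# distinct registered users flag it, pending a human look. The threshold
# is an invented number.
REPORT_THRESHOLD = 3

class Comment:
    def __init__(self, author: str, text: str):
        if contains_banned_word(text):
            raise ValueError("comment rejected by profanity filter")
        self.author = author
        self.text = text
        self.reporters: set[str] = set()
        self.hidden = False

    def report_abuse(self, username: str) -> None:
        # A set of usernames ensures one reader can't hide a comment
        # by flagging it repeatedly.
        self.reporters.add(username)
        if len(self.reporters) >= REPORT_THRESHOLD:
            self.hidden = True
```

Note that the filter runs at submission time (rejecting the comment outright), while abuse reports act after publication, which matches the post's point that most comments post in real time and only the rare problem gets pulled down.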
What happens if offensive behavior does break out in comments or a discussion board? That's an interesting question, and the obvious answer—immediately move to remove the offensive material—may not be the only answer. Gannett's very bright VP of new media content, Jennifer Carroll, has an interesting take on this: She suggests that newspapers should dig deeper into the community attitudes that cause this behavior, rather than simply being revolted by it. It may be offensive, but it reflects a level of discourse in the community, however distasteful.
I realize this is a long post, but it's a very important topic. Comments and discussion boards are an important way to more fully engage the community in a newspaper Web site, and they can also be significant traffic drivers. Newspaper sites continue to tiptoe into this area, worrying too much about the downside, without understanding that there are things they can do to minimize problems. At Philly.com, I believe we've put in a comments system that maximizes reader interaction while minimizing the fuss over it.
Postscript:
There's more good reading on this topic here, from Howard Owens, who also believes that anonymity is anathema to successful comments and reader interaction; and here, from Mark Glaser, who's a bit too admiring of moderated comments systems like that at the New York Times site. Glaser's post also includes a good overview of Section 230 of the Communications Decency Act, which gives sites a great deal of protection against what happens in reader-contributed content. Journalists worried about libel and defamation in comments should spend some time learning about Section 230—they'll find that, somewhat counterintuitively, a hands-off policy toward managing comments provides more legal protection.
Good points, Mark.
A few critiques of the system you've implemented:
- There's no way to see all of the comments by one person. If you're focused on community building and unique identity, it'd be useful to click on a comment you like/hate and see what that person has had to say on other stories. Link those comments back to the other stories and you've got a recirculation generator.
- It seems a bit heavy-handed to require people to sign in to report abuse. You're dramatically limiting the pool of people who can help you police.
The biggest issue with comments I see (not just in the Philly.com implementation) is that the conversation dies based on the news cycle.
Would love to see someone develop a topic oriented discussion that layers in related news stories instead of just having threads on each story.
Posted by: Rocky | March 01, 2008 at 12:43 PM
Thanks, Rocky. We'll be adding the ability to see all of a member's posts in a future implementation. I think that's an important element as well. We just couldn't get it done for this round!
Posted by: Mark Potts | March 01, 2008 at 01:09 PM
Nice work, Mark (and team).
For more best practices on managing online comments successfully, check out (disclosure: I wrote this report):
http://www.naa.org/Resources/Articles/Digital-Media-Cookbook/Digital-Media-Cookbook.aspx
For what it's worth, I did not make the decision to release it in PDF form, in 3 separate sections ...
Rich Gordon
Posted by: Rich Gordon | March 03, 2008 at 01:04 PM
Mark: I believe in moderation, and I believe in moderating every.single.comment. I don't find it to be a resource hog but I certainly see how you perceive it that way. I did too, until I hired part-time moderators. I manage a team of them and yes, their job is to moderate all comments. It is also important that they do it in a timely manner so as not to stifle the conversation. I think the idea of the community policing itself is a theory. Does it happen in some places? Yes. But news sites are inherently different and we have a unique set of issues that Linux or AOL or Slashdot do not have. We have a brand that means something in the communities we serve and that's important. Hillary Schneider of Yahoo spoke valiantly about self-correction at ONA last year and how they were able to convince Pontiac to allow comments on a microsite because of this. Well, it does not work that way on news stories, and if and when it does it takes too long to come to pass. My goal here is not to hijack your blog so I will stop here. We've been there--choosing only certain stories for moderation and that was okay to a certain extent but it's been better since we opened them all. We know what to expect and some days are killer for the moderators. But you know, we are writing the rules along the way. And maybe that's what the newspaper industry finds stifling. Creating the rules as you go, as opposed to having set rules to follow from the beginning.
Posted by: Angela Connor | September 12, 2008 at 07:43 AM