Lara Logan is shown in Tahrir Square on the day she was attacked by a mob. (CBS News )
Remember when you threw that raging party in high school? Being at the center of the action felt so good — until a couple of guests parked on your lawn. And someone emptied your dad's liquor cabinet. And an unknown visitor burned a hole in the sofa.
That's a bit like the dynamic confronting big news outlets that bring crowds of people together online. All media love the traffic and engagement that Internet comment boards draw — that is, until too many louts crash the party, murdering the communal vibe and driving off the cool guests.
Comment board cretins went into overdrive again last week, leaving their ugly mark beneath many news stories about the attack in Egypt on CBS correspondent Lara Logan. Some of the knuckle-walkers insisted on blaming the victim, saying she was too blond or too female to be in a danger zone. Others sought to mete out their digital revenge by blaming all Muslims for the attack on Logan.
The ugly response to the Tahrir Square assault renewed the debate over how much latitude to grant the public when it comments on news stories online, prompting commentaries by the reader's representative at the Los Angeles Times and the ombudsman for National Public Radio.
NPR intends in March to move to "more tightly moderated comments, in some cases before they are posted," ombudsman Alicia Shepard wrote last week. Martin Beck, reader engagement editor for latimes.com, told me the newsroom's website would like to find a better way to manage conversations that too often "get out of control or ugly."
It seems long past time for reputable news sites to clamp down on the gutter talk. Otherwise the open-door policy at npr.org, latimes.com and many other sites drives down the quality of the conversation and alienates the kind of thoughtful guests who make the party worth attending in the first place.
Why not go further and require online guests to post comments under their real names, as newspapers have required of letter writers for decades?
The conversation underscores the wild pendulum swings roiling most big media outlets. Not so long ago, readers had very few opportunities to have their say. But the birth of blogging and of social media sites like Facebook unleashed a pent-up demand by the masses to be loud and proud.
Rushing to embrace the new ethic, news outlets began to open the gates on their "walled gardens." They wanted to expand audiences and keep them engaged longer — measurable metrics that also can pay off in higher ad revenue.
The conundrum for traditional news sites has been how to combine mass and class: promoting the best of a free-wheeling "Webbiness" while maintaining the sort of thoughtful forum their readers had come to expect.
Many news sites, including latimes.com, began by having reporters and editors review the public's comments before publication. But journalists who spend hours vetting thousands of submissions don't have much time left to tend to Job One — gathering, analyzing and reporting original news. That's a problem.
In an effort to widen the discussion and solve the oversight dilemma, The Times last April instituted a new program that cleared the way for comments on more stories while mostly leaving the policing of content up to users.
If two users click a "Report Abuse" red flag next to a latimes.com comment, it's kicked off the page. That takes care of a good number of the garbage comments, but some nasty, degrading and even fabricated information gets through, not to mention the odd pitch for a fly-by-night insurance company.
News outlets have turned to a range of options to eliminate or marginalize junk submissions. The Washington Post last year announced a system that relies mostly on reader reviews to push inappropriate comments (and those who repeatedly offer them) off the main comment pages, or even off the paper's website altogether.
Huffington Post, which boasts a whopping total of nearly 4 million comments a month, employs 30 full- and part-time workers to monitor what may be the most active discussion boards on the Web. The screeners are aided by a computer system (known in house as "Julia") that sniffs out words, phrases and even context that hint at potentially problematic posts.
Human monitors then make the final call on which missives are out of bounds, said Jai Singh, HuffPo's managing editor. Somewhere between 5% and 10% of comments don't make the cut.
The New York Times employs a smaller staff of eight comment screeners, only two of them full-time employees. But with a go-slow approach, nytimes.com allows comments on only a few of its stories. It averaged a relatively modest 4,700 comments a day in 2010. The monitors tossed out about 14% of the missives, said Sasha Koren, deputy editor for community and social media.