MediaBugs — Sharing our final report to our funders at Knight

Today, the Knight Foundation released a comprehensive report evaluating the class of Knight News Challenge winners of which MediaBugs was a (happy and grateful) part.

The report has plenty of food for thought about the challenges MediaBugs faced and the efforts we made to overcome them during the two years of our grant.

Its appearance is also a good opportunity for us to share the topline summary of our final report to Knight. We filed this a while ago and I meant to post it sooner. Here it is, in the interest of transparency, for those who’d like to hear the full version of how our wins and losses looked from our perch here!


At the end of 2011 MediaBugs is pivoting from a funded project to a volunteer effort. We’ve racked up some considerable successes and some notable failures. Here’s a recap covering the full two years of project funding:

Successes:

* We built and successfully launched first a Bay Area-based and then a national site for publicly reporting errors in news coverage. These projects represented a public demonstration of how a transparent, neutral, public process for mediating the conversation between journalists and the public can work.

* We surveyed correction practices at media outlets both in our original Bay Area community and then nationally and built a public database of this information.

* We built and maintained a Twitter account with approximately 500 followers to spread awareness of both MediaBugs itself and other issues surrounding corrections practices.

* We maintained our own MediaBugs blog and contributed frequently to the MediaShift Idea Lab blog, where our posts were a finalist for the 2011 Mirror Awards.

* We led a campaign to improve those correction practices in the form of the Report an Error Alliance, collaborating with Craig Silverman to promote the idea that every story page should have a button dedicated to inviting readers to report the mistakes they find. The practice has gained some momentum, with adoption at high-traffic websites like the Washington Post and the Huffington Post.

* We handled 158 error reports, with the two most common outcomes being “closed: corrected” (59) and “closed: unresolved” (68). Those results included corrections across a range of major media including The New York Times, Wall Street Journal, Washington Post, USA Today, Fox News, CNN, National Public Radio, CBS News, the Associated Press, Reuters, Yahoo News, TechCrunch, and others.

* We took one high-profile error report involving “American Taliban” John Walker Lindh, KQED, and the New York Times, and used it as a kind of teachable moment to publicize some of the problems with existing corrections practices as they collide with digital-era realities. Our extended effort resulted in the Times correcting a story that the subject (through his father) had failed to get corrected for nearly a decade. The full write-up of this story was published on The Atlantic’s website in July 2011.

* Our surveys of correction practices and related public commentary led many news outlets to revamp and improve their procedures. And many of the specific error-report interactions that led to corrections helped shed light on formerly closed processes in newsrooms, leaving a public record of the interaction between members of the public who brought complaints and journalists who responded to them.

* We partnered with other organizations, including NewsTrust and the Washington News Council, on efforts to correct inaccuracies in news coverage and establish regional MediaBugs organizations. Our software platform became the basis for Carl Malamud’s Legal Bug Tracker project.

Failures:

Our single biggest failure was our inability to persuade any media outlet with a significant profile or wide readership to adopt our service and install our widget on their pages. This limited our reach and made it difficult to spread our ideas. Users had to know about our service already in order to use it, instead of simply finding it in the course of their media consumption.

Our efforts to solve this problem — outreach to friends and colleagues in media outlets; public and private overtures to editors, newsroom managers, and website producers; back-to-the-drawing-board rethinks and revamps of our product and service — occupied much of our time and energy through the two years of the Knight grant.

We did find some success in getting MediaBugs adopted by smaller outlets, local news sites and specialty blogs. In general, it seemed that the people who chose to work with us were those who least needed the service; they were already paying close attention to feedback from their readership. The larger institutions that have the greatest volume of user complaints and the least efficient customer feedback loops were the least likely to take advantage of MediaBugs.

We identified a number of obstacles that stood in our way:

* Large news organizations and their leaders remain unwilling even to consider handing any role in the corrections process to a third party.

* Most newsroom leaders do not believe they have an accuracy problem that needs to be solved. Some feel their existing corrections process is sufficient; others recognize they have a problem with making errors and not correcting them, but do not connect that problem with the decline in public trust in media, which they instead attribute to partisan emotion.

Our other major failure was that we never gathered the sort of active community of error reporters that we hoped to foster. Our efforts included outreach to journalism schools, promotion of MediaBugs at in-person events and industry gatherings (like Hacks and Hackers and SPJ meetups), and postings at established online community sites whose participants might embrace the MediaBugs concept. But our rate of participation and bug-filing remained disappointing.

One explanation we reluctantly came to consider that we hadn’t originally expected: Much of the public sees media-outlet accuracy failures as “not our problem.” The journalists are messing up, they believe, and it’s the journalists’ job to fix things.

A final failure is that we have not, to date, made as much progress as we hoped in transforming journalists’ way of thinking about corrections. We imagined that public demonstration of a more flexible view of errors and corrections would encourage a less secretive, less guilty-minded, more accepting stance in newsrooms. But two years after MediaBugs’ founding, getting news organizations to admit and fix their mistakes in most cases still demands hard work, persistence and often some inside knowledge. Most of the time, it still feels like pushing a boulder up a hill. This needs to change, for the good of the profession and the health of our communities, and MediaBugs intends to keep working on it.


The case of the New York Times’ terror error

[This article, a collaboration between Scott Rosenberg and Mark Follman, originally appeared on The Atlantic’s website. Since then it has been the subject of a MediaBugs error report filed by Frank Lindh. Yes, at MediaBugs, not only do we eat our own dogfood, we find it tasty!]


It is hard to describe the interview that took place on KQED’s Forum show on May 25, 2011, as anything other than a train wreck.

Osama bin Laden was dead, and Frank Lindh — father of John Walker Lindh, the “American Taliban” — had been invited on to discuss a New York Times op-ed piece he’d just published about his son’s 20-year prison sentence. The moment host Dave Iverson completed his introduction about the politically and emotionally charged case, Lindh cut in: “Can I add a really important correction to what you just said?”

Iverson had just described John Walker Lindh’s 2002 guilty plea as “one count of providing services to a terrorist organization.” That, Frank Lindh said, was simply wrong.

Yes, his son had pled guilty to providing services to the Taliban, in whose army he had enlisted. Doing so was a crime because the Taliban government was under U.S. economic sanctions for harboring Al Qaeda. But the Taliban was not (and has never been) classified by the U.S. government as a terrorist organization itself.

This distinction might seem picayune. But it cut to the heart of the disagreement between Americans who have viewed John Walker Lindh as a traitor and a terrorist and those, like his father, who believe he was a fervent Muslim who never intended to take up arms against his own country.

That morning, the clash over this one fact set host and guest on a collision course for the remainder of the 30-minute interview. The next day, KQED ran a half-hour Forum segment apologizing for the mess and picking over its own mistakes.

KQED’s on-air fiasco didn’t happen randomly or spontaneously. The collision was set in motion nine years before by 14 erroneous words in the New York Times.

This is the story of how that error was made, why it mattered, why it hasn’t been properly corrected to this day — and what lessons it offers about how newsroom traditions of verification and correction must evolve in the digital age.

[Read more…]


Three pillars of trust: Links, revisions, and error buttons

The journalism industry ships lemons every day. Our newsrooms have a massive quality control problem. According to the best counts we have, more than half of stories contain mistakes — and only three percent of those errors are ever fixed.

Errors small and large litter the mediascape, and each uncorrected error undermines public trust in news organizations. In Pew’s last survey in Sept. 2009, only 29 percent of Americans believed that the press “get the facts right.”

Yet the tools and techniques to fix this problem are known and simple. I’ve been working in this area for the last two years. Here’s a distillation of what I’ve learned: three basic steps any online news organization can take today to tighten quality control, reduce errors and build public trust.

1. Link generously

A piece without links is like a story without the names of its sources. Every link tells a reader, “I did my research. And you can double-check me.”

  • Read more on the value of links: In Defense of Links.

2. Show your work

The news isn’t static, and online stories don’t have to be, either. Every article or post can and should be improved after it’s published. Stay accountable and transparent by providing a “history” of every version of each story (a la Wikipedia) that lets readers see what’s changed.

  • Read a longer argument for the value of versioning, or try out the WordPress plugin.

3. Help people report your mistakes

The Internet is a powerfully efficient feedback mechanism. Yet many news organizations don’t use it. Put a report-an-error button on every story: It tells readers you want to know when you’ve goofed. Then pay attention to what they tell you.

  • Get report-an-error buttons at the Report an Error Alliance, or use the MediaBugs widget.
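
To make the third step concrete, here is a minimal sketch, assuming a small Flask app, of the kind of endpoint a report-an-error button could post to. It is illustrative only, not the MediaBugs widget’s actual protocol, and notify_corrections_desk is a hypothetical placeholder for whatever alert a given newsroom prefers.

```python
# A minimal sketch of the feedback loop behind a report-an-error button.
# Illustrative only: not the MediaBugs widget's actual protocol.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

from flask import Flask, jsonify, request

app = Flask(__name__)
error_reports = []  # a real site would use a database table, not a list


@dataclass
class ErrorReport:
    story_url: str                 # the page the reader was on
    description: str               # what the reader thinks is wrong
    reporter_email: Optional[str]  # optional; anonymous reports are allowed
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "open"           # open -> responded -> corrected / unresolved


@app.route("/report-error", methods=["POST"])
def report_error():
    data = request.get_json(force=True)
    report = ErrorReport(
        story_url=data["story_url"],
        description=data["description"],
        reporter_email=data.get("reporter_email"),
    )
    error_reports.append(report)
    # The step that matters most: route the report to a person who will read it.
    # notify_corrections_desk(report)  # hypothetical hook: email, chat alert, etc.
    return jsonify({"status": "received"}), 201
```

The code is the easy part; the point is the routing. A report that lands in a queue somebody owns is worth far more than one that disappears into a generic inbox.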

Why aren’t these practices more widely adopted? Here are four reasons:

(1) Workflow and tools: In many newsrooms, especially those still feeding print or broadcast outlets, it’s still way too hard to fix errors or add links to a story for its Web edition. And content-management systems don’t yet offer corrections and history tools “out of the box.”

(2) Denial and avoidance: Other people make errors. Many editors and reporters don’t believe the problem is serious, or think it doesn’t apply to them. And most don’t understand how badly their Web feedback loop is broken.

(3) Fear of readers: Many journalists view readers as adversaries. The customer they feel they’re serving is an abstraction; the specific reader with a complaint is “someone with an agenda” whom they have a duty to ignore.

(4) Where’s the money? Many media companies are in financial free-fall. Correction systems and trust-building tools don’t bring in revenue directly, and they eat up product-development time and money.

These are serious obstacles. But journalists will never regain public trust unless we overcome them.

Ask journalists what sets them apart from everyone else sharing information online and we’ll say: We care about accuracy. We correct our mistakes. In a changing media economy that’s challenging the survival of our profession, we need to follow through on those avowals. Otherwise, we shouldn’t be surprised when Pew’s next biennial survey of public trust in the media shows even more dismal results.

[Crossposted from PBS MediaShift Idea Lab.]


Time to bake smart correction tools into news platforms

[cross-posted from the PBS MediaShift Idea Lab]

A window of opportunity is open right now for online journalists to build accuracy and accountability into the publishing systems we use every day. To understand why this is such a big deal, first hop with me for a minute into the Wayback Machine.

It’s the mid-1990s. Journalists have just arrived on the web. They’re starting sites like Hotwired and Pathfinder, Salon and Slate. They’re doing good work, but also, inevitably, making mistakes. Their customary corrections routine — post a notice in the next edition or issue — makes no sense in the new medium, where stories are just files on servers or data in databases, and fixes can take effect instantly and invisibly.

Editors at the dawn of the web understood they had to be accountable for changes they made to published stories, and so improvised a routine for handling substantive corrections: Fix the problem; place a notice on the story page indicating that you’ve fixed it; and — this step was only taken by extra-conscientious organizations — add a notice to a separate page logging the fact of the correction (and linking to the corrected story).

Fast-forward to the present. The web’s publishing environment is vastly more complex, flexible and elaborate. But when it comes to corrections, virtually every news site still handles things the way we did 15 years ago: Go into the story, often by hand (i.e., by adding to the body of the story text), fix the error, and append a correction notice to the story top or bottom. Then, if your site has a separate corrections-listing page, go into that by hand and add the notice there. Insert any cross-links. Republish the story and the corrections page. And you’re finally done.

The process is cumbersome, to be sure; it’s also not smart. Most publishing systems don’t actually “know” that the story has been corrected. There’s no data stored that distinguishes a corrected story from, say, one that’s been altered in some other way. The typical content-management system software package will track each successive edit or revision to a document, but it doesn’t distinguish garden-variety edits from formal corrections.

For years now, I’ve dreamed of a smarter publishing software tool that would handle corrections intelligently and seamlessly as part of the publishing cycle and editorial workflow, rather than as a clumsy kludge. One goal, certainly, is to make editors’ lives easier. If corrections can be handled with less fuss, maybe news sites will be less reluctant to make them.

But an even more important goal is to give journalists and the public better information about corrections. Once corrections are treated as data, developers can do things with them — say, allow readers to sign up to be notified of corrections for a site, individual story or story category; or create display boxes that automatically link to the half-dozen most recent corrected stories. The ultimate purpose of all this is for news organizations to demonstrate accountability and transparency to a public that views them with sparse and dwindling trust.
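
As a rough illustration of what treating corrections as data makes possible, here is a short sketch in Python. The names are hypothetical, not Armstrong’s or MediaBugs’ API; the point is simply that once each correction is a structured record, the reader-facing features above become simple queries.

```python
# Illustrative only: corrections stored as structured records, with the
# reader-facing features described above expressed as queries over them.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class Correction:
    story_id: int
    story_url: str
    notice: str            # the published correction text
    corrected_at: datetime


def recent_corrections(corrections: List[Correction], limit: int = 6) -> List[Correction]:
    """Feed a display box linking to the most recently corrected stories."""
    return sorted(corrections, key=lambda c: c.corrected_at, reverse=True)[:limit]


def corrections_for_story(corrections: List[Correction], story_id: int) -> List[Correction]:
    """Drive notifications for readers who signed up to follow a particular story."""
    return [c for c in corrections if c.story_id == story_id]
```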

Armstrong CMS project

So when I read about the new Armstrong CMS project, I got excited. Armstrong is an effort by the software teams at the Bay Citizen and the Texas Tribune to build a new-model, open source publishing system for local news sites. It is built on the highly regarded Django web framework, funded by a $975,000 grant from the Knight Foundation, and based on existing work already in use at the two sites.

The Armstrong project has a chance to create a new standard for corrections for the entire field of web journalism. I asked Brian Kelley, the Bay Citizen CTO who is a co-leader of the project, whether Armstrong had plans for corrections yet. He suggested that, because many organizations have different needs, Armstrong’s open plug-in and extension options might be the best way to handle the corrections process.

Maybe so. At MediaBugs we certainly plan to explore this route with Armstrong as we have with other partners; our MediaBugs widget and WordPress plug-in are already in use on a handful of news sites.

But there’s a bigger opportunity for the Armstrong community here: They can build a smart correction-handling process into the heart of the tool they’re creating. The best practices in this area are widely understood and agreed upon; why not bake them into the technology? No one, to my knowledge, has done this before in a free, open source publishing system. (If there are proprietary systems that do a better job, I’d love to hear about them.)

Here are the basic features I’d want any corrections tool to provide:

  • Editors should be able to correct published stories by checking a box or clicking a button on an edit screen. If the system has a permissions hierarchy, then managers should be able to enable or disallow the option of making a correction.
  • Editors who are correcting a story are taken to a screen or overlay that lets them enter the text of a correction notice. The software would automatically record the date and time the correction was made.
  • Once the correction notice is entered, the editor is prompted to make whatever edits are required in the story text itself, and to save them. Editors would then have to republish the story, following whatever their site’s routine might be.
  • Ideally, a corrections system like this is part of a larger scheme for tracking and presenting all post-publication changes to each story. The database would record the changes made to a story as part of the correction process in a special way — that is, it would know that this particular revision is not just any old change but a formal correction.
  • Site designers and managers have the option of building a self-updating corrections page that automatically pulls in corrections notices and links back to the corrected stories.
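
Here is one way the editor-facing side of that feature list might look in code. It is a rough sketch with made-up names, not a proposal for Armstrong’s actual schema; the key move is that a correction is saved as a specially flagged revision rather than an ordinary edit.

```python
# A rough sketch of the workflow in the list above. All names are hypothetical;
# a real CMS would hang this off its own story and revision models.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Revision:
    body: str
    saved_at: datetime
    is_correction: bool = False   # what separates a formal correction from any old edit
    correction_notice: str = ""   # the text displayed on the story page


@dataclass
class Story:
    body: str
    revisions: List[Revision] = field(default_factory=list)


def file_correction(story: Story, new_body: str, notice: str, can_correct: bool) -> Revision:
    """Check permission, record the notice and timestamp, and save the edit
    as a correction rather than a garden-variety revision."""
    if not can_correct:
        raise PermissionError("This account is not allowed to file corrections.")
    revision = Revision(
        body=new_body,
        saved_at=datetime.now(timezone.utc),
        is_correction=True,
        correction_notice=notice,
    )
    story.body = new_body
    story.revisions.append(revision)
    return revision
```

The self-updating corrections page in the last item then falls out of the data: it is just a query for revisions flagged as corrections, along the lines of the recent-corrections helper sketched earlier.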

That’s it! None of this is particularly challenging as programming or design work. My experience is that when I describe what’s needed to most developers, they’re not interested — the problem’s too “trivial.” Maybe it is — but not to the editors I’ve talked to, who groan about the pain their software inflicts on them whenever they try to do a correction the right way.

Each time we rewrite the software used to publish news on the web we have another chance to raise the bar for the whole field. I’m crossing my fingers that Armstrong will be the project to make smart corrections a reality.


‘There’s no problem!’: Newsrooms in denial about rampant errors

Jonathan Stray has opened a new conversation about measuring accuracy in news reports. Stray, who works at the Associated Press and blogs on the side, comes at the issue with a refreshingly analytical, data-driven perspective. His in-depth post, which I urge you to read, does a couple of things. It summarizes important research:

There seems to be no escaping the conclusion that, according to the newsmakers, about half of all American newspaper stories contained a simple factual error in 2005. And this rate has held about steady since we started measuring it seven decades ago.

And it offers some useful ideas:

We could continuously sample a news source’s output to produce ongoing accuracy estimates, and build social software to help the audience report and filter errors.

Stray understands that it’s no good to count correction rates without tracking error rates, and vice versa — you need to know both if you want to assess a news organization’s performance. So he imagines a not-too-distant future in which many or most newsrooms sampled their story output regularly to gauge the frequency of errors and encouraged readers to submit (and rank) error reports. With some sort of standardization of both metrics, and if newsrooms could get comfortable with publishing these numbers, we’d finally have a useful yardstick for accuracy in news coverage.
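
To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python of the two numbers a newsroom would track. It is illustrative only, not Stray’s or Maier’s methodology; the sample size, the normal-approximation margin of error, and the function names are all my own assumptions.

```python
# Illustrative only: estimating an error rate from a random sample of published
# stories, and a correction rate from the errors that were found.
import math
import random
from typing import List, Tuple


def sample_stories(story_ids: List[int], sample_size: int) -> List[int]:
    """Pick a random sample of published stories to send to sources for checking."""
    return random.sample(story_ids, sample_size)


def error_rate(stories_checked: int, stories_with_errors: int) -> Tuple[float, float]:
    """Estimated share of stories containing at least one error, with a rough
    95 percent margin of error for a simple random sample."""
    p = stories_with_errors / stories_checked
    margin = 1.96 * math.sqrt(p * (1 - p) / stories_checked)
    return p, margin


def correction_rate(errors_found: int, errors_corrected: int) -> float:
    """Share of known errors that were ever formally corrected."""
    return errors_corrected / errors_found


# Example in the ballpark of the research cited above: errors found in 196 of
# 400 sampled stories gives an estimated error rate of 49 percent, give or take
# about 5 points; 6 corrections among 200 known errors is a 3 percent correction rate.
```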

I’m all for Stray’s vision. At MediaBugs, we’ve spent the last 18 months building one engine to power this accuracy-enhancing machine — the part about “social software to help the audience report and filter errors.” It’s been a rewarding but difficult quest. So I’ve spent considerable time thinking about the same issues as Stray, and I’d like to respond to his post by proposing a framework for thinking about the big question here — which, it seems to me, is, “What’s the holdup?”

Why are we still so far away from this vision? I think the answer lies not only where Stray looks, with issues of measurement and methodology, but also in the direction that Jay Rosen pointed us in his recent exploration of the interminable feud between journalists and bloggers.

In short, I don’t think this is purely a data problem. It’s equally a psychological dysfunction. It’s not just the numbers that are hard; it’s the feelings.

Here, as I see it, are the feelings in the newsroom that stand in the way of building Stray’s accuracy machine:

1. Denial: “There’s no problem here!”

Let’s start by acknowledging, as Stray does, that there’s nothing new about the problem of accuracy in news coverage. We’ve long known the dismal bottom line from the research in this area. Roughly half of newspaper stories contain errors; only a tiny fraction of those errors ever get corrected. The work Stray reviews to find this data — much of it by Scott Maier of the University of Oregon — is the same research I reviewed in starting MediaBugs. It’s the same stuff Craig Silverman highlighted in his definitive book on this subject.


These results aren’t secrets! They ought to be the baseline for discussion of the issue in every newsroom in the country. Yet time and time again, we find that journalists’ jaws drop in disbelief when they encounter these statistics. And when pollsters report dismal drops in public faith in news coverage, the same journalists will fail to see any connection between high error rates and low trust.

In numerous lengthy conversations with journalists, I’ve encountered a litany of excuses, from “those aren’t real errors” to “people just want to read news they agree with.” Instead of fixing the problem, we blame the messenger.

Why is the field of journalism in such stubborn denial? Why isn’t the profession doing anything about what from any reasonable perspective is unacceptably poor performance?

Journalists routinely declare that their work rests on a foundation of public trust. Yet readers regularly tell us that they don’t trust journalists. Something is broken here.

I’d suggest that it’s time journalists stop insisting that their readers are confused or stupid or partisan and start getting their own house in order. The first step is simple: admit that the problem is real.

2. Overload: “There’s too much on our plate.”

At the very moment when every element of journalism — the business, the craft, the calling — seems to be undergoing violent metamorphosis, many practitioners view the effort to improve the correction process as an unaffordable luxury.

Why dot your “i”s when the roof is caving in? Is fixing errors just an exercise in Titanic deck-chair arrangement?

It’s easy to sit on the outside of organizations in turmoil and tell them what to do. But moments of convulsive institutional change are also opportunities to reform entrenched practices and install new routines.

Far-sighted leaders in newsrooms large and small have already begun to move the correction process from the margins of their work flow to the center. All management is about priorities. Journalists will start to improve their accuracy and win back public trust when their organizations signal them that these goals come first.

3. Pride: “We’ll deal with this on our own.”

Journalists who admit there’s an accuracy problem and prioritize solving it face another mental hurdle that may well be the toughest to leap. The newsroom ethos is usually a competitive one: Individuals and organizations both motivate themselves by trying to beat somebody else. We gauge our success by printing or posting the scoop first, topping the circulation numbers or unique user charts, or nabbing the prize. All this works well enough up to a point. But it gets in the way when we try to deal with a problem whose solution demands humility and openness more than sharp elbows.

Any newsroom that’s serious about improving its accuracy needs to accept Dan Gillmor’s dictum that “our readers know more than we do” and open up its processes to make use of that knowledge. This means relinquishing a little of the profession’s fierce independence. No editor is going to, or should, give up the right to decide whether a correction is warranted each time a problem gets flagged. But the smartest editors will accept that they need to give up the chokehold they’ve traditionally kept on the process of making that decision.

In the field of corrections as anywhere else, “openness” isn’t binary — it has gradations and nuances. I like to imagine these as a sort of ladder of transparency that news organizations need to climb.

On the first rung of this ladder, journalists readily fix mistakes they learn about and conscientiously disclose and record the details of each fix. (Most newsrooms declare allegiance to this ideal, but, sadly, our MediaBugs research shows that the majority still fail to live up to it.)

One rung up, news outlets effectively solicit error reports from their audiences, making it clear that they welcome the feedback and will respond. The Report an Error Alliance is trying to push more news organizations to climb up here.

On the next rung up, newsrooms also willingly expose their own internal deliberations over particular controversies, explaining why they did or didn’t correct some issue readers raised and leaving some sort of public trail of the decision. At some publications, the ombudsman or public editor takes care of some of this.

On the final, topmost rung, the news organization assures accountability by turning to a neutral third party to maintain a fair record of issues raised by the public. This shows external critics that the newsroom isn’t hiding anything or trying to shove problems under the rug. This is a key part of our model for MediaBugs.

Plainly, we’ve got a ways to go. At the conclusion of his accuracy essay, Stray writes, “I’d love to live in a world where I could compare the accuracy of information sources, where errors got found and fixed with crowdsourced ease, and where news organizations weren’t shy about telling me what they did and did not know.”

Me too! And I think Stray is correct to say that we won’t get there without admitting the seriousness of the accuracy problem, devising standardized accuracy metrics and improving the feedback loop for reporting errors. Yes, yes, and yes.

But since we keep bumping into invisible barriers on the way to this destination, we need to go further. We must put ourselves on the couch. Journalists aren’t very good at self-scrutiny, and the hardbitten old newshound in each of us might scorn such work as navel-gazing. Maybe it would help if we think of it, instead, as accountability reporting — on ourselves.

[Crossposted from the MediaShift Idea Lab blog]


MediaBugs t-shirts? We got t-shirts!

We’ve worked very hard to make MediaBugs an easy-to-use, inviting, and reliable service. But recently we realized that we were falling down on the job in a critical area.

Oh, we had widgets and Facebook buttons, videos and Twitter badges, email alerts and RSS feeds. But we were missing one vital digital-era feature that, plainly, was hobbling our efforts at outreach.

Friends, this failure has now been remedied. Today we announce the availability of the one-and-only MediaBugs t-shirt — a real stunner in lime-green on white. No collar tag to, um, bug you! Sized to your needs! (As long as what you need is X-Large.)

Like it? Want it? Can’t buy it, sorry. The only way to get your very own MediaBugs T-shirt is to file an error report. If we pick your bug report to feature in our “Bugs to Watch” list, you’ll get a shirt!

The only catch is that, obviously, this won’t work if you’ve chosen to file the bug report anonymously. If we don’t know who you are, we can’t ask you where to send your shirt. We’re good, but not that good.


MediaBugs, now in a WordPress plugin

Announcing the new MediaBugs plugin for WordPress. It’s for anyone who’s running a WordPress-based site that does journalism and wants readers to know that correcting errors is a priority.

Now adding a MediaBugs “report an error” button to any website that runs WordPress is a super-simple, 30-second process. If you know how to install a plugin, you can do it. (Alas, this will only work with self-hosted WordPress installations — or “WordPress.org” sites — and not with WordPress.com blogs, which don’t run plugins.)

We’ve had a MediaBugs widget that played nice with WordPress for some time now (it’s what I’ve been running on my personal blog), but the plugin makes it much easier to add to your site — you don’t need to mess with your theme templates unless, you know, that’s something you enjoy. (Hey, some of us do!)

Here’s what the plugin does: It adds a link to the bottom of every post for users to report errors. The link is customizable — you can use text or an icon or both, and you can edit the text easily, too. When a user clicks on the link, the MediaBugs error-reporting form pops up as an overlay, with the page’s Web address and headline automatically filled in. When the user has filled out the form, the error report gets filed at MediaBugs. (Wanna see? Just click on the little “Report an Error” icon at the bottom of this post!)

If you install the plugin, you can also sign up at MediaBugs to receive an email or RSS notification each time someone reports an error on your WordPress site.

The MediaBugs plugin lives here in the WordPress.org plugin directory. Let us know if you install it — we want to know how it goes!


MediaBugs teams up with NewsTrust’s Truthsquad to fix the news

This month we’re beginning a partnership with two other great organizations that share MediaBugs’ vision of how to improve the error-correction process in journalism.

For several years now, NewsTrust.net has provided a platform for users to rate the quality and trustworthiness of news reports. Recently NewsTrust started a project called Truthsquad that presents a user with statements in the media by public figures and pundits, then asks for help in fact-checking the statements. Users contribute links and arrive at a collective judgment on the truth of the statement, then NewsTrust editors deliver a “verdict.”

NewsTrust founder Fabrice Florin and I have been talking for some time about how our two organizations might collaborate, and we think we’ve found a model that’s worth trying. We’re also delighted to be working with Craig Silverman’s RegretTheError.com as part of this project. Craig, of course, is probably the world’s leading expert on corrections in the news (he’s also a MediaBugs adviser).

For the next several weeks, NewsTrust’s Truthsquad will run one statement a week through its fact-checking process. Then, each week, if the Truthsquad concludes that a media report was in error, we’ll file a bug here at MediaBugs and see if we can elicit a response or get it corrected.

For instance, our first Truthsquad report focused on a clip from Fox Business that criticized President Obama’s India trip for its $200-million-a-day tab. As you can see, the Truthsquad consensus and verdict concluded that the $200 million a day figure was completely unsupported. So today we filed a bug report relating to the story. We’ll keep you posted on its progress.

In other words, we’re wedding an organized fact-checking process to an accountable error-correction process. We think this is something that has never been tried before! If it sounds valuable to you, head over to NewsTrust and join in the Truthsquadding — then over to MediaBugs to help us fix the errors we find.


Survey: News websites across US botch error reporting, corrections

The second of our MediaBugs surveys of correction practices — this one nationwide — confirms the pattern we found in our first, Bay-Area-only study: Most news websites make it hard for readers to report errors and find corrections. Here are the gory details.

Interestingly, the cable news networks have the best overall record — a better one than newspapers or magazines. There’s one exception, however: the Fox News website is entirely lacking in any corrections-related content or information: no way to find out if they fixed something and no way to tell them they got it wrong.

As a result of what we found in our first survey, we made a point of incorporating information about the error-correction practices of each media organization right in the MediaBugs interface — you can find it as part of each listing on our Browse by Media Outlet page.

If you’re involved in running one of these websites, have a look at MediaBugs’ best practices page — and know that repairing these problems really isn’t that much work.

If you’re a reader or user of these sites, consider telling them about that page (assuming they haven’t buried their email address!).


Why MediaBugs won’t take the red or blue (state) pill

We’re excited about the expansion of MediaBugs.org, our service for reporting errors in news coverage, from being a local effort in the San Francisco Bay Area to covering the entire U.S.

But with this expansion we face an interesting dilemma. Building a successful web service means tapping into users’ passions. And there’s very little that people in the U.S. are more passionate about today than partisan politics.

We have two very distinct populations in the country today with widely divergent views. They are served by separate media establishments, and they even have their own media-criticism establishments divided along the red and blue axis.

So the easiest way to build traffic and participation for a new service in the realm of journalism is to identify yourself with one side or the other. Instant tribe, instant community. Take a red-state pill or a blue-state pill, and start watching the rhetoric fly and the page views grow.

I’m determined not to do that with MediaBugs, though it’s sorely tempting. Here’s why.

I don’t and can’t claim any sort of neutrality or freedom from bias as an individual, and neither, I believe, can any journalist. Anyone who reads my personal blog or knows my background understands that I’m more of a Democratic, liberal-progressive kind of person. This isn’t about pretending to some sort of unattainable ideal of objectivity or about seeking to present the “view from nowhere.”

Instead, our choice to keep MediaBugs far off the red/blue spectrum is all about trying to build something unique. The web is already well-stocked with forums for venting complaints about the media from the left and the right. We all know how that works, and it works well, in its way. It builds connections among like-minded people, it stokes fervor for various causes, and sometimes it even fuels acts of research and journalism.

What it rarely does, unfortunately, is get results from the media institutions being criticized. Under the rules of today’s game, the partisan alignment of a media-criticism website gives the target of any criticism an easy out. The partisan approach also fails to make any headway in actually bringing citizens in the different ideological camps onto the same playing field. And I believe that’s a social good in itself.

It would be easy to throw up our hands and say, “Forget it, that will never happen” — except that we have one persuasive example to work from. Wikipedia, whatever flaws you may see in it, built its extraordinary success by attracting participation from across the political spectrum and around the world, explicitly avowing “a neutral point of view” and establishing detailed, open, accountable processes for resolving disputes. It can get ugly, certainly, in the most contested subject areas. But it seems, overall, to work.

So with MediaBugs, we’re renouncing the quick, easy partisan path. We hope, of course, that in return for sacrificing short-term growth we’ll emerge with a public resource of lasting value. The individuals participating in MediaBugs bring their own interests and passions into the process. It’s the process itself that we can maintain as a fair, open system, as we build a better feedback loop for fixing errors and accumulate public data about corrections.

To the extent that we are able to prove ourselves as honest brokers in the neverending conflicts and frictions that emerge between the media and the public, we will create something novel in today’s media landscape: An effective tool for media reform that’s powered by a dedication to accuracy and transparency — and that transcends partisan anger.

I know many of you are thinking, good luck with that. We’ll certainly need it!

[Crossposted from MediaShift Idea Lab]
