Fish wrappers and encyclopedias: how can we fix newspapers’ update problem?


There is so much to love in Craig Silverman and the Tow Center’s new report, “Lies, Damn Lies and Viral Content” — from the very first sentence.

“News organizations are meant to play a critical role in the dissemination of quality, accurate information in society.”

Indeed! I feel a bit like that dorky kid who plays Dungeons & Dragons on his own for years, until he finds out that there’s actually a small D&D group that meets in someone’s basement. Millions of words have been written about the news media’s struggle for viable business models in an online world, but very few people are saying that:

  1. There’s a lot of misinformation out there,
  2. That matters because the news media’s job is to inform, so
  3. Even if social media content didn’t itself make its way into newspapers (which it does), papers have a responsibility to correct the public record and improve the state of public knowledge.

More data, more emotion


Silverman and his team have done great research here, driven by the data captured through their rumor-tracking tool, Emergent. There’s some eye-opening stuff on news outlets’ love of misleading headlines, and a handy list of recommendations for newsrooms.

Other insights about debunking needs that really grabbed me:

  • “Debunking efforts in the press are not guided by data and learning drawn from work in the fields of psychology, sociology, and political science… An evidence-based, interdisciplinary approach is needed.” Hear, hear.
  • There are problems inherent in debunking the person, rather than the idea. I wonder: where does this leave the major political fact-checking sites (e.g., PolitiFact, the Washington Post Fact Checker)?
  • Viral hoaxes appeal to emotion, but so can debunking stories. For example, most stories about a rumored pumpkin-spice condom made it clear that Durex was planning no such product — but the stories still managed to be eye-catching and funny.
  • Hoaxes with geographic focus can inspire action — this indicates that we would be wise to foster ever more local fact checking.
  • Silverman’s report is the first major work on journalistic fact-checking that I’ve seen bring in major voices from the skeptic movement (such as Doubtful News and Tim Farley of What’s the Harm). As a participant in both communities I often link these efforts in my mind, but have seen few others do so — and I think there is so much useful engagement that could happen between fact-checking journalists and skeptics.

Update, archive and correction fails

But what I want to focus on is the problem of updates and corrections, the persistence of web content and the double-edged sword that is the online news archive.

Silverman — who is, after all, an authority on corrections — notes several major problems in news outlets’ updates to rumor-based stories:

A particularly egregious example of the news media’s failure to update rumor articles. From Lies, Damn Lies and Viral Content, by Craig Silverman of Columbia Journalism School’s Tow Center for Digital Journalism.
  1. The updates don’t happen very often. Silverman and his team analyzed six of the claims that they tracked on Emergent, comparing the number of news organizations that originally covered the claim to those that followed up with a resolved truth-status — the percentage that followed up varied tremendously, but on average it was only slightly more than 50 percent.
  2. Such stories are often updated badly. Most notably, many news outlets will update the body text and then simply slap the word “Updated” on the headline — which results in headlines that make a rumor sound true, even when the body text makes clear that the rumor’s been debunked.
  3. Readers probably won’t see the update. “Obviously, there is no guarantee readers will come back to an article to see if it has been updated,” Silverman says — indeed, there’s very little guarantee, and very little chance.
  4. Mistaken articles persist — forever. “…Online articles exist permanently and can be accessed at any time. Search, social, and hyperlinks may be driving people to stories that are out of date or demonstrably incorrect,” Silverman writes. Even though news organizations followed up on a rumor with a debunking story roughly half the time, they did so by writing a new story. “This means the original, often incorrect story still exists on their websites and may show up in search results,” Silverman points out. Rarely were follow-up links added to the initial article.

Why are we in this mess?


How can we innovate out of it?

I think these concerns all tap into some major ways in which newspapers have failed to adapt to and take advantage of their new digital homes — and ways they can push forward:

Fish wrappers no more

The best use for error-filled stories. “Fish n chips” by Canadian Girl Scout. Licensed under Public Domain via Wikimedia Commons.


First, news outlets forget that old stories don’t disappear from the public consciousness like, well, yesterday’s newspaper. If only last week’s and last month’s articles could still be wrapped around take-out fish and chips, the oily residue rubbing out hastily written paragraphs and unwarranted presumptions. Those days are gone. Not only do the archives linger on newspapers’ websites, but they’re linked to by other pages that will probably never die, and they’ll also get brought up by the right combination of words in a Google search.

Of course, I’m being a bit facetious claiming that archives are simply a liability. They also represent an enormous opportunity (more on that below). But right now, archives are a double loss for newspapers: their liability is unmitigated and their potential is untapped.

The news encyclopedia

Which brings us to: most newspapers lack a smart strategy for leveraging and commoditizing their archives.


Putting aside for a moment our acknowledgment that archive stories could have been wrong to begin with, or could now simply be outdated, they still contain vast amounts of useful information that is not being used and will hardly be seen. If properly checked, coded and collated, a newspaper’s archives on a particular event would form a powerful living encyclopedia entry that would rival Wikipedia for accuracy and completeness.

If thinking about coding and collating the New York Times’ 13 million articles back to 1851 is too overwhelming, think about this: every story we write today is part of tomorrow’s archive. The best way to build that future archive into something useful and informative is to follow Silverman’s advice, that “asking reporters and producers to own a particular claim and maintain a page is a reasonable request.” Making substantial updates means an opportunity to reshare — and as Silverman points out, that drives traffic.

Silverman doesn’t spell out the shape of this claim page, but I’m thinking of two options. One: each article on the given topic is linked into (and receives links out from) a “hub page” that shows the development of the reporters’ investigation, with a prominent hyperlinked box at the top of every news story pointing readers to the latest state of that investigation.

Or in a more streamlined fashion, maybe there aren’t new articles on the topic — maybe there’s one evolving article. To my mind, this goes way beyond just rumor-checking stories. It means moving from the outmoded and now relatively useless idea of a static article, in which we pretend that every story gets set in hot metal and therefore will never, ever be altered, to a system of constantly updated pages — whose changes, by ethical necessity, must also be completely transparent and easily comprehended. (And to prevent link rot and reference rot, you would need an easy way for people to link to a particular version of the page, pinned to a particular date.)
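To make the evolving-article idea concrete, here’s a minimal sketch in Python — all names and the storage scheme are hypothetical, not any outlet’s real system. The point is that every substantive change is kept as a dated, annotated revision, so a permalink can be pinned to the version that was live on a particular date and never rots:

```python
from bisect import bisect_right
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Revision:
    timestamp: datetime
    body: str
    change_note: str  # transparent summary of what changed and why

@dataclass
class LivingArticle:
    slug: str
    revisions: list = field(default_factory=list)  # kept in publish order

    def publish(self, body, change_note, when):
        """Record a new revision; the old ones stay visible forever."""
        self.revisions.append(Revision(when, body, change_note))

    def as_of(self, when):
        """Return the revision that was current at `when`, so a dated
        permalink (e.g. /story/some-slug?as_of=2015-02-10) stays stable."""
        stamps = [r.timestamp for r in self.revisions]
        i = bisect_right(stamps, when)
        if i == 0:
            raise LookupError("no revision existed at that date")
        return self.revisions[i - 1]
```

A reader following an old link gets exactly the text that link originally pointed to, plus the change notes showing how the story evolved since.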


We’re no longer hemmed in by this. Photo credit: The Original Movable Type via photopin.


There are many ways to do this (perhaps most naively, dare I suggest we bring back the idea of a time axis for HTTP?). Some experimentation on this began ages ago and already feels old hat — I’m thinking in particular of the Guardian’s Live pages, which are great for conveying the latest on quickly developing news, but aren’t the best format for giving people a quick overview of the most pertinent facts. Anyway, news organizations have grown complacent in this arena, and there is much more that can be done.
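The “time axis for HTTP” idea isn’t purely hypothetical: the Memento protocol (RFC 7089) standardizes datetime negotiation over HTTP, where a client sends an Accept-Datetime header and a “TimeGate” redirects to the version of the page that was live at that moment. A minimal sketch of building that header (the function name is mine, not part of any library):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def memento_headers(when: datetime) -> dict:
    """Build the RFC 7089 Accept-Datetime header asking a Memento
    TimeGate for the version of a page that was live at `when`.
    The value must be an RFC 1123 date in GMT."""
    if when.tzinfo is None:
        when = when.replace(tzinfo=timezone.utc)  # assume UTC for naive datetimes
    return {"Accept-Datetime": format_datetime(when.astimezone(timezone.utc), usegmt=True)}
```

A news site that implemented this could let any citation pin itself to a dated version of a story, solving reference rot at the protocol level rather than per-site.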

Corrections suck… but don’t have to

ALERT! ALERT! Corrections should be inescapable.

Corrections were never a particularly effective vehicle, and their minuscule power has arguably diminished still further as the demand for them has increased. In the past, corrections hid in a little box around page A2 or so of your paper. Maybe you stumbled over that box; maybe you didn’t. Now the situation is even worse: without a physical paper in hand, there’s little chance of anyone accidentally casting an eye over the equivalent of page A2.

Yet our potential to get corrections to readers is so much greater now than 20 years ago. We have the technological tools to alert readers who’ve read the articles in question. Why don’t we harness this? There are so many ways it could work. A couple that spring to mind:

  • When you register for a site like the New York Times, you agree that the site will keep a log of what pages you visit so that, when something is corrected, it can ping you to let you know (you could select an email or a social media alert — or the news outlet could experiment to see what works best). Or maybe just by using the site (without registration) you have to accept these pings, as we accept cookies today.
  • Major news organizations could collaborate to launch a one-stop corrections notification shop. (Or a third party could build one with news-org buy-in.) This app or plug-in would be voluntarily downloaded by the user, would track her news consumption, and would compare it to a growing online corrections bank populated by the participating news orgs.
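Either mechanism boils down to the same matching step: intersect a reader’s consumption log with a shared corrections bank. A toy sketch, with all names and data hypothetical:

```python
def corrections_for_reader(reading_log, corrections_bank):
    """Return {url: notice} for every corrected article the reader saw.
    `reading_log` is the set of article URLs this reader has visited;
    `corrections_bank` maps corrected URLs to correction notices pooled
    from participating outlets. (Hypothetical schema, not a real API.)"""
    return {url: corrections_bank[url]
            for url in reading_log if url in corrections_bank}
```

The hard parts, of course, are not the matching but the privacy design (who holds the reading log?) and getting competing outlets to feed one shared bank — which is why a trusted third party may be the likelier builder.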

Let’s talk

Those are just a few of the idea wheels I’ve got spinning in the wake of this very important (and may I add, eminently readable) report. Systematic research and creative innovation are both, sadly, such rare beasts in journalism today — but they don’t have to be. Share your thoughts, and let’s move this conversation forward!

Cross-posted to Medium.
