
Archive for the ‘Skepticism’ Category

Burzynski outside his SOAH hearing last November. (Credit: Tamar Wilner)

Two Texas judges released their findings Oct. 12 in the case of Stanislaw Burzynski, the controversial Houston cancer doctor who’s been accused of deceptive advertising, making medical errors, and treating patients with untested combinations of drugs that amounted to “experimentation.” On most of these charges, the judges failed to find enough evidence of wrongdoing to discipline Burzynski.

The State Office of Administrative Hearings did find, however, that Burzynski is subject to sanctions for:

  • Aiding and abetting the unlicensed practice of medicine by one practitioner;
  • Failing to properly supervise three unlicensed physicians who misrepresented themselves as doctors;
  • Inaccurately reporting the size of one patient’s tumor, causing a misclassification of the patient’s response to treatment;
  • Failing to gain properly informed consent from several patients;
  • Failing to disclose his ownership interest in the pharmacy that dispensed patients’ drugs;
  • Failing to maintain adequate records to support charges made to four patients.

The conclusions from the judges at Texas’ State Office of Administrative Hearings (SOAH) are not binding, but come in the form of recommendations to the Texas Medical Board. It’s the usual practice for the board to accept and implement SOAH recommendations.

A hot take

I honestly didn’t know what conclusion to expect in this case, despite the many months I spent researching it for my Newsweek article last February. Burzynski has faced sanctions many, many times before, and it’s never impeded his practice much. So I don’t find myself particularly shocked by this result.

From my quick skim of the findings, it seems that SOAH failed to find against Burzynski on some of the most important charges because of missteps by the Texas Medical Board (who acted as the prosecution).

The board’s expert witness on the standard of care allegations, such as the use of untested drug combinations, was Dr. Cynthia Wetmore of Children’s Healthcare of Atlanta. But the judges found her testimony “troubling” when she accused Burzynski of violating the standard of care for Patient D — a patient who never received treatment at the clinic. The judges also noted that Wetmore is a pediatric oncologist in an academic, rather than private practice, setting.

They therefore concluded, “Dr. Wetmore’s expertise about the treatment of adult cancer and the standard of care used by physicians in private practice is limited and will be given little weight.”

That must have been a harsh blow for the medical board.

Has Burzynski “saved lives”?

The other surprising finding was this: “Respondent’s [Burzynski’s] treatments have saved the lives of cancer patients, both adults and children, who were not expected to live.”

With the caveat, again, that I have only skimmed the decision, it’s hard to see what reliable scientific evidence the judges had for this claim. Certainly I’ve never come across any. Yes, there are dozens of patients who claim that Burzynski did exactly that — save their lives. But did the judges really base this conclusion on anecdote alone?

The judges further listed, among “mitigating factors” meant to inform eventual sanctions against Burzynski:

If Respondent is unable to continue practicing medicine, critically ill cancer patients being treated with ANP [antineoplastons, the drug that has been the focus of Burzynski’s research] under FDA-approved clinical trials or a special exception will no longer have access to this treatment.

Amazing that after decades of research, with not a single randomized controlled trial demonstrating the effectiveness of ANP, the judges still see the loss of such drugs as some kind of harm to public health.

Note: The decision can be viewed by visiting http://www.soah.texas.gov/, clicking “Electronic Case Files,” then “Search Public Case Files,” then searching under Docket Number 503-14-1342.

Cross-posted from Medium, with minor alterations.

Read Full Post »


There is so much to love in Craig Silverman and the Tow Center’s new report, “Lies, Damn Lies and Viral Content” — from the very first sentence.

“News organizations are meant to play a critical role in the dissemination of quality, accurate information in society.”

Indeed! I feel a bit like that dorky kid who plays Dungeons & Dragons on his own for years, until he finds out that there’s actually a small D&D group that meets in someone’s basement. Millions of words have been written about the news media’s struggle for viable business models in an online world, but very few people are saying that:

  1. There’s a lot of misinformation out there,
  2. That matters because the news media’s job is to inform, so
  3. Even if social media content didn’t itself make its way into newspapers (which it does), papers have a responsibility to correct the public record and improve the state of public knowledge.

More data, more emotion


Silverman and his team have done great research here, driven by the data captured through their rumor-tracking tool, Emergent. There’s some eye-opening stuff on news outlets’ love of misleading headlines, and a handy list of recommendations for newsrooms.

Other insights about debunking needs that really grabbed me:

  • “Debunking efforts in the press are not guided by data and learning drawn from work in the fields of psychology, sociology, and political science… An evidence-based, interdisciplinary approach is needed.” Hear, hear.
  • There are problems inherent in debunking the person, rather than the idea. I wonder: where does this leave the major political fact-checking sites (i.e. PolitiFact, FactCheck.org, Washington Post Fact Checker)?
  • Viral hoaxes appeal to emotion, but so can debunking stories. For example, most stories about a rumored pumpkin-spice condom made it clear that Durex was planning no such product — but the stories still managed to be eye-catching and funny.
  • Hoaxes with geographic focus can inspire action — this indicates that we would be wise to foster ever more local fact checking.
  • Silverman’s report is the first major work on journalistic fact-checking that I’ve seen bring in major voices from the skeptic movement (such as Doubtful News and Tim Farley of What’s the Harm). As a participant in both communities I often link these efforts in my mind, but have seen few others do so — and I think there is so much useful engagement that could happen between fact-checking journalists and skeptics.

Update, archive and correction fails

But what I want to focus on is the problem of updates and corrections, the persistence of web content and the double-edged sword that is the online news archive.

Silverman — who is, after all, an authority on corrections — notes several major problems in news outlets’ updates to rumor-based stories:

A particularly egregious example of the news media’s failure to update rumor articles. From Lies, Damn Lies and Viral Content, by Craig Silverman of Columbia Journalism School’s Tow Center for Digital Journalism.
  1. The updates don’t happen very often. Silverman and his team analyzed six of the claims that they tracked on Emergent, comparing the number of news organizations that originally covered each claim to the number that followed up with a resolved truth-status. The percentage that followed up varied tremendously, but on average it was only slightly more than 50 percent.
  2. Such stories are often updated badly. Most notably, many news outlets will update the body text and then simply slap the word “Updated” on the headline — which results in headlines that make a rumor sound true, even when the body text makes clear that the rumor’s been debunked.
  3. Readers probably won’t see the update. “Obviously, there is no guarantee readers will come back to an article to see if it has been updated,” Silverman says — indeed, there’s very little guarantee, and very little chance.
  4. Mistaken articles persist — forever. “…Online articles exist permanently and can be accessed at any time. Search, social, and hyperlinks may be driving people to stories that are out of date or demonstrably incorrect,” Silverman writes. Even though news organizations followed up on a rumor with a debunking story roughly half the time, they did so by writing a new story. “This means the original, often incorrect story still exists on their websites and may show up in search results,” Silverman points out. Rarely were follow-up links added to the initial article.

Why are we in this mess?

and

How can we innovate out of it?

I think these concerns all tap into some major ways in which newspapers have failed to adapt to and take advantage of their new digital homes — and ways they can push forward:

Fish wrappers no more


The best use for error-filled stories. “Fish n chips” by Canadian Girl Scout. Licensed under Public Domain via Wikimedia Commons.

 

First, news outlets forget that old stories don’t disappear from the public consciousness like, well, yesterday’s newspaper. If only last week’s and last month’s articles could still be wrapped around take-out fish and chips, the oily residue rubbing out hastily written paragraphs and unwarranted presumptions. Those days are gone. Not only do the archives linger on newspapers’ websites, but they’re linked to by other pages that will probably never die, and they’ll be brought up by the right combination of words in a Google search.

Of course, I’m being a bit facetious claiming that archives are simply a liability. They also represent an enormous opportunity (more on that below). But right now, archives are a double loss for newspapers: their liability is unmitigated and their potential is untapped.

The news encyclopedia

Which brings us to: most newspapers lack a smart strategy for leveraging and commoditizing their archives.


Putting aside for a moment our acknowledgment that archive stories could have been wrong to begin with, or could now simply be outdated, they still contain vast amounts of useful information that is not being used and will hardly be seen. If properly checked, coded and collated, a newspaper’s archives on a particular event would form a powerful living encyclopedia entry that would rival Wikipedia for accuracy and completeness.

If thinking about coding and collating the New York Times’ 13 million articles back to 1851 is too overwhelming, think about this: every story we write today is part of tomorrow’s archive. The best way to build that future archive into something useful and informative is to follow Silverman’s advice, that “asking reporters and producers to own a particular claim and maintain a page is a reasonable request.” Making substantial updates means an opportunity to reshare — and as Silverman points out, that drives traffic.

Silverman doesn’t spell out the shape of this claim page, but I’m thinking of two options. One, each article on the given topic is linked into (and receives links out from) a “hub page” that shows the development of the reporters’ investigation, with a prominent hyperlinked box at the top of every news story pointing readers to that hub — something like, “This article is part of an evolving investigation; see the full story so far.”

Or in a more streamlined fashion, maybe there aren’t new articles on the topic — maybe there’s one evolving article. To my mind, this goes way beyond just rumor-checking stories. It means moving from the outmoded and now relatively useless idea of a static article, in which we pretend that every story gets set in hot metal and therefore will never, ever be altered, to a system of constantly updated pages — whose changes, by ethical necessity, must also be completely transparent and easily comprehended. (And to prevent link rot and reference rot, you would need an easy way for people to link to a particular version of the page, pinned to a particular date.)

 

We’re no longer hemmed in by this. Photo credit: The Original Movable Type via photopin.

 

There are many ways to do this (perhaps most naively, dare I suggest we bring back the idea of a time axis for HTTP?). Some experimentation on this began ages ago and is already feeling old hat — I’m thinking in particular of the Guardian’s Live pages, which are great for conveying the latest on quickly developing news, but aren’t the best format for giving people a quick overview of the most pertinent facts. Anyway, news organizations have got complacent in this arena, and there is much more that can be done.
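The “time axis” idea has a concrete incarnation, in fact, in the Memento protocol (RFC 7089): a client asks a “TimeGate” for a page as it existed at a given date by sending an Accept-Datetime header. Here’s a minimal sketch in Python, assuming the Internet Archive’s Wayback Machine as the public TimeGate; the article URL is a made-up placeholder.

    import requests

    # A minimal Memento (RFC 7089) lookup: ask a TimeGate for the version of
    # a page nearest a given date. The article URL is hypothetical.
    article = "http://example.com/2015/02/19/rumor-story"
    timegate = "https://web.archive.org/web/" + article

    resp = requests.get(
        timegate,
        headers={"Accept-Datetime": "Thu, 19 Feb 2015 09:00:00 GMT"},
        allow_redirects=True,
    )
    # The final URL is a date-pinned snapshot that can be cited without fear
    # of the page silently changing underneath the link.
    print(resp.url)

A news site that supported this natively would go a long way toward solving link rot for its evolving pages: every citation could pin to a dated version.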

Corrections suck… but don’t have to

ALERT! ALERT! Corrections should be inescapable.

Corrections were never a particularly effective vehicle, and their minuscule power has arguably diminished still further as the demand for them has increased. In the past, corrections hid in a little box around page A2 or so of your paper. Maybe you stumbled over that box, maybe you didn’t. Now the vehicle is even more flawed: without a physical paper in hand, there’s little chance of anyone accidentally casting their eye over the equivalent of page A2.

Yet our potential to get corrections to readers is so much greater now than 20 years ago. We have the technological tools to alert readers who’ve read the articles in question. Why don’t we harness this? There are so many ways it could work. A couple that spring to mind (a sketch of the matching logic follows the list):

  • When you register for a site like the New York Times, you agree that the site will keep a log of what pages you visit so that, when something is corrected, it can ping you to let you know (you could select an email or a social media alert — or the news outlet could experiment to see what works best). Or maybe just by using the site (without registration) you have to accept these pings, as we accept cookies today.
  • Major news organizations could collaborate to launch a one-stop corrections notification shop. (Or a third party could develop one, with news-org buy-in.) This app or plug-in would be voluntarily downloaded by the user, track her news consumption, and compare it to a growing online corrections bank populated by the participating news orgs.
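To make that second idea concrete, here’s a minimal sketch of the matching logic such a service would need. Everything here (the data shapes, the names) is hypothetical; it’s just one way the pieces could fit together.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Correction:
        article_id: str
        corrected_at: datetime
        summary: str

    # The reader's page-view log: article_id -> when she last read it.
    read_log = {"post-2015-02-19-rumor": datetime(2015, 2, 19, 8, 30)}

    # The shared corrections bank populated by participating news orgs.
    corrections_bank = [
        Correction("post-2015-02-19-rumor", datetime(2015, 2, 20, 12, 0),
                   "This story repeated a rumor that has since been debunked."),
    ]

    def pending_alerts(read_log, bank):
        """Return corrections issued after the reader saw the article."""
        return [c for c in bank
                if c.article_id in read_log
                and c.corrected_at > read_log[c.article_id]]

    for c in pending_alerts(read_log, corrections_bank):
        print(f"Correction to something you read: {c.summary}")

The hard parts, of course, are privacy and adoption, not the matching itself.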

Let’s talk

Those are just a few of the idea wheels I’ve got spinning in the wake of this very important (and, may I add, eminently readable) report. Systematic research and creative innovation are both, sadly, such rare beasts in journalism today — but they don’t have to be. Share your thoughts, and let’s move this conversation forward!

Cross-posted to Medium.

Read Full Post »

Note: I have joined the “virtual class” component of Dan Kahan‘s Science of Science Communication course at Yale University. As part of this I am endeavoring to write a response paper in reaction to each week’s set of readings. I will post these responses here on my blog – my response for week five is below. Week 1 is here, and week 2 is here. (I was away for weeks 3 and 4.)

I will also be participating in the discussion on Kahan’s own blog.


This week I seek to examine several myths that Kahan and Discover blogger Keith Kloor say news media have perpetuated in the wake of the Disneyland measles outbreak.

Trends in MMR immunization

MMR immunization rates

The most easily disprovable myth is that measles, mumps and rubella (MMR) immunization is falling. (This myth has been perpetuated by, among others, the Los Angeles Times.) The best source for this data is the CDC’s National Immunization Survey, whose data show that since 1999, the percentage of children aged 19-35 months who have received the MMR vaccination has held roughly steady, at between 90 and 93 percent (see chart). The slight fluctuations in that range have shown no particular trend.

While national immunization rates are important, more localized rates are crucial, because pockets of low immunity can allow outbreaks to take hold. As the CDC notes, MMR coverage was below the 90 percent threshold in 17 states in 2013, and that definitely presents a problem. But is it a growing problem? I used CDC data to put together a quick spreadsheet of five- and ten-year trends, state by state, and at a glance found many states whose rates have dropped – but not by a statistically significant margin. The confidence intervals here are pretty wide. (If anyone has done a serious trend analysis, I’d love to see it.)
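For anyone who wants to run the same sanity check, this is roughly the test I mean: a simple two-proportion z-test on a state’s coverage estimates from two survey years. The figures below are illustrative, not actual NIS numbers.

    from math import sqrt

    # Illustrative (not actual NIS) figures: a state's estimated MMR coverage
    # and effective sample size in two survey years.
    p1, n1 = 0.93, 300   # coverage and sample size in the earlier year
    p2, n2 = 0.90, 300   # coverage and sample size five years later

    # Two-proportion z-test for the difference in coverage rates.
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    print(f"z = {z:.2f}")  # about 1.32 here: below 1.96, so not significant

With samples of a few hundred per state, even a three-point drop doesn’t clear the 5-percent significance bar, which is exactly why those wide confidence intervals matter.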

Are parents concerned about vaccine safety? What does that mean?

In a January 22 piece on the Washington Post’s Wonkblog, Christopher Ingraham blames “the anti-vaccine movement” for the worrying rise in measles cases, citing an AP-GfK survey that found, as Ingraham puts it, “only 53 percent of Americans were confident that vaccines are safe and effective.” For a start, that’s a pretty big misrepresentation of the survey, in which 53 percent were very or extremely confident that childhood vaccines are safe and effective. Another 30 percent were somewhat confident.

In any case, Kahan argues that the AP-GfK survey isn’t a good measure: “Indeed, no public opinion survey of the general public can give anyone useful information on vaccine risk concerns. The only valid evidence of that generally is the National Immunization Survey, which uses actual vaccine behavior to determine vaccination rates,” he told Kloor.

I think we can agree that the NIS represents the best way to measure what proportion of parents are affected by concerns or other factors strongly enough to substantially affect their children’s immunization program. After all, if the concern is strong enough to actually affect vaccination outcomes (and surely those are the concerns we’re most interested in) then we should see that in a measure of vaccination outcomes!

However, there are several important things the NIS can’t tell us. Notably, it doesn’t give insight into the reasons behind non-vaccination. We are right to ask what economic, psychological and social factors are behind parents’ failure to immunize against measles, particularly in geographic or socioeconomic pockets that fall below target immunity, because knowing the causes of missed immunizations will help us formulate the best science communication response.

One study that has engaged with this question is Dempsey et al, “Alternative Vaccination Schedule Preferences Among Parents of Young Children.” The results of this study have been misrepresented by the Advisory Board Company and subsequently by the Post’s Petula Dvorak – the latter, for example, says “one in 10 parents are avoiding or delaying vaccines in their children because of safety concerns.”

Dempsey’s findings are more complex. She found that 13 percent of parents of young children reported using an alternative vaccination schedule (that is, they reported that they did not completely adhere to the CDC vaccination schedule). Of these, a strong majority of 82 percent – but by no means all – agreed that, “Delaying vaccine doses is safer for children than providing them according to the CDC-recommended vaccination schedule.” Some parents who followed an alternative vaccination schedule may have done so because of difficulty accessing medical care, or because they simply failed to immunize on time.

I would note Dempsey’s small-ish sample size of under 800 (compared to over 13,000 in the CDC’s survey), and also potential motivated reasoning on the part of her respondents. That is, a parent whose original reason for delaying vaccine was a lack of time or simply forgetfulness might convince himself retroactively that the reason was “delaying vaccine doses is safer for children.” But, even given all these caveats, I think Dempsey makes a decent case for the power of safety concerns to drive non-vaccination – so I would be interested to hear any rebuttals.

Who doesn’t vaccinate, and why?

Having said that, safety concerns are but one part of a complex set of drivers – and some of the most destructive myths around non-vaccinators are about what drives them and, even more potently, who they are.

Many commentators have described non-vaccinators as “anti-science” or lacking “trust” in science and medicine. For the most part these are labels that the non-vaccinators would not themselves recognize. I could write another post entirely on whether that matters, but let me sum up for now by saying it does matter, at the very least, because such language is polarizing and alienating. For example, an NPR caller challenged Paul Offit on what she saw as his presumption that non-vaccinators are “yahoos who just don’t look at the scientific process at all.” Did she show a less than scientific mindset when she rejected Offit’s explanations as “pap,” arguing that “a two-year-old cannot accept this kind of chemical onslaught”? Perhaps. But she doesn’t see herself as rejecting science. She sees herself as a critical thinker – indeed, as she says, “an educated adult.”

As if that weren’t polarizing enough, we now have several stereotypes emerging about who non-vaccinators are: either rich hippies, religious nuts, or conspiracy theorists. As Dvorak writes: “The fringe who didn’t believe in medicine for religious and other reasons has exploded into a 10 percent, largely yuppie epidemic.” In the same article, George Washington University public health professor Alexandra Stewart says the non-vaxxers are primarily “white, educated populations of people with computers.”

The reality is much less homogeneous – and less ideological. First, on a broad scale, MMR vaccination rates are roughly equal across most racial lines (91.5 percent for whites, 90.9 percent for non-Hispanic blacks, 92.1 percent for Hispanics) and a little higher for Asian-Americans (96.7 percent) and American Indians (96.3 percent). Second, as Kahan found in his study of 2,316 adults, Vaccine Risk Perceptions and Ad Hoc Risk Communication: An Empirical Assessment, “There is no meaningful correlation between vaccine risk perceptions and the sorts of characteristics that usually indicate membership in one or another cultural group.” These group measures included a sliding scale of political orientation (liberal-conservative, Democrat-Republican) as well as two latent measures of risk perception: tendencies to perceive risk to public safety and to social deviancy. (These risk perception measures were generated from subjects’ perception of risk from a variety of other issues, including climate change, marijuana legalization and gun ownership, and correlate with the common Hierarchy-Egalitarianism and Individualism-Communitarianism worldview scales.)

I would also point to some interesting work done by Yvonne Villanueva-Russell, who in extensive interviews found a variety of motivations for those who failed to vaccinate their children to the CDC schedule. Some of these parents seemed to fit the stereotypes (“crunchy mommas,” “ideologues”) but others simply put off making a decision for too long, or have children with health issues. Her sample size of 67 parents is small, but Villanueva-Russell’s work gives us some idea of the range of motivations we should take into account when designing further research – and before we speak out of hand about “anti-vaxxers.”

Read Full Post »

I’ve just returned from The Amazing Meeting, a gathering for people who want to promote critical thinking and science literacy. The event, organized by the James Randi Educational Foundation, was a great place to make connections and gain a fresh perspective as I seek to brainstorm ways to reduce misinformation in the media.

This year’s theme was Skepticism and the Brain, so a lot of talks brought up psychological processes that perpetuate misinformation – things like confirmation bias and the backfire effect. But the intersections between critical thinking, science communication and journalism go well beyond that. Some highlights and reflections:

  • Chris Guest, in a short paper, discussed some of the pitfalls of Bayesian analysis, and suggested that this kind of misunderstanding of probability may have fuelled the fire of AIDS denialism (a worked example follows this list). This area seems ripe for more exploration, and there is much work to be done on improving journalists’ understanding of probability – especially as story choices are so often based on ideas of the “unusual” (e.g. “man bites dog”).
  • Donald Prothero, speaking on the back of his recent book Reality Check: How Science Deniers Threaten Our Future, rejected the idea that “there’s woo on the left and woo on the right – they’re just the same.” Only the GOP, he contended, adopts rejection of science as its official policy. He also offered some hope for climate change communicators, saying recent polls have shown that 60-80% of Americans have come to accept that climate change is real. Hard-core climate deniers, he said, are at most 10-20% of the population – but they are the loudest, and they control politics.
  • Scientific American editor-in-chief Mariette DiChristina took us through a history of notable articles in the publication – including a 1959 piece about the connection between carbon dioxide and climate. One member of the audience asked about an amateur scientist column that SciAm used to run, and interestingly DiChristina seemed to share his regret at its cancellation: “Boy did we blow it, because look at the maker movement today.” She said the SciAm website is exploring more ways to get the public engaged in science. Currently, the main methods are through its Citizen Science channel which links to crowdsourced research projects such as Galaxy Zoo and FoldIt; and Bring Science Home activities, which parents can use with their children. But it sounds like before too long, we might see new initiatives.
  • Sharon Hill, the creator of Doubtful News and the Media Guide to Skepticism as well as the Sounds Sciencey column for the Committee for Skeptical Inquiry, said that she tries to make her brand positive, not positioning herself as a “debunker” or “skeptic.” She has found traction for her science-based news by following article trends and playing to SEO. For example, Hill says she was the only person who wrote skeptically about an article drawing a link between MSG and autism. This is a great example of where news’ obsession with “balance,” so often an editorial pressure that leads readers away from the truth, can actually work in favor of the truth. Reporters are looking for someone to take a counter-position, and skeptics can help to fill that gap, thereby helping to bring scientific points of view to the public.
  • Karl Kruszelnicki (Dr. Karl), a scientific polymath often described as the “Australian Bill Nye,” described journalists as being so overloaded with information that they can’t concentrate. It certainly did comfort me to feel I’m not alone in this regard – and to know I’m not as badly affected as some reporters, whom Karl said couldn’t even sit down and read a novel anymore!
  • Finally, I was gratified to hear that many people are interested in my idea for a Misinformation Science wiki: a website that will summarize the major psychological and industry factors helping to perpetuate the spread of misinformation in the media, and various efforts to combat this problem. I hope to have the beta version ready in a few weeks. Feel free to message me if you’d like to be notified when this goes live.
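On the Bayesian point above: the classic trap is ignoring the base rate when interpreting a positive test result. A worked example, with illustrative figures rather than actual test statistics:

    # Base-rate illustration: even a highly accurate test yields many false
    # positives when the condition is rare. All figures are illustrative.
    prevalence = 0.001    # 0.1% of the tested population is infected
    sensitivity = 0.999   # P(test positive | infected)
    specificity = 0.999   # P(test negative | not infected)

    p_positive = (prevalence * sensitivity
                  + (1 - prevalence) * (1 - specificity))
    ppv = prevalence * sensitivity / p_positive
    print(f"P(infected | positive test) = {ppv:.0%}")  # 50% in this example

The denialist move is to leap from “a single positive in a low-risk group can be a coin flip” to “the tests mean nothing,” ignoring that risk factors and confirmatory testing change the base rate entirely.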

It was a highly enjoyable and thought-provoking conference, and I encourage any and all of my fellow attendees to get in touch with me to continue the conversation. See you at TAM 2015!

photo credit: “lapolab” via photopin cc

Read Full Post »

Good TLDR podcast this week (as every week) from On The Media. Alex Goldman and PJ Vogt looked at two attempts to slow the spread of viral hoaxes on the web: Paulo Ordoveza’s PicPedant Twitter feed, and Adrienne LaFrance’s Antiviral column on Gawker.

My favorite bit was this exchange:

LaFrance: I think as humans we probably care more about stories and storytelling than truth, on some level.

Vogt: You’re always going to be a niche thing, a little bit, right? The truth market’s always going to be smaller than the story market.

LaFrance: Right. I think that at the same time if you can get people to stumble upon it and they start to think about the way they consume information differently — we’re having to train people to be a little more nuanced in this digital age, and I think that is going to make everybody smarter, actually.

TLDR also made reference to Charlie Warzel’s January BuzzFeed column, in which he argued that 2014 would be the “year of the viral debunk.” I have serious doubts about that glib headline. Yes, new models for debunking misinformation are developing. I think they will pick up steam in the next few years, and that will be an exciting development indeed. But I wouldn’t expect a tidal wave of skepticism before December.

I did appreciate Warzel’s rundown of recent debunkings, and especially this tidbit from The Atlantic’s Alexis Madrigal, which mirrors LaFrance’s thoughts above:

Debunking jobs shouldn’t just do the basic stuff — checking out where the media came from, looking at the accounts on which it was posted, tracing the route it took to popularity — but teach people how to do it themselves. Once you’ve done it a couple times, the skepticism becomes reflexive. (And bonus: When something turns out to be real, you like it even more because you haven’t been duped by bullshit over and over).

One thing didn’t ring true in the podcast – why do Alex and PJ talk as if attempts to counter web-borne misinformation are only just beginning? It seems like the fact-checking sites, and Snopes before them, represent the first iterations of this phenomenon.

photo credit: Alex Bellink via photopin cc

Read Full Post »

Is the water safe to drink? It seems such a simple question. Yet the people of West Virginia have been given no easy answers. Lax chemical regulation is the ultimate cause – but poor science communication, particularly on the part of public officials, has made matters worse.

First, the background: on January 9, a substance called crude MCHM spilled from a Freedom Industries facility into the Elk River. Officials declared the local water unsafe for any use other than flushing the toilet, and this guidance remained in place for several days.

Now, a month later, public officials seem confused as to what they should be saying.

“Everybody has a different definition of safe,” said Letitia Tierney, commissioner of the state Bureau for Public Health, according to USA Today. “I believe the water, based on the standards we have, is usable.”

(Note that Al Jazeera’s version of that second sentence sounds a little more positive: “You know, I believe the water, based on the standards we have, is usable for every purpose, and that includes drinking, bathing and cooking.”)

The federal Centers for Disease Control and Prevention has sent its own mixed messages, first advising that pregnant women not use the water, then saying all residents can drink and bathe in the water (although an advisory based on the earlier guidance is still up on a state website, The Nation says).

Meanwhile, residents are understandably confused and untrusting. Many say they’re still not drinking the water, and many also complain that it retains the licorice smell associated with the spilled chemicals.

Why is it so hard to say if the water is safe? First and foremost, because we really don’t know for sure. The better reporting and analysis has looked into what studies we have on 4-MCHM (the primary component in crude MCHM – but more on this later), and it’s pretty scant. In fact, before the spill occurred, no one had determined a safe exposure limit for humans.

State officials have not been able to give a consistent and coherent story about how they came to the exposure limit of 1 part per million that they have been using since the spill. According to the Charleston Gazette – which ran a pretty exhaustive story on this – it was pretty much the work of one man, in one hour. And he based his estimate on one study that used just 40 rats, according to NRDC’s Jennifer Sass.
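For context, the standard screening-level arithmetic goes something like the sketch below. The inputs are assumptions for illustration, not the official figures; only the method (a no-observed-adverse-effect level from the animal study, divided by uncertainty factors, then scaled to body weight and water intake) is the textbook one.

    # Textbook screening-level derivation. All inputs are illustrative
    # assumptions, not the official MCHM figures.
    noael = 100.0              # mg/kg/day: no-observed-adverse-effect level
    uncertainty_factor = 1000  # 10x animal-to-human, 10x human variability, 10x data gaps
    body_weight = 10.0         # kg: a small child
    water_intake = 1.0         # L/day

    rfd = noael / uncertainty_factor          # reference dose, mg/kg/day
    limit = rfd * body_weight / water_intake  # mg/L, i.e. parts per million
    print(f"Screening level: {limit} ppm")    # -> 1.0 ppm

Notice how much hangs on a single NOAEL from one small study; every input upstream of that division is a judgment call.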

So far, so scary. It bears mentioning that the reason we don’t know much about MCHM – or the tens of thousands of other chemicals in use today – comes down to the US’s lax system of chemical regulation. Basically, there’s a catch-22 – the EPA won’t require studies unless the chemical is of concern, and it doesn’t know if the chemical is of concern without the studies.

But even keeping in mind how in-the-dark officials were here, the poor communication with the public is rather shocking. Agencies such as the CDC should know better than to send mixed messages on public health. Admittedly, explaining risk is difficult, especially given low levels of science literacy. But the on-again, off-again nature of agencies’ warnings will probably leave thousands of people forever distrustful of what they hear said in the name of public health.

As EDF senior scientist Richard Denison says, “This episode should serve as a case study in how not to handle both public health and public trust in the aftermath of a chemical spill.”

One final note: the media, as usual, bears at least some responsibility in this affair. I’d read about a dozen articles on the spill before I realized that Freedom Industries hadn’t spilled pure 4-MCHM – nomenclature which, as the chemically literate will know, denotes a single compound – but crude MCHM, which comprises seven chemicals. Again, the Charleston Gazette did good work on this – pointing out that the rat study in question only tested 4-MCHM.

More questions for public officials to answer – and please, this time, clearly.

Read Full Post »

I’ll be at NECSS!

Just a quick note to say I’ve registered for the Northeast Conference on Science and Skepticism, April 11-13 in New York City. Very excited for my first skeptical event – I’ve been toying with going for years. I’m interested to see what they come up with for the Science Communication workshop stream, and most of all in discussing ideas with my fellow attendees. Perhaps a journalists’ tweet-up would be in order?

Read Full Post »
