Credit: https://www.flickr.com/photos/porsche-linn/

“You can’t correct fake news!” “People don’t care!” Research findings about misinformation this week have made for somewhat depressing reading, especially if we’re too glib in our summaries.

But I would urge anyone who cares about news and truth not to fall into a pit of despair, just yet. Misinformation science is still very young. Many of the results we saw this week were first attempts to answer very large and vexing questions.

That means these results are far from conclusive. And even if we could be certain that these findings are correct, there are still many unanswered questions. Within these gaps may yet lie the path towards addressing our modern information conundrum.

What do these latest findings tell us?

A study by Gordon Pennycook and David Rand found that tags on Facebook showing stories as “disputed by 3rd party fact-checkers” — one of the platform’s most prominent anti-misinformation initiatives — only had a modest effect on readers’ evaluation of the stories’ accuracy. At the same time, the study revealed a serious unintended consequence: When some fake stories were tagged as disputed, participants were more likely to believe the un-tagged stories were true (compared with participants in a control condition, who saw no tagging whatsoever). This side effect was particularly strong among Trump supporters and 18- to 25-year-olds.

It seems that since flags are applied to some, but not all, stories, many people then infer that any story without a flag is true.

So far, so worrying. Still, as Politico points out, there are a number of reasons not to throw our hands up in defeat. The “disputed” tag is just one part of Facebook’s efforts, which also include adding fact-checks to “related articles” modules, shutting down fake accounts (not just Russian ones) and reducing financial incentives for fake news and other misinformation.

Then there’s news literacy. A Facebook spokesman said the platform’s efforts also include “helping people make more informed choices about the news they read, trust and share” — but Facebook has done little such work. (Back in April, it seeded people’s news feeds with PSAs whose tips could be a lot more helpful than they are.)

In fact, so far, very few actors have made any serious attempts to improve adult news literacy. This is frustrating, because the work is crucial. After all, if people so easily assume that stories without a disputed tag must be true, that demonstrates a major gap in understanding – not just in the very specific matter of what fact-checkers are capable of, but in how every member of our citizenry approaches every article they read. The default reading mode needs to be something like “open, but critical” — not “swallowing whole unless I’m told otherwise,” and also not “believing nothing.” (Disclosure: I’m the co-creator of Post Facto, a game designed to teach adults news literacy skills.)

You can grumble that it’s a lost cause, or you can see this as one of the biggest social challenges of our generation, and get to brainstorming solutions. I choose the latter.

Do people want news literacy training?

Oh, but what’s this? A Pew study, also out this week, uses survey data from 3,015 adults to outline a five-fold typology of the modern American, based primarily on a) their level of trust in information sources and b) their interest in learning, especially about digital skills. Pew estimates that about half of U.S. adults fall into the Doubtful and Wary typologies — and these people are relatively uninterested in building their news literacy skills.

What’s more, another 16% of Americans fall in the Confident group — and as their name implies, these highly educated folks don’t feel they need news literacy training. They could well be wrong, given the power of the Dunning-Kruger effect. And while I’m not aware of studies that specifically investigate the correlation between educational level and misperceptions, there’s evidence that high levels of numeracy and science literacy are no protection against misinformation about science.

To quote Eleanor Shellstrop, “Oh, fork.”


Credit: goodplacegifs.tumblr.com / NBC Universal

But wait! What does the data really say? How did the researchers ask about people’s appetite for news literacy improvement? Well, respondents were asked:

  • How much they think their decision-making would be helped by training on how to use online resources to find trustworthy information
  • How much they think their decision-making would be helped by training to help them be more confident in using computers, smartphones and the internet
  • Whether they “occasionally” or “frequently” need help finding the information they need online.

Yes, I certainly am disappointed that so many Americans answered “not much.” But I’d argue that this handful of questions cannot reveal the complexity of Americans’ relationship with their information needs and challenges. Not only are the questions few in number, but they’re also pretty abstract – it’s possible they just didn’t mean much to the respondents, or really make them think about their particular information challenges.

It’s possible, too, that the word “training” is pretty off-putting. How much would the answer change if we asked whether they’d like “help”? Or “resources”? There are probably people who can’t commit to a formal training program, but would welcome the occasional assistance. Understanding that will help news literacy efforts to meet these people where they are.

Here are some more questions we could ask:

  • Do you always know whether the information you read is true?
  • How often would you say you confidently make that determination?
  • What information would help you to make that determination?
  • How much do you think you’d be helped by more information on news organizations’ practices? On what makes reporting good or bad? By easier access to databases of facts? By help evaluating whether a story is true?… and so on.

A news literacy research agenda

A research agenda for investigating these questions would probably combine quantitative and qualitative approaches. Focus groups could help us understand what questions resonate with people, and what assumptions we’ve made. Surveys and experiments will help to quantify the attitudes that people hold.

By noting the limitations of Pew’s question set, I mean no criticism of their methodology. The researchers were trying to understand the interaction of a number of psychological and sociological factors, without writing a survey so long that participants would tend to drop out.

What I am suggesting, though, is that this is just the beginning of what must be an intense investigation of how people view their own information literacy skills and gaps. In the past decade or so, we’ve started to build a picture of how people process misinformation and corrections. This work is vital, but if we are to craft solutions, we need to also understand people’s attitudes towards their own information literacy, or lack thereof.

That’s why, while I sometimes find myself catching my breath at the sheer scale of the problem before us, I also see a reason to roll up my sleeves and pour that concern into the serious work of research. There is no time to lose.

Cross-posted at Medium

Edited intro Sept. 15, to make clear that those quotes aren’t good summaries of the findings.


If I thought there was too much daily information for me to absorb and process before Nov. 8 – well hoo, boy.

I’m trying out some techniques to a) read a sliver of what’s important to me, b) keep track of the research and story ideas that reading generates and c) occasionally go back to my notes, re-read, synthesize and think about misinformation problems on a deeper level.

Technique 1 is to keep a daily research journal. I’ve made a few entries. It’s a start.

Technique 2 is to write VERY QUICK blog posts on what I’m pondering and/or hope to research, with links to the stories that sparked the ponders. That way I can hopefully connect with people interested in the same ideas.

OK, today’s VERY QUICK thoughts:

  1. What can we learn from Wikipedia? What exactly are the “rigorous logic and rules” cited here? Could fact-checkers apply these? Can we teach these in news literacy courses for kids and the general public?
  2. What kind of research do we have about what techniques work in teaching news literacy to adults? I don’t mean broad-brush ideas like “don’t lecture, don’t insult” but really nitty-gritty approaches. I get the feeling from my interactions on Facebook that a lot of people assume the really simple debunk checks are beyond them, when they’re really not. How do we break through that assumption?
  3. Do we have anything like a well-rounded understanding of how people in the U.S. acquire news about the world in today’s info environment (e.g. social media vs. mainstream news sources vs. Wikipedia vs. personal conversation vs….)? I’m guessing no, though I intend to read up. Where do communication scholars think the biggest gaps are? It seems like we are fighting so blindly, trying to combat misinformation and misperceptions when we don’t fully understand the components and interplay of people’s media diets.
Burzynski outside his SOAH hearing last November. (Credit: Tamar Wilner)

Two Texas judges released their findings Oct. 12 in the case of Stanislaw Burzynski, the controversial Houston cancer doctor who’s been accused of deceptive advertising, making medical errors, and treating patients with untested combinations of drugs that amounted to “experimentation.” On most of these charges, the judges failed to find enough evidence of wrongdoing to discipline Burzynski.

The State Office of Administrative Hearings did find, however, that Burzynski is subject to sanctions for:

  • Aiding and abetting the unlicensed practice of medicine by one practitioner;
  • Failing to properly supervise three unlicensed physicians who misrepresented themselves as doctors;
  • Inaccurately reporting the size of one patient’s tumor, causing a misclassification of the patient’s response to treatment;
  • Failing to gain properly informed consent from several patients;
  • Failing to disclose his ownership interest in the pharmacy that prescribed patients’ drugs;
  • Failing to maintain adequate records to support charges made to four patients.

The conclusions from the judges at Texas’ State Office of Administrative Hearings (SOAH) are not binding, but come in the form of recommendations to the Texas Medical Board. It’s the usual practice for the board to accept and implement SOAH recommendations.

A hot take

I honestly didn’t know what conclusion to expect in this case, despite the many months I spent researching it for my Newsweek article last February. Burzynski has faced sanctions many, many times before, and it’s never impeded his practice much. So I don’t find myself particularly shocked by this result.

From my quick skim of the findings, it seems that SOAH failed to find against Burzynski on some of the most important charges because of missteps by the Texas Medical Board (who acted as the prosecution).

The board’s expert witness on the standard of care allegations, such as the use of untested drug combinations, was Dr. Cynthia Wetmore of Children’s Healthcare of Atlanta. But the judges found her testimony “troubling” when she accused Burzynski of violating the standard of care for Patient D — a patient who never received treatment at the clinic. The judges also noted that Wetmore is a pediatric oncologist in an academic, rather than private practice, setting.

They therefore concluded, “Dr. Wetmore’s expertise about the treatment of adult cancer and the standard of care used by physicians in private practice is limited and will be given little weight.”

That must have been a harsh blow for the medical board.

Has Burzynski “saved lives”?

The other surprising finding was this: “Respondent’s [Burzynski’s] treatments have saved the lives of cancer patients, both adults and children, who were not expected to live.”

With the caveat, again, that I have only skimmed the decision, it’s hard to see what reliable scientific evidence the judges had for this claim. Certainly I’ve never come across any. Yes, there are dozens of patients who claim that Burzynski did exactly that — save their lives. But did the judges really base this conclusion on anecdote alone?

The judges further listed, among “mitigating factors” meant to inform eventual sanctions against Burzynski:

If Respondent is unable to continue practicing medicine, critically ill cancer patients being treated with ANP [antineoplastons, the drug that has been the focus of Burzynski’s research] under FDA-approved clinical trials or a special exception will no longer have access to this treatment.

Amazing that after decades of research, with not a single randomized controlled trial demonstrating the effectiveness of ANP, the judges still see the loss of such drugs as some kind of harm to public health.

Note: The decision can be viewed by visiting http://www.soah.texas.gov/, clicking “Electronic Case Files,” then “Search Public Case Files,” then searching under Docket Number 503–14–1342.

Cross-posted from Medium, with minor alterations.


Stanislaw Burzynski, the Houston doctor who has treated thousands of cancer patients with unproven medications, is set to take the stand to defend his medical license today.

As I wrote for Newsweek recently, Burzynski is a celebrated figure in the alternative medicine world, and credited by some with saving their lives. The Texas Medical Board, however, says Burzynski overbilled, misled patients and made numerous medical errors – and the board’s expert witness said Burzynski’s use of untested drug combinations amounted to “experimenting on humans.”

The hearings started last November, when the Texas State Office of Administrative Hearings (SOAH) heard the board’s case. Now Burzynski’s lawyers will bring their own witnesses, beginning with the doctor himself. A document filed with SOAH estimates that Burzynski’s testimony will take two days.

According to the document, Dr. Burzynski will testify about the standard of care at his clinic, the use of antineoplastons, compliance with the FDA and charges of false advertising, among other issues. He will reply to “allegations of non-therapeutic treatment,” “inadequate delegation,” and “the aiding and abetting of the unlicensed practice of medicine.”

A number of patients and Burzynski Clinic employees will testify on Dr. Burzynski’s behalf, as will several outside physicians who treated clinic patients, and Mark Levin, a board-certified oncologist.

The hearing resumes just weeks after the Burzynski Research Institute announced that it has started patient enrollment in an FDA-approved phase 2 study of antineoplaston use in diffuse intrinsic brainstem glioma, a cancer that mainly attacks children.

The hearing is scheduled to run until May 12.

Hear more about the Burzynski story in my interview with the Point of Inquiry podcast.

Too often, we talk about fact-checking as an elite activity. First, it was a job for a magazine’s in-house fact-checker (back when there were such things). Then it was a job for non-partisan fact-checking organizations or dedicated journalistic operations like FactCheck.org and PolitiFact.

As important as those organizations are, fact-checking is also something we all need to practice each day as news consumers. It’s really akin to media literacy: a critical reflex, backed up with a few modern tools, that we must bring to bear when we consume modern media in all its forms (but especially social media).

With that in mind, I put together this round-up of fact-checking tools that we all can use. It started as a talk for the Skeptic’s Toolbox, a workshop held two weeks ago in Eugene, Ore., and grew with feedback from participants. I hope you’ll find some useful suggestions in these lists. Please let me know anything you think I’ve missed!

Picture credit: NOAA.

Note: I have joined the “virtual class” component of Dan Kahan’s Science of Science Communication course at Yale University. As part of this I am endeavoring to write a response paper in reaction to each week’s set of readings. I will post these responses here on my blog – my paper for week 11 is below. Previous responses are here. I will also be participating in the discussion on Kahan’s own blog.

This week’s focus:

What is/should be the goal of climate science education at the high school or college level? Should it include “belief in” human caused climate change in addition to comprehension of the best available scientific evidence?

I started off thinking I had not changed my mind since writing my evolution education post two weeks ago. I planned to contend that, as with evolution, there is a reason that we are not satisfied for students to simply acquire knowledge about climate change. If they were to cogently describe what the theory of anthropogenic global warming (AGW) entails, but flat-out deny the truth of the theory, that would leave us unsatisfied – not just because global warming is a pressing issue which requires political will and thus voter backing to tackle (though that’s certainly true) but because we’d be left with the feeling that on some level the student still doesn’t “get it.”

Unpacking my argument from last week – which proposed that we should aim for students to believe the following…

(proposition p) Evolution, which says x, is the best supported scientific way of understanding the origins of various species, the way species adapt to their environment, etc etc.

… I can identify three reasons for this to be our aim:

  • First, because evolution *is* the best scientific explanation for these phenomena, and thus by knowing this, students know a true fact about the world;
  • Second, because armed with that knowledge, they are better equipped to apply the theory of evolution to scientific and other real-world problems; and
  • Third, (as I outlined in my comment to Cortlandt on the next post) because we wish students to understand the scientific justification for the theory of evolution, and if they understand that, then belief in proposition (p) necessarily follows. (It occurs to me now, however, that this is not the most terrific argument, because necessity does not flow in the other direction. Believing that p does not necessarily mean the student understands the scientific justification for evolutionary theory; he could take (p) on faith.)

The consensus problem

The climate equivalent of proposition (p) might be something like:

(q) The theory of anthropogenic climate change is the best scientific explanation we have for observed increases in the mean global temperature, and the theory predicts that if man continues to produce greenhouse gases at a similar rate, the temperature will continue to rise.

Proposition (p) could have included a stipulation about predictive power – indeed, to be a valid scientific theory, the theory of evolution must have predictive power. But while I didn’t think that needed to be spelled out for (p), I have done so for (q), because climate change is a subject whose vital importance – and whose controversy – truly rests on its predictions.

But there’s a problem here, and maybe a mismatch. In proposing that we aim for student belief in proposition (p), I figured we were disentangling identity from knowledge. Any student, taught well enough, could come to see that proposition (p) is true – and still choose not to believe in evolution, because their identity causes them to choose religious explanations over scientific ones.

For climate change, however, we may not get that far. There seems to be mixed evidence for the effectiveness of communicating scientific consensus on AGW.

As previously discussed, Lewandowsky et al found that subjects told about the 97 percent scientific consensus expressed a higher certainty that CO2 emissions cause climate change. Dan Kahan counters that this finding seems to bear little external validity, since these are not the results we’ve seen in the real world. From 2003 to 2013, the proportion of the US public who said human activities were the main cause of global warming declined from 61 to 57 percent.

In Cultural Cognition of Scientific Consensus, Kahan finds that ideology, i.e. “who people are,” drives perceptions of the climate change consensus. While 68% of egalitarian communitarians in the study said that most expert scientists agree that global warming is man-made, only 12% of hierarchical individualists said so.


From Kahan, Jenkins-Smith and Braman, Cultural Cognition of Scientific Consensus. Journal of Risk Research, Vol. 14, pp. 147-74, 2011.


On the other hand, as Kahan said in a lecture at the University of Colorado last week (which I live-streamed here – unfortunately I don’t think they’ve posted the recording), most people who dismiss AGW nonetheless recognize that there is a scientific consensus on the issue. At least on the surface this seems at odds with Kahan’s previous findings, so I’d like to look further into these results. (I think the difference may come down to what Kahan describes, in Climate-Science Communication and the Measurement Problem, as the difference between questions that genuinely ask about what people know and those that trigger people to answer in a way that aligns with their identity. Why one of Kahan’s consensus questions fell in the former camp and one in the latter, I do not yet know.)

How is it possible that someone can recognize the scientific consensus on AGW, but still dismiss the truth of AGW? The most natural answer is that such people can readily dismiss the scientific consensus, perhaps arguing that the scientists are biased and untrustworthy. This, by the way, strongly suggests that we should have always expected consensus messaging to fail!


So, if the aim is not consensus…?

Returning to education, I think this warning about consensus messaging points to the importance of creating a personal understanding of the science – i.e., exposing students to the reasoning and evidence behind climate change theory, and walking them through some of the discovery processes that scientists themselves have used. There may be serious limits to what this can achieve, because smart students may perceive that the arguments being used in the classroom have been developed by the scientists that they distrust. But undecided students may be persuaded by the fundamental soundness of the scientific arguments.

There is another danger: conservative students (especially the smart ones) may also reject the scientific arguments advanced in class because they will perceive that at a certain point they must take things on authority; that the processes involved are too complex and the amount of data too large for a non-specialist to come to a solid independent judgment on. Furthermore, students can entertain the idea that there is a viable alternative scientific theory, because there are many prominent voices that back up this view.


Back to evolution

Again looking back at last week, I realize now that the same problem exists for evolution. The genius of “intelligent design” and “creation science” is that they allow an exit from the scientific-religious conflict in what many of us would call the wrong direction. Students can use this “out” to accept the science they like, reject the science they don’t, and view it all as a “scientific theory.” Rather than accept (p) and then be forced to either choose religion over science, or somehow partition these parts of themselves (which Hermann, as well as Everhart, indicates is how many people cope), students may use religion *as* science and reject (p) altogether.

So now I’m beginning to doubt whether my aim in that essay really was achievable. It’s probably still a good idea to aim for beliefs of type (p), because this is a means of encouraging scientific literacy and nature of science understanding. But religious students with a good grasp of the nature of science will probably still find that “out” and will not agree with the evolution proposition. And other, less scientifically oriented students will simply say, “OK, this is the best science, but I trust religion over science.”

Note: I have joined the “virtual class” component of Dan Kahan’s Science of Science Communication course at Yale University. As part of this I am endeavoring to write a response paper in reaction to each week’s set of readings. I will post these responses here on my blog – my paper for week ten is below. Previous responses are here. I will also be participating in the discussion on Kahan’s own blog.

This week’s “questions to consider” (a reading list is here):

1. What is the relationship—empirically—between “accepting”/“believing in” evolution and “understanding”/“comprehending” it? Are they correlated with one another? Does the former have a causal impact on the latter? Vice versa?
2. What is the relationship—psychologically—between “accepting”/“believing in” evolution and “understanding”/“comprehending” it? Is it possible to “comprehend” or “know” without “believing in” evolution? Can someone “disbelieve” or “not accept” evolution and still use knowledge or comprehension of it to do something that knowledge of it requires? Are there things that people are enabled to do by “belief” or “disbelief in” evolution? If the answers to the last two questions are both “yes,” can one person use knowledge or comprehension to do some things and disbelief to do others? What does it mean to “believe in” or “disbelieve in” evolution? Is it correct to equate the mental operation or state of “believing” or “disbelieving” in evolution with the same mental state or operation that is involved in, say, “believing” or “disbelieving” that one is currently sitting in a chair?
3. What—normatively—is (should be) the aim of teaching evolution: “belief,” “knowledge,” or “both”?
4. If one treats attainment of “knowledge” or “comprehension” as the normative goal, how should science educators regard students’ “beliefs”?
5. If one treats attainment of “knowledge” or “comprehension” as the normative goal of science education, how should one regard political or cultural conflict over belief in evolution?

My response:

The empirical relationship between knowledge, understanding and belief

The evidence points strongly towards a distinction between knowledge and belief, for the simple reason that so many students have been able to demonstrate the former without the latter (or vice-versa):

  • In Hermann’s (2012) study, there was no statistical difference in understanding of evolution concepts between two extreme sub-groups of students, one believing in evolution and the big bang theory, and one not.
  • Sinatra 2003 (cited in Hermann) similarly suggests no relationship between belief and understanding of evolution.
  • Blackwell found what could be termed a disjoint between understanding/application and belief: although an overwhelming majority of students selected appropriate answers when asked to categorize examples of specific evolutionary processes, the percent considering evolution the primary basis for progression of life on earth was 34-35% (with slightly different percentages for the two classes surveyed). The percent considering evolution compatible with their belief system was 27-29%, but only 6-9% said they could never believe in evolution.
  • Other studies found that it is common for students to believe in evolution without understanding it (Lord and Marino 1993, Bishop and Anderson 1990, Demastes-Southerland et al 1995, Jakobi 2010, all cited in Hermann).

A section from Blackwell’s study, in which students had to categorize examples of various evolutionary processes.

On the other hand, according to Hermann, several studies (Lawson and Worsnop 1992, Sinclair and Pendarvis 1997, Rutledge and Mitchell 2002) found that adherence to a religious belief system influenced the extent to which evolution was understood.

Defining our terms: the psychological relationship between knowledge, understanding and belief

The vagueness of the terms “belief,” “understanding” and “knowledge” obviously should give us pause when we are trying to make sense of these empirical findings.

I think we should try to define the terms in the way that is most fruitful to the problem at hand (while also seeking as much as possible not to create conflict with existing common usage, and to allow the above empirical findings to be applied). That problem is often put thusly, “Many students understand evolution, or demonstrate knowledge of evolution, without believing in it. Does this matter, and if so, what should we do about it?”

With this in mind we can come up with some rough working definitions:

  • Knowledge: Retained and retrievable true facts about physical properties and processes.
  • Understanding: A deeper form of knowledge accomplished by forming multiple connections between true facts on the subject at hand (evolution) and between the subject and others. (Similar to Gauld 2001’s definition, cited by Smith 2004.) An example of one such connection may be between a scientific theory and its supporting evidence (as suggested by Shipman 2002, cited by Hermann).
  • Belief: A committed, often emotional attachment to a proposition, which itself may be true or untrue, falsifiable or not. I would argue that it is sensible to talk both of faith-based belief in religious precepts, and of belief in scientific theories, which can be driven by thorough understanding of their scientific basis, or by blind faith in scientists. (I subscribe to the summary by Gess-Newsome 1999 [cited by Smith], of knowledge as “evidential, dynamic, emotionally-neutral”, and belief as “both evidential and non-evidential, static, emotionally-bound.”)

These definitions mesh well, I believe, with most of the empirical findings we read about this week, including Hermann, Smith and Everhart (2013). Hermann, for example, builds on Cobern’s partitioning concept to conclude that religious students view science ideas as “facts,” categorized differently than beliefs, which have a stronger emotional attachment. This helps students compartmentalize because they have created an “emotional distance” between scientific conceptions and religious beliefs.

In creating these definitions I have had to dismiss definitions that I think are unhelpful for the problem at hand. For example, according to Hermann, Cobern (1996) stated that knowing “is the metaphysical process by which one accepts a comprehended concept as true or valid.” But this definition is actually much more like belief, as most of this week’s reading understands it.

I’ve also had to discard the philosophical convention that belief is a necessary condition of knowledge (Smith). When describing the way that people learn, and knowledge acquisition’s interaction with existing belief systems, this stipulation just doesn’t make sense (given the evidence we have of knowledge without belief). By casting off the impractical philosophical definition, I resolve a problem that Smith recognized – that if knowledge is dependent on belief, science education must foster belief.

There will always, I think, be messy edges and overlap between these realms. For example, it is hard to think of much useful knowledge that we can retain as utterly isolated “facts.” Facts that are part of a coherent schema are easier to retain or retrieve. We do, however, remember groups of facts that are connected in greater or lesser degree, both to each other and to other facts and schema in our brains. The difference between knowledge and understanding is thus one of degree.

Is lack of belief a problem? Or is it lack of understanding?

It should be noted that the issue with religious students’ non-belief in evolution is not merely one of semantics or a confusion of terms. The problem is that we are not satisfied with students merely believing evolution in the way that they believe in discredited Lamarckian or Ptolemaic ideas. We don’t want them simply to believe “that evolution says x”: that implies that evolution has no special empirical status and may as well be false, as those outdated scientific theories are. A student who can say only “that evolution says x” is merely parroting scientific language. She is in truth only a historian of science rather than truly a scientist herself – and I think that’s what so bothers us about the learning outcomes exhibited by students like Aidan and Krista, in Hermann’s study. We come away with the sense that their knowledge falls short of true scientific understanding.

I agree with Smith, however, that we should not go so far as to seek or require belief – or perhaps, I might say, “complete belief.” It is not and should not be the goal of a science class to completely overhaul students’ worldview, religion and all.

What we are seeking is for students to believe something like:

“Evolution, which says x, is the best supported scientific way of understanding the origins of various species, the way species adapt to their environment, etc etc.” (A conclusion similar to Smith 2004.)

And this requires an understanding of evolution, in the strong sense of understanding, which encompasses comprehension of justification. One may even argue that this type of belief follows necessarily from strong understanding: that is, if you understand the mechanism of, and scientific basis for, evolution, and the comparative paucity of scientific explanation for other theories of species’ origins, then you will necessarily believe that “Evolution, which says x, is the best supported… etc, etc.” This could be a neat logical maneuver to employ because it means that we can avoid talking about the need for students to “believe in” evolution – which carries a lot of nasty cultural baggage – and just talk about understanding instead.

While several empirical studies have demonstrated that students can easily demonstrate knowledge of evolution without belief in evolution, understanding is a much more slippery eel. As alluded to previously, understanding encompasses a wide spectrum, starting from a state barely stronger than plain knowledge. But I would argue that understanding evolution, in its strong form, encompasses an understanding of the scientific justification for the theory of evolution – and that necessitates an understanding of the nature of science (NOS) itself.

Nature of science: the path to strong understanding of evolution

The best tactic for accomplishing this right kind of evolution belief, or strong understanding – and happily, a key to solving much else that is wrong with science education today – is to place much more emphasis on the scientific method and the epistemology of science. This includes addressing what sorts of questions can be addressed by science, and what can’t; and also the skeptical, doubtful tension within science, in which things are rarely “proven” yet are for good reason “believed.” Crucially this involves helping students to understand the true meaning of “scientific theory,” whose misunderstanding often underpins further misconceptions about evolution’s truth status.

This effort also involves exploring the tension between self-discovery and reliance on authority – acknowledging that it is important for students to learn to operate and think like scientists, and that we want them, as much as possible, to acquire knowledge in this way; but that the world is far too complex for us all to gather our own data on everything. So students must learn how to judge the studies and reasoning of others, how to determine what evidence from others is applicable to what conclusions or situations, and how to judge who is a credible expert.

Misunderstandings of the nature of science (as well as certain broad scientific concepts) often lie at the heart of disbelief in evolution, as Hermann illustrates. In his qualitative study, both students showed a poor understanding of the methods and underlying philosophy of science, displaying a need for truth and proof – despite their good science knowledge performance.

Smith, rather inadvertently, gave another example of this problem. He cites a student who wrote to Good (2001):

I have to disagree with the answers I wrote on the exam. I do not believe that some millions of years ago, a bunch of stuff blew up and from all that disorder we got this beautiful and perfect system we know as our universe… To say that the universe “just happened” or “evolved” requires more faith than to believe that God is behind the complex organization of our solar system…”

Good uses this passage to justify making belief a goal of science education. Smith takes a contrary view, that “meaningful learning has indeed occurred when our four criteria of understanding outlined above have been achieved – even if belief does not follow” (emphasis in original). Instead I would argue that the student does not understand evolution in a meaningful way, having false impressions of underlying scientific and philosophical concepts such as entropy, order, and Occam’s razor.

Will nature-of-science education work with all students?

The research outlined above shows a mixed prognosis for our ability to overcome these issues and foster belief in the evolution proposition. Everhart’s work with Muslim doctors suggests that most participants held subtly different meanings of the theory of evolution, and could weigh evolution in relation to different contexts, such as religion and practical applications, with attitudes to evolution changing when the relative weights of these meanings were shifted. These meanings include a professional evaluation of the theory that could be held distinct from other evaluations. This suggests that participants may recognize the truth of evolution within a science epistemology framework – which should be sufficient for belief in our proposition – while not giving evolution the same status within other, more personal epistemologies.

But Hermann suggests that students ultimately fail in integrating science and religion, which creates a fear of losing religious faith, causing the student to cling to the religious view while further compartmentalizing science concepts. This drives at the hard, hard problem at hand: even with a perfect understanding both of evolution and of the nature of science, religious students are likely to run into areas of conflict that create psychological discomfort. This is because the epistemic boundaries of science and religion are neither hard nor perfect. Some of the areas that science claims as well within its remit to explain – such as the age of the earth – run into competing claims from religion.

One way out of this conundrum is for a student to redraw the boundaries – to say, OK, I accept the scientific method where it does not conflict with my faith; but on this matter I must reject it. Hermann’s subjects appear to have done this to a certain extent, but run up against limits. I would hypothesize that this line-drawing process itself leads to further discomfort, especially among students who are brighter and/or show greater understanding of the nature of science, because they would consciously or unconsciously recognize the arbitrary nature of line-drawing. And unfortunately, one good way to resolve that discomfort would then be to discredit the scientific method.