By Grant Jacobs 17/01/2019


A study just out shows that the strongest opponents of GM (genetic modification) think they know the subject well, but in fact know the least.

What does this mean for science communication, especially for contentious topics, and doubly so where deliberate misinformation is being offered? Similarly, are there lessons for politicians? In New Zealand, politicians seem too timid to try to correct the legislation and resolve the issue.

Invisible faces

The paper in question isn’t looking at the Dunning-Kruger effect but it helps to know what this is first.

You have to laugh at the example that prompted what has become known as the Dunning-Kruger effect, a

bank robber who was baffled to be caught after rubbing lemon juice into his face in the belief it would make him invisible to security cameras

At first this seems to just be an example of sheer stupidity, but there’s a subtle point too. (Less subtle is that the hapless robber apparently tested his invisibility by taking a Polaroid photograph of himself and failed to find himself in it…)

As Errol Morris wrote in The Opinionator blog,

As Dunning read through the article, a thought washed over him, an epiphany.  If Wheeler was too stupid to be a bank robber, perhaps he was also too stupid to know that he was too stupid to be a bank robber — that is, his stupidity protected him from an awareness of his own stupidity.

It might seem incredibly unkind, but you can measure this stuff. You can record how well people think they are doing, and compare it to their actual ability.

The Dunning-Kruger effect revisited

The Dunning-Kruger effect is a cognitive bias where people of low ability mistakenly think they have better ability than they do. Their lack of ability also means they’re not good at recognising their lack of ability.

Conversely, those with great knowledge tend to underrate their ability. One finding was that capable people assume others find the tasks easy too, and so they don’t rate themselves so highly. They don’t recognise how much better they really are.

I’d add from personal experience that people with deep knowledge tend to be critically aware of what they don’t know. Knowing what you don’t know is a tricky thing. You’ve got to be aware of what there is to know first.

An upshot can be paradoxical communication styles. The poorly informed person will present incorrect or misleading ‘advice’ with great gusto, where those who are very well informed are tentative and cautious.

Those of us who follow contentious topics see this all the time. In many ways, vaccine discussions are a clearer example of this than GM food: the benefits that vaccines bring are more obvious. Despite this, those strongly opposed to vaccines will ‘definitively’ say things that are flat-out wrong.

Strongest opponents to GM overrate their ability

Philip Fernbach and colleagues are looking at the psychology of extremism rather than the Dunning-Kruger effect. What does it take for people to hold extreme views about scientific topics?

In many ways, what they find is that to hold extreme opposition to a scientific topic, people have to have poor knowledge of the subject but be convinced they are knowledgeable. Good knowledge introduces nuances and complexities that preclude simplistic, extreme opposition.

They compared public surveys from the US, Germany and France, comparing people’s views, their self-assessed belief of how good their science understanding was, and their actual science understanding. The research also checks out a number of potential pitfalls, for example –

  • the order of the questions
  • confounding from considering things other than food safety more important
  • differences in education levels

Overall they find that,

The interaction is statistically significant, indicating that the relationship between objective knowledge and self-assessed knowledge differs by extremity of opposition

Basically, people’s tendency to be extremely opposed to GM food was intrinsically linked to their tendency to over-estimate their knowledge.
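To make the interaction concrete: the claim is that how well self-assessed knowledge tracks objective knowledge changes with extremity of opposition. Here is a minimal sketch of that idea in Python — not the authors’ actual analysis (they fit regression models with an interaction term), and using entirely made-up numbers (the quiz range, group sizes, and noise levels below are illustrative assumptions, not the study’s data):

```python
import random

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)

# Synthetic respondents: an objective quiz score (0-15) and a 1-7
# self-assessed knowledge rating. Illustrative numbers only.
moderates, extremes = [], []
for _ in range(500):
    obj = random.randint(0, 15)
    # Moderates: self-assessment roughly tracks objective knowledge.
    self_mod = max(1, min(7, round(obj / 15 * 6 + 1 + random.gauss(0, 1))))
    moderates.append((obj, self_mod))
    # Extreme opponents: self-assessment stays high regardless of,
    # or even despite, their objective score.
    self_ext = max(1, min(7, round(6 - obj / 15 + random.gauss(0, 1))))
    extremes.append((obj, self_ext))

r_mod = pearson(*zip(*moderates))
r_ext = pearson(*zip(*extremes))
print(f"moderates:         r = {r_mod:+.2f}")
print(f"extreme opponents: r = {r_ext:+.2f}")
```

Under these assumptions the correlation is strongly positive for moderates and flat-to-negative for extreme opponents; a real analysis would fit one regression over all respondents with an opposition × objective-knowledge interaction term rather than splitting into two groups.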

I’m interested in their focus on food safety as my (anecdotal) perception is that this is a dwindling concern. They find that,

Extreme opponents were actually more likely to cite food safety/health concerns than moderates and the main results replicate when we restrict analysis to the subset of participants citing food safety/health concerns.

It’s a minor result for their work, but an important one for dealing with the topic. Concerns about food safety are not uniformly spread, but are mostly in those with extreme opposition to GM food.

Also for gene therapy, too

They also looked at medical applications of genetic engineering, which people are more accepting of. It was possible that the disconnect between perceived and actual knowledge in those with extreme opposition was mostly a feature of more strongly held positions, like opposition to GM foods.

Their results show a lower overall level of opposition, but the same disconnect between perceived and actual knowledge in those with the most extreme opposition.

They also investigated views on climate change. There the effects are similar, but people split readily along political lines. Essentially, people fell in with their ‘in group’. (Climate change is a political issue in the USA in particular.)

Where to from here?

What can we learn from this for science communication?

Before considering how we might communicate with these people, or whether we even should, I think it helps to remember that for each person there will be other factors at play too.

  • Confirmation bias: rejecting what they don’t want to hear. People may want a point of view to be ‘right’ so much that they simply dismiss anything that doesn’t fit.
  • Digging those heels in. Having rejected others’ information, some people with tightly-held views then dig in.
  • Holding views as personal. I often consider whether someone has defined themselves in terms of the view they hold. People presenting information on the subject are then seen as personally attacking them, even when they are plainly not.
  • Active misinformation. Somewhere in all the mess is that a few political lobby groups actively spread misinformation. These groups often present themselves as holding high moral positions, or ‘bravely’ challenging ‘false facts’ – seemingly unaware of the deep irony that they are presenting falsehoods. The biggest player in this is probably Greenpeace.
  • Echo chambers. Unfortunately, one thing that strongly assists the spread of extreme views is social media echo chambers, where those with strong views can lock out anyone they don’t want their followers to hear. Local (NZ) groups opposed to GMOs act this way, too. (As a biologist I can’t help but see a parallel between groups like this and reservoirs of infectious disease.)

These and other things will complicate how each person reacts. Studies like Fernbach and colleagues’ work introduce broad principles, but individual people will be a mix of many things.

How to communicate to these people?

The authors suggest,

Our findings highlight a difficulty that is not generally appreciated. Those with the strongest anti-consensus views are the most in need of education, but also the least likely to be receptive to learning; overconfidence about one’s knowledge is associated with decreased openness to new information. This suggests that a prerequisite to changing people’s views through education may be getting them to first appreciate the gaps in their knowledge.

If you’re familiar with interacting with people with opposing views on GM foods, this won’t surprise you much. It strikes me as a conundrum, though. These people don’t want to appreciate the gaps in their knowledge, thank you very much. They insist they already know everything they need to know.

Some thoughts, then

My thoughts are not meant to be definitive, but openings for discussion. Readers are welcome to suggest their thoughts in the comments below.

Can you? Can you communicate with these people with extreme views at all?

Are there enough of them to matter? If they are a tiny minority, do they matter? What might matter more is to point out that they are a tiny minority. A minority at odds with everyone else for that matter. Strong opponents of vaccines might be an example.

Simply ignore them. Following the previous thought, ignoring them is an option. Write as if they’re not there. Of course, you won’t shift their views doing this; you’re not even engaging with them. A question might be how much influence they have on others. If they have influence, you might want to deal with their influence. (This isn’t the same thing as dealing with the people.)

Try pointing to who they learnt their views from. Communicate with them, but refer to the views they hold as views others have encouraged them to hold, and criticise those people. This tries to work around the way some people bind their views to who they are, personalising any discussion. Make it someone else. You’re trying to get them—and any readers—to question what others have told them, a little of what Fernbach and colleagues suggested.

Use them as examples for those slightly less opposed. A bit nasty in some ways, but you could hold them up as examples of “you really don’t want to go to this place”. Deconstruct what they say, but for the benefit of others. This and slightly gentler approaches can aim to address the ‘silent readers’—people who are less vocally involved, and perhaps more likely to be open to new information.

Lessons for politicians

I think there are always a few that you’ll never really get through to. Life is like that.

You just wish politicians wouldn’t be such pathetic sods about this. You can’t make a policy that genuinely works for every last person. There will always be a few who are just ‘in an awkward place’. It may not be politically correct or the grand ideal, but it is reality.

We do reasonably well with this for vaccines. Why not for GMOs?

For vaccines in New Zealand the emphasis has been strongly on education, accepting that a tiny minority will just persist with unorthodox views whatever you do.

Anecdotally there seems to be widespread recognition that GM food is safe, and that the opposition to GMOs is overplayed.* An Otago survey indicated most New Zealanders thought GM food safe.

The persistence of laws ‘countering’ something that has no sound scientific basis appears to be largely political laziness. The previous government took the easiest option, one the EPA advised would be inherently temporary.

They suggested option #4 was good and should be pursued at some point, but that the current problems needed immediate attention, which options #2 or #3 would provide. They noted that options #2 and #3 could only be temporary, as a longer-term, proper review is needed.

They suggested #2 would address the immediate concerns but was a bare-bones temporary ‘patch’: “is a bare minimum and is not considered a long term solution.” Cabinet elected to take this approach.

I’ll write about this some other time, but I feel a key is to recognise that the real objections to GMOs are not over science issues, but ‘values’ aspects that should be treated in the same way that secular governance deals with those with different religious views. People can choose to be ‘organic’, but they don’t have a right to inflict their wishes on others, or demand a monopoly.

Suggestions for my blog

If you have any thoughts, ideas or suggestions for my blog, you’re welcome to add them to my earlier post. (You can also send a message using the ‘Get in touch with the authors’ link in the top-right corner of each blog post.)

Other articles on Code for life

GMOs and the plants we eat: neither are “natural”

Regulating GMOs: time to move forward

Public opinion of gene editing and enhancement

Finding platypus venom

Footnotes

* I was struck by the (initial) comments to this recent opinion piece, for example. I’m also struck that I don’t recall the publisher (the NZ Herald) ever offering an equivalent opinion piece from someone advocating we move forward on GMOs. I’d happily do the honours (admittedly preferably paid).

About the featured image

Nicotiana benthamiana plant. Original from Wikimedia Commons, public domain. The plant shown is almost certainly not a GMO. I’m including it as it’s a nice example of the range of GM applications, one I might write more about another time. One problem in communicating about GMOs is the focus (some would say fixation) on herbicide tolerance. In practice there is a very wide range of applications of GM in plants. This relative of the tobacco plant can be used to grow antibodies for medical use, such as the ZMapp treatment for Ebola. It is not a food plant, and it is not gene therapy, but a GM plant is an important part of producing the drug.


9 Responses to “Strongest opponents of GM think they know best but actually know the least”

  • Steven Novella has a useful take on the research I mentioned in my previous comment:

    https://theness.com/neurologicablog/index.php/gm-foods-and-changing-minds/

    Thanks to Alison for pointing it out.

    Quite a few others sources have written on this topic. Here’s a take from ArsTechnica: https://arstechnica.com/science/2019/01/on-gmo-safety-the-fiercest-opponents-understand-the-least/

    Anecdotally (with the confirmation bias that can have) I like this observation –

    But it could also be that a desire to demonize GMOs led people to latch on to questionable claims about the underlying technology, leading to their misinformed state.

    My perception is that there is quite a bit of ‘demonising’ going on. To my impression it’s evidenced by the extent that those opposed point at things that aren’t in fact about GM plants.

    This also appeals, as it’s a point I’ve made previously and that I touched on briefly in my piece –

    Advocates of various approaches to improving the public’s understanding tend to present the issue as monolithic and, therefore, something that can be tackled by a single solution (generally the one they’re advocating for). But this study and the past research it cites highlight how we have an entire collection of public misunderstandings, each with distinct causes, dynamics, and potential solutions.

    I get a bit frustrated at what I see as a tendency for the academic science communication sector to try to find ‘one’ or ‘a main’ issue that might resolve things. It’s a long story, but my views have been informed in part by a crude, informal analysis I did a long time ago of about a half-dozen creationists’ accounts of how they came around to seeing creationism was nonsense. Among the common things they all pointed to was how it was a long process, with different types of “communications” playing a role in different stages of their journey. I recall that all said reading a textbook critically contributed, but at a later stage. That’s the deficit model! More on this one day, perhaps.

  • Grant: Thanks for the link to Steven Novella. The original idea/paper seems to come from Lyons, Hasell, Tallapragada & Jamieson, “Conversion messages and attitude change: Strong arguments, not costly signals” (DOI: 10.1177/0963662518821017), but it is behind a paywall so I will follow that up at work on Tuesday.

    Grant, you will be pleased to hear that you have broadened my view to accept the need for multiple approaches to science communication, including accepting the place for the “deficit model” (thanks). One of my more positively received recent Toastmasters talks was a history of the universe from the Big Bang to the present. Many club members had no idea of the beauty of this central science story.

    Actually, I think science communication is like sex; it works better if we remember we are having fun together. (And not taking it so seriously).

    Thanks again.

  • Thanks.

    FWIW I’d love a copy of the Lyons et al paper – paywalls are a nuisance to me. I should write to the authors.

    I should also write an article specifically about my wider view sometime… somehow… It’ll be somewhere in that long list of drafts I mentioned in my end-of-year post inviting suggestions for the blog.

    I agree that it helps to be approachable. There’s a catch with topics like vaccines, climate change, etc. that have serious ‘things’ happening with real impacts. For those topics I think you do have to be firm about the harm, and the stuff that is promoting the incorrect ideas. It makes it hard in that you’re trying to both be approachable, but also firm at the same time.

    (I have a bit of a quibble with Novella citing Sagan as an example of the deficit model, but he’s probably thinking of the documentaries and temporarily forgetting Sagan also spent quite a bit of effort trying to give people tools to identify nonsense themselves.)

  • I’ve just read on Twitter that apparently Fernbach et al are now tackling the distinction between ignorance and misinformation.

    (There’s a podcast Kevin Folta has made that says this apparently: https://twitter.com/mem_somerville/status/1086648555534602240 I have to admit I rarely if ever listen to podcasts & I’m unlikely to listen to this one either.)

    That could be interesting. On the face of it you’d think those with ‘extreme’ views tend to repeat misinformation uncritically.

    These things aren’t limited to the ‘uneducated’. For example, I recently saw a Professor ‘slamming’ how a BfR report was (paraphrasing) ‘plagiarising Monsanto’ (seemingly) based on the title & lede of a newspaper piece, with no effort to check what the story was first. Seems to me that you can’t be a ‘critic and conscience of society’ if you don’t do the ‘critic’ bit properly and fairly. That’s ‘misinformation’ rather than ‘ignorance’ and demonstrates a lack of willingness to make sure they have things right first.

    Anyway… it’ll be interesting if Fernbach et al also comment about the widely-used phrase ‘wilful ignorance’, which is usually taken to mean (deliberately) not investigating stuff that doesn’t fit the person’s bias. I suspect both that and touting misinformation happen at the same time for many on the more extreme end of these “debates”.

  • This disjunct between actual and perceived understanding is by no means a new, internet-era phenomenon:

    “If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it.

    If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.

    The origin of myths is explained in this way.”
    Bertrand Russell

  • @AndyW – for sure.

    I’m also pretty sure you can look much earlier in time for similar quotes too! I vaguely recall someone putting forward a Socrates quote as an example.

    Just for fun — IIRC the pamphleteers started before Russell’s time, too: I often like to think of them as the equivalent of today’s more “relaxed” bloggers. (I can’t think of the right word at the moment.)

    I think most people would say the internet has contributed, though, if nothing else through sheer global reach and speed.

  • Please fix these errors in this otherwise effective article:

    1. Dunning*-Kruger effect
    2. Nicotiana* benthamiana