A recent study found that most of the 10 most popular ‘health’ articles of 2018 were inaccurate. Similar but less skewed results were seen for the top 100 articles. I dislike pointing fingers, and for these things fingers tend to point at journalists and social media. Instead, I’d like to highlight something less talked about when discussing inaccurate health and science news: could editors help?
Many newspapers position themselves as fighting fake news. That’s mostly for politics, but there’s a lot that news outlets might do for science and health news, too.
Ideally these inaccurate stories would never be published in the first place. Then we wouldn’t have to worry about what’s on social media! In reality there will always be some inaccurate stories on social media, but news outlets can play their part.
It’d be great to see more editors offer strong gatekeeping, and perhaps also encourage writers to do better. That’s not easy in these times of non-existent budgets, but perhaps it’s necessary? Science and health stories affect people too, after all.
The study that brought this to mind looks at health news. The brief for health overlaps with science, so I’m taking them together here; many of the issues from one spill over to the other.
The Health Feedback and Credibility Coalition study
The study chose the ‘most popular’ articles from news outlets, judged by the number of social media engagements (comments, likes, shares), then checked their accuracy. For the top 10 articles they invited experts to review them; for the top 100 they used Health Feedback’s science editors.
If you’re curious you can scan their list of all 100 articles. The most popular and not credible article is Federal Study Finds Marijuana 100X Less Toxic Than Alcohol, Safer Than Tobacco.
The results for the top ten most popular articles are skewed towards poor-quality articles. Having said that, with a sample size that small it’s hard to say much. It might be a useful reminder that just because something is popular doesn’t mean it’s sound.
For the top 100 articles, it’s split down the middle. Factually inaccurate stories spread roughly as readily as accurate material. (Bear in mind reviewing the top 100 is less in-depth than for the top 10.)
I’m not surprised to read that food and nutrition articles did especially badly: “only 5 out of 18 articles received a positive credibility rating.”
The strongest category was ‘disease and disease treatment’:
Although 4 of the 21 articles originated from websites of questionable credibility (these 4 accounted for all the negative ratings in this category), the rest of the articles came from established news organizations, such as Time, The Atlantic, CNN and ABC News. This indicates again that well-established news sources are more likely to report science accurately, likely as a result of higher journalistic standards and stronger oversight over quality of sources and content.
They don’t say this directly, but editorial standards may be a factor. It would be interesting to examine the role of editors in these stories.
A role for editors: better gate-keeping and ownership?
As newspapers increasingly step up to fake news, it’d be great to see the same for science and health news. You have to empathise with editors, though: they’ll be frantically marshalling everything as it is. Any solutions need to be practical.
A few loose thoughts:
- The survey notes the use of dubious websites. A simple check of the source(s) used may go a long way.
- If you can’t hire specialist editors, can you out-source checking of individual articles?
- Perhaps attending Science Media Centre training weekends for editors? (I have no idea as to what extent they offer these or not.)
- Editors might try to use specialist writers for specialist topics. (Related to this, a simple rule of thumb I use is ‘can I critique this?’ If I can’t, it’s either a topic I should avoid, or I’m looking at a serious amount of backgrounding and checking. Editors might ask the same of writers: are they able to critique the claims?)
- To what extent do editors ask journalists to present fact-checking material? Do they list the source backing each factual statement? Are those sources sound?
Whatever the ‘solution’ to poor quality material in health coverage, it’d be great to see editors step up to meet the challenge. We all (writers and editors) can do so much better.
Feel free to add your thoughts or suggestions in the comments.
The survey authors point at The Guardian for running opinion pieces, including book excerpts. Often it’s only clear these are opinion if the reader checks the blurb immediately after the article. It might help if that were flagged at the top!
They also briefly point at headlines (and by proxy ledes). This is something they have tackled previously, finding that clickbait headlines correlate with poor credibility. Headlines (and ledes) are often written by sub-editors, not the journalist. Ideally they’d reflect the article in a balanced way, but part of a headline’s job is to ‘trap’ readers skimming a page and start them reading the article. As a consequence, headlines easily veer towards clickbait.
1. I’m a bit leery of what may be a dodge. Many articles read as if quotes—from a person, website, scientific paper, etc.—are used to ‘get around’ not knowing the science. Potentially that leaves the journalism as uncritical transcription.
About the featured image
Title page of the Relation aller Fürnemmen und gedenckwürdigen Historien from 1609. The German-language ‘Relation’ had been published by Johann Carolus in Strassburg since at least 1605, and is recognised by the World Association of Newspapers as the world’s first newspaper.
Source: Wikipedia. Public domain.