In 2014 a study was published that challenged an oft-cited criticism that journalists are to blame for hyped-up health stories.
Sensational headlines, breathless reporting, caveats buried so far down the story that most readers never find them. We hear these complaints all the time about the media.
But this study, published in The BMJ, turned the claims on their head. Cardiff University researchers found that exaggerated claims in news stories were strongly linked with the same exaggerations in institutional press releases.
Particularly in the modern media landscape, with fewer journalists filing more stories in a race to keep up with the 24/7 news cycle and the insatiable appetites of online news sources, there’s a certain amount of good faith that a press release from a respected institute – say, a university – is robust.
Of course, it would be preferable if specialised science journalists had the time to delve into each study they reported on, reading the paper with a careful eye for exaggeration and consulting independent experts. But the reality of modern media is that a story is more likely to be covered by a general reporter with limited time or background to report on it thoroughly.
The BMJ study focused on three types of health-related claims: advice to readers to change behaviour, causal statements drawn from correlations, and human inferences from animal research.
Over a third of the press releases studied contained at least one of the three claims above. When this happened, the resulting news stories were more likely to contain exaggerated claims than the journal article itself: 56 times more likely in the case of inferring human relevance from animal studies.
“Although it is common to blame media outlets and their journalists for news perceived as exaggerated, sensationalised, or alarmist, our principal findings were that most of the inflation detected in our study did not occur de novo in the media but was already present in the text of the press releases produced by academics and their establishments.
“The blame—if it can be meaningfully apportioned—lies mainly with the increasing culture of university competition and self promotion, interacting with the increasing pressures on journalists to do more with less time.”
An interesting aspect of this 2014 study was that the researchers found no (statistically significant) link between the exaggeration in a press release and media uptake of the story, which might be assumed to be the driving force for hyping up a story. But of course, this was all retrospective and hard to say for sure, which led the researchers to a clear question: what happens if you control for other factors?
Press releases as research subjects
Which leads us to the follow-up research published this week in BMC Medicine. The same research team conducted a randomised trial, aiming to find out whether inserting caveats in press releases and moderating causal claims changed the resulting media coverage, either by improving the stories or diminishing the news value.
The team worked with nine UK press offices: biomedical and health-related press releases were sent to the researchers and randomly assigned to one, both or neither of two interventions. In the first, suggestions were made to bring the headline and the release’s claims in line with the type of evidence in the study: for instance, using words like ‘might’ and ‘may’ where data were correlational. The second intervention was to insert an explicit caveat about causality, e.g. ‘this was an observational study, which does not allow us to conclude that drinking wine caused the increased cancer risk’.
The press offices were free to accept or reject the changes, then issued their releases to the media as usual. Perhaps unsurprisingly, given the group’s previous findings, news headlines and stories were more likely to use appropriate language around causality when the press releases’ headlines and text did so too. When press releases contained caveats, about 20 per cent of the news stories followed suit, compared to almost none when the caveats were missing from the release – this point is worth highlighting:
“Explicit causality statements have almost never been seen in news previously and almost never occurred in our large sample unless the press release contained it. Most of these statements were caveats and were not within quotes, making it more remarkable that they carried through to news (it is likely that carry-through for quotes would be higher).”
In the prior study, the research team also searched for caveats or justifications; for instance, that the study couldn’t say for certain, or that even when other factors were controlled there was still a clear finding. But they were unable to draw many conclusions because such caveats were rare in the press releases studied.
To go from next to no caveats to caveats in around a fifth of news stories, just by using them in a press release, is a remarkable finding. To me it suggests that journalists aren’t unwilling to include these points, but that they take their lead from the institute’s press release.
Once again, the more cautious language didn’t appear to have an impact on news uptake, which indicates that it’s feasible and reasonable for press offices to include such caution in their work:
“Clinicians, scientists and press officers can take encouragement that deft caution and clear caveats are unlikely to harm news interest and can penetrate through to news and even to news headlines.”
It’s easy to blame the media for over-hyped headlines when it comes to science and health news, so these studies should give us cause to reassess those assumptions. It’s heartening to have evidence that shows how easy it is to improve media coverage, and for researchers and institutes to play a greater role in ensuring their research is promoted responsibly. Other initiatives, including the UK Science Media Centre’s press release labelling system, should be encouraged and adopted to continue the work of improving coverage of science and health in the media.