As I start writing this, an hour after the embargo lifted on the news that a new iteration of Séralini's widely criticised paper is being published, the paper itself has somewhat belatedly become available.
In addition to republishing their earlier work, four of the eight authors (including Séralini) offer a companion piece, Conflicts of interests, confidentiality and censorship in health risk assessment: the example of an herbicide and a GMO.
Originally published in the scientific journal Food and Chemical Toxicology, Séralini and colleagues' study examined whether rats fed Roundup or 'Roundup-ready' GM maize showed markers of toxicity. The paper received wide criticism. After some time the journal withdrew the study, citing these concerns.
The republished paper may not raise many new criticisms: the criticisms its earlier iteration faced will still apply, and republishing it won't take these away.

Some passages of the companion piece, on the other hand, may draw fresh objections.
The new paper has been republished in Environmental Sciences Europe. As a measure of its size, prior to today the journal had published 10 papers this year, fewer than two papers a month.
The original paper was widely criticised, including in letters to the editor in response to the original publication. Rather than repeat that here, I would prefer to cite responses offered by scientists to Science Media Centres here in New Zealand and elsewhere. To keep this brief, I will choose just one. Alan McHughen offers this fairly blunt criticism:
“The number of rats used was too small to detect a meaningful difference in treatments. In this ‘new’ study, the number of rats remains the same, too small to yield meaningful results. To illustrate for those not familiar, it’s as if Seralini tossed a coin two times, and the coin came up ‘heads’ both times. With this result, Seralini is trying to convince us that he has a magic coin that only comes up ‘heads’.
“The strain of rats used (Sprague-Dawley) was inappropriate for this type of two-year long study, as these rats have a natural predisposition to form tumors, regardless of the treatment. Séralini has not and cannot justify this fatal error in experimental design.
“Séralini now asserts that he follows all European ethical guidelines for animal care. But he still shows rats with massive tumors, and the European ethical standards require rats be euthanized when tumors reach 4mm diameter. Clearly the rats in the photos have tumors larger than 4mm, about the size of a small pea.
“There’s no dose response. In toxicity or carcinogenicity studies, increasing the dose of an actual toxin or carcinogen leads to greater effect. But Séralini’s data do not show such dose effects, and Séralini still does not properly explain why.
“In short, the ‘new’ paper will have the same impact as the original, retracted paper, because the original data were useless, and there is no new data. The methodology was faulty then, and, as there is no new methodology, it remains faulty now.”
Disclosure statement for Alan McHughen: I am happy to advise that I am a public sector academic scientist serving the public interest, and as such, my research program is funded entirely from public sources; I do not accept private funds. As a result, I have no research connection to either Mr Séralini (or his coauthors), or CRIIGEN, or Monsanto.
Brief thoughts on the new publication
I haven’t read the new publication and don’t have a copy of the original to compare it with. Accounts to date indicate the new publication is substantially the same as the original, with the same data and results but a few passages revised, presumably for what the authors believe is better wording.
If true, previous criticism of the methodology would stand, as nothing has substantially changed in the work.
I am a little curious about how the review process accepted the paper. The journal offers reviewers’ guidelines, including:
2. Are the data sound and well controlled?
3. Is the interpretation (discussion and conclusion) well balanced and supported by the data?
8. Are there any ethical or competing interests issues you would like to raise?
The original publication brought strong comment about the lack of suitable controls, the statistics (small sample sizes), the ethics of continuing to use rats as their tumours grew and so on. Some of this criticism was offered formally as letters to the editor of the original publication. How were these issues resolved in reviewing this paper in a new publication, especially given these criticisms have been available for some time? (Update: I’ve written a follow-up with some early thoughts on the editor writing that there was no scientific peer review of the new manuscript.)
Some have suggested that a fresh start taking on board the criticism would have been better. Cami Ryan, for example, wrote –
“If Séralini’s goal here was the pursuit of good, quality science, he would have accepted the original retraction, paid mind to the broader criticisms that he received from subject-matter scientific experts and organizations and executed a new study (using an appropriate methodology) before attempting to publish again.”
A few thoughts on the retraction of the original publication
I’m not a fan of papers being removed from the literature.
There are valid reasons to formally label content as ‘retracted’: fraud, misrepresented data (which can be unintentional), or the like. While in the case of fraud a paper would be pulled by the editors regardless of objections by the author(s), in other cases the authors may request the retraction themselves, feeling that their paper does not accurately reflect the situation. Editors may invite such retractions.
While there were reasons to be skeptical of the original Séralini paper (and to remain skeptical, given little has altered in this respect), the original retraction was not for these reasons. With that in mind, it’s fair to argue the retraction may have been unwise or unhelpful.
Leaving aside what is (or is not) true for Séralini’s original paper, in place of pulling papers I would prefer that journals keep them, but clearly mark them as retracted – for example, watermarking each page with ‘RETRACTED’. If nothing else it would let interested researchers, and science historians, learn at a later date what it was that was retracted. There’s value in learning from things that went wrong, too.
Some early thoughts on the companion piece
I lack the time at this late hour to read this slowly. With that in mind I will offer a brief teaser to encourage readers to try their own hand and hope that I later can find time to make good sense of it. (I have briefly skimmed the entire article.)
Their bookended thesis, offered at both the start and end, that the retraction was questionable because of a lack of clear wrongdoing could fairly be raised (although one might ask why someone would want to raise it, especially given the work was being republished). The penultimate point about access to data is a valid concern, including, for example, data from unsuccessful human drug trials. The material in the middle, however, has me uncomfortably wondering how much is valid and how much might be overwrought ideas from sour authors.
The authors offer this as:
The present opinion is a summary of the debate resulting in this retraction, as it is a historic example of conflicts of interest in the scientific assessments of products commercialized worldwide.
They offer this reason for republishing their research paper:
Censorship of research into health risks undermines the value and the credibility of science; thus, we republish our paper.
Censorship? An editor feeling the pressure of widespread criticism I can understand. A claim of ‘censorship’, though, would want very sound evidence, I would have thought.
(Updated at 8am the following day to include one-paragraph synopsis of original paper; paragraph three, “Originally published in…” Further updated at 12 noon to add link to scientists’ commentary to Science Media Centres.)
I’d offer more but I’m due my beauty sleep. Besides, there’s football to be had in the wee hours…!
1. I can’t understand why journals cannot issue their papers at the same time as any embargoes on media publication about them lift. On the face of it there should be little excuse: it should be straightforward to configure their web services to release a paper at a set time in the future. Lest readers think I’m picking on a small journal, the very high-standing Proceedings of the National Academy of Sciences USA sometimes does the same.
2. If the Google publication times are to be trusted, at least one online source published four hours (or more) ahead of the embargo. The Examiner looks to have been a few minutes early, too. (The Examiner piece leans too far to one side and overlooks aspects that would offer a more balanced and accurate depiction of what took place.)
3. Few journals that publish infrequently are of a high standard; Retraction Watch suggests this journal is a more modest affair.
4. As another parallel, I never liked that the Protein Data Bank removed the manually modelled structures, even though they were later readily seen to be wrong. They were examples of sincere efforts to build a model manually and could potentially have served as useful negative controls for testing model-assessment methods, for example.
Some related articles on Code for life:
Gene editing and GMOs in NZ, part one (a take on gene editing for non-scientists)
Gene editing and GMOs in NZ, part two – is the law out of date? (the court ruling and the GMO aspects of NZ’s HSNO Act)
Gene editing and GMOs in NZ, part three (some additional thoughts and an introduction to a perspective article)