By Mark Hanna 31/10/2016


The New Zealand Herald and Jimbo’s have provided us with an idealised “bad science” case study.

[Screenshot: the New Zealand Herald article, “No bones about bones”]

Today, the Herald published an article about a “trial” conducted by pet food manufacturer Jimbo’s: No bones about bones

The trial was intended to evaluate how eating bones affects the dental health of dogs. Thankfully the article makes it pretty clear why Jimbo’s would be looking into this, although it reads more like a quote from a press release than the declaration of a conflict of interest that it really is:

Jimbo’s sells over 300 tonnes of bones per year which help thousands of cats and dogs keep healthier teeth.

This trial seems rather special in that it’s a rare composite of just about every aspect of poor methodology at once. I think it makes for an excellent “bad science” case study, and hopefully a useful resource for journalists who might find themselves in danger of reproducing the Herald’s results.

And it’s not just journalists who can benefit from understanding this. Being aware of the potential shortcomings of research can make everyone more savvy when it comes to parsing science news. None of this is particularly hard to understand at a high level.

Pared way down, designing a study is about two things:

  1. Finding a way to test a hypothesis by attempting to disprove it.
  2. Taking measures to account for as many sources of bias as possible.

Jimbo’s failed the first of those objectives spectacularly, but at least they were up front about it:

The Jimbo’s Dental Trial was carried out because we wanted to prove what we already knew – that a species-appropriate diet including a bone a day can improve or maintain dental health in our furry friends.

Jimbo’s Dental Trial – 2015

It’s roughly possible to pair up different aspects of good methodology with the source of bias they’re trying to account for. For example, having a large sample size is a way to diminish the effects of random variation within your sample population.
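
To make this concrete, here’s a quick simulation. It’s a minimal sketch in Python, and the effect size and noise figures are invented purely for illustration, but it shows how easily random variation swamps a real effect at small sample sizes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented numbers: suppose a dental health score truly improves by
# 0.5 points on average, with dog-to-dog variation of 1 point (sd).
true_effect = 0.5
noise_sd = 1.0

for n in (7, 30, 100):
    # Simulate 10,000 hypothetical trials, each measuring n dogs.
    samples = rng.normal(true_effect, noise_sd, size=(10_000, n))
    means = samples.mean(axis=1)
    # How often does a trial's average point in the wrong direction?
    wrong_sign = (means <= 0).mean()
    print(f"n={n:3d}: spread of trial averages sd={means.std():.2f}, "
          f"wrong sign {wrong_sign:.1%} of the time")
```

With only seven dogs, roughly one simulated trial in ten points the wrong way entirely; with a hundred, that essentially never happens.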

Here’s a list of the methodological problems with this Jimbo’s trial, and the corresponding sources of bias that they aren’t accounting for:

  • Source of bias: Publication bias, where positive results are more likely to be published than negative results.

    How you should account for it
    Register your trial ahead of time, and ensure it gets published in a peer-reviewed scientific journal.

    What Jimbo’s did in their trial
    As far as I can find, the trial wasn’t pre-registered. Instead of being published in a peer-reviewed scientific journal, it was published as a PDF on the Jimbo’s website.

  • Source of bias: Random variation within your sample population.

    How you should account for it
Have as large a sample size as possible. Larger sample sizes make research more expensive, of course, but if your sample is too small you won’t be able to reliably detect an effect.

    What Jimbo’s did in their trial
    The study used a sample of eight dogs. This was further reduced to seven after one dropped out for not following the diet.

  • Source of bias: Regression to the mean, changes unrelated to the experiment, the Hawthorne effect, etc.

    How you should account for it
    Have an appropriate control group, for example a group of dogs not on the special diet.

    What Jimbo’s did in their trial
    The study did not include a control group.

  • Source of bias: Bias, unconscious or otherwise, from researchers making measurements.

    How you should account for it
    Blind researchers making measurements so they don’t know whether the participant they’re evaluating was in the control group or the experimental group.

    What Jimbo’s did in their trial
    There was only an experimental group, so blinding was not possible.

  • Source of bias: Differences between the populations in the control and experimental groups.

    How you should account for it
    Randomise which group each study participant ends up in (see the sketch after this list).

    What Jimbo’s did in their trial
    There was only an experimental group, so randomisation was not possible.
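
For contrast, randomisation and blinding are cheap to set up. Here’s a minimal sketch of what they might look like, assuming eight hypothetical dogs and an allocation key held by someone other than the assessor (all names and details are invented):

```python
import random

# Hypothetical participants; in a real trial these would be recruited dogs.
dogs = ["Rex", "Bella", "Max", "Luna", "Charlie", "Daisy", "Toby", "Molly"]

random.seed(42)  # fixed seed only so this example is reproducible
random.shuffle(dogs)

# Randomisation: split the shuffled list into two equal groups.
half = len(dogs) // 2
treatment, control = dogs[:half], dogs[half:]

# Blinding: the assessor sees only coded IDs, never group membership.
codes = {dog: f"DOG-{i:03d}" for i, dog in enumerate(sorted(dogs))}
allocation = {codes[dog]: ("bone a day" if dog in treatment else "usual diet")
              for dog in dogs}

print("Assessor sees:", sorted(codes.values()))
print("Allocation key (kept sealed until all scores are in):", allocation)
```

The assessor records dental scores against the coded IDs alone, and the allocation key is only opened once every measurement has been made.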

The trial also lacked any sort of statistical analysis. Without a control group, there isn’t really a good way to do this, but it seems like Jimbo’s didn’t even try to figure out how likely it was that their result was a false positive.
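
Even with the single-group design Jimbo’s used, a basic before-and-after significance test is possible. Here’s a sketch using a Wilcoxon signed-rank test, a non-parametric paired test suited to very small samples; the scores below are invented for illustration, since the real data weren’t published in an analysable form:

```python
from scipy.stats import wilcoxon

# Invented before/after dental scores for seven dogs (higher = healthier).
# These are NOT Jimbo's data; they exist only to illustrate the calculation.
before = [3.1, 2.8, 3.5, 2.9, 3.2, 2.7, 3.0]
after = [3.4, 3.0, 3.6, 3.3, 3.1, 3.2, 3.3]

# Wilcoxon signed-rank test on the paired differences.
stat, p = wilcoxon(after, before)
print(f"statistic={stat}, p={p:.3f}")

# Caveat: even a small p-value here can't separate the diet's effect
# from regression to the mean or anything else that changed over the
# trial period, because there's no control group to compare against.
```

That caveat is the important part: a p-value from a single-group test only tells you the change probably wasn’t pure chance, not that bones caused it.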

I always find it amusing to see research that fails so spectacularly to be well designed, as this trial has, but there’s a downside as well. The trial was picked up completely uncritically by the New Zealand Herald. In fact, their story reads to me more like an advertisement or press release than the critical analysis I’d expect to see from a high quality media outlet.

In the end, though, the Herald did get one thing right: they provided a link to the original research so all of its readers could see for themselves how spectacularly bad it is.


4 Responses to “Bad Science Case Study: Dog Bones”

  • As I write, the number of things the Herald did right in publishing that ‘story’ stands at zero. Clicking on the link to the original research now resolves to this: “We are sorry that we are unable to display the page you requested” (although to be fair I suspect it’s because Jimbo’s has killed the page).
    Many thanks for your article, btw; an extremely useful reminder.

  • Hmmm, I’m not sure about:

    “Pared way down, designing a study is about two things: 1) Finding a way to test a hypothesis by attempting to disprove it”

    Not necessarily. It seems to me that many (most?) scientific studies are not simply attempts to refute a hypothesis! They are rather attempts to demonstrate a significant difference. So, in this case, they should have n randomly selected dogs in a treatment group and n randomly selected dogs in a control group (where n is as big as possible). It would be best to check that the average health of teeth is about the same in both groups. Then they should give the treatment group a “bones only” diet for a suitable length of time and see if the treatment group’s teeth are healthier than those of the control group at the end of the trial. If so, then we can conclude with high confidence that eating bones makes teeth healthier (or at least that a “bones only” diet does). I don’t see anything here about trying to refute any hypothesis. In fact, a bad batch of bones could lead to teeth problems which would falsely appear to refute the hypothesis!

    I suppose you could say that they were attempting to disprove the null hypothesis, but that is mere pedantry! The problem with what was written in the blog post is that it wrongly implies that because Jimbo’s wanted to show that bones do improve oral health in dogs, they weren’t trying to disprove a hypothesis and so weren’t doing the science right (i.e. “Jimbo’s failed the first of those objectives spectacularly, but at least they were up front about it”). Well, no, they didn’t fail it at all!
