By Seamus Hogan 08/03/2015

The mantra that “the biggest sin a team batting first in an ODI can commit is to not bat out its overs” has long been a bugbear of mine. As Dan Liebke noted in a rant about net-run-rate the other day,

We’ve had Duckworth Lewis for decades now and, even if the mathematics of it is beyond most casual fans, the basic concept that wickets remaining are a resource that need to be considered along with overs remaining is pretty well established.

Yes, a team has two resources. If it is a sin not to use one of those two resources to the max, why is it not also a sin to bat out 50 overs leaving capable batsmen in the pavilion with their pads on? A batting team has to manage both declining resources with no certainty as to the effect that its actions will have on either the rate of scoring or the loss of wickets.
So I was very happy to see Chris Smith take on this mantra in his Declaration Game blog, and also to see him quote a former player, Geoff Lawson, who was prepared to take a contrarian view.

‘Why?’ asked Geoff Lawson, who went on to rationalise that if all the batting side attempted was to survive the 50 overs, they were very unlikely to set a winning total. ‘Wouldn’t it be better’, Lawson argued, ‘to hit out with the aim of setting a challenging target, accepting the risk that they could be bowled out, than to crawl to an unsatisfactory total?’

Lawson is right, although perhaps not for quite the reason he gives. In this quote, he seems to be suggesting that a team heading towards a very low score might as well start taking more risks to get to a competitive total. That intuition is a manifestation of a mathematical result known as Jensen’s inequality, which rewards risk-taking when optimising over a relationship that is not linear. In fact, though, the relationship between the total score and the probability of winning is pretty much linear over the range of totals that any particular ball can affect. That means a batting team should always ignore the current score, treat bygones as bygones, and base its level of aggression only on how many balls and wickets it has remaining.
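The point can be made concrete with a toy calculation. Everything here is hypothetical: the win-probability curves below are invented for illustration, not fitted to any data. When the curve is linear, two strategies with the same expected total give the same expected chance of winning, so the spread of outcomes (and hence the current score) is irrelevant; Jensen’s inequality only rewards the gamble when the curve is convex.

```python
# Toy illustration of Jensen's inequality in the run-chase context.
# A "safe" strategy yields 210 for certain; a "risky" strategy yields
# 170 or 250 with equal probability -- same mean total of 210.

def expected_win_prob(outcomes, win_prob):
    """Average win probability over equally likely final totals."""
    return sum(win_prob(total) for total in outcomes) / len(outcomes)

# Hypothetical *linear* map from total to win probability.
linear = lambda t: min(max((t - 100) / 200, 0.0), 1.0)

# With a linear curve the two strategies tie exactly: only the
# expected total matters, not the spread around it.
assert abs(expected_win_prob([210], linear)
           - expected_win_prob([170, 250], linear)) < 1e-9

# Hypothetical *convex* curve: suppose only totals of 230+ ever win.
convex = lambda t: 1.0 if t >= 230 else 0.0

print(expected_win_prob([210], convex))       # → 0.0
print(expected_win_prob([170, 250], convex))  # → 0.5
```

Under the convex curve the gamble is clearly right, which is the intuition behind Lawson’s advice; under the (approximately realistic) linear curve, it buys nothing, which is why the current score should not drive aggression.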

As it happens, we can quantify this decision reasonably precisely. The graph below gives a measure of what I like to call “deathness” for the first innings. The particular metric I use is the payoff to a risky single. Imagine that the batsmen have to choose between trying for a run or not. If they choose not to run, they will score 0 runs but not lose a wicket. If they try for the run, they might succeed, or the attempt might fail and one batsman will be run out. How high would the probability of being run out have to be before the attempt stopped being worth the risk? The graph shows that cross-over probability as a function of the number of overs bowled, for each possible number of wickets lost.

The higher the cross-over probability, the greater the risk that is worth taking, and so the greater the level of deathness (so called because the final overs of an innings, when batsmen start taking greater risks, are often termed “the death”). The actual numbers aren’t particularly interesting (most decisions on aggression are about striking the ball, not about whether to attempt a run), but the comparison across different lines in the graph is revealing. So, for example, the graph reveals that if a particular level of aggression is warranted after 40 overs when a team is 5 wickets down, then the same level can be justified at 23 overs if no wickets have been lost.