It’s time we had a national conversation about whether the way police dogs are currently used does more harm than good.
Earlier today, I published a feature on the use of police dogs in New Zealand, based on Tactical Options Reporting (TOR) data detailing police use of force which was released under the Official Information Act.
A shorter version has also been published on The Spinoff.
I hope you’ll take the time to read my full article on this. I think it’s a very important topic that isn’t talked about enough in New Zealand, and we should very seriously consider whether or not current practices are worth the harm they cause. But I’d also like to talk a little about the process of writing it.
Getting this data has been difficult. Each year, NZ Police publishes a summary of its use of force data for that year – its Tactical Options Research Reports – but they don’t contain everything. When the first set of raw data was released in March 2017, covering July–December 2016, I took it apart and discovered things I couldn’t have known from reading the reports alone.
Using that data, along with subsequent releases covering later periods, I made my first attempt at interactive data visualisation, focusing on how NZ Police’s application of force has a disproportionate impact on Māori.
That interactive ended up accompanying an NZ Herald article about how NZ Police use tactical options, including an incident during which an attack dog was set on an unarmed 12-year-old girl. This incident only came to light because NZ Police released the raw data.
Despite it being 2021 now, my recent feature relies on TOR data covering 2018. After releasing the third set of TOR data, NZ Police began to refuse my requests for more data unless I would pay hundreds of dollars first. It took several requests over two years, and a complaint to the Ombudsman that took 9 months to resolve, before NZ Police finally released TOR data covering 2018.
I think this is data NZ Police should release proactively each year, but I’ve yet to see how they will react when I inevitably request TOR data covering 2019.
When they released the 2018 data, NZ Police also included this request:
To ensure data accuracy and integrity, NZ Police recommends that members of the public access use of force data directly via the OIA process and/or through the NZ Police Annual Tactical Options Report series
Unfortunately, I haven’t found the OIA process to be very quick. Practically every OIA request I’ve sent to NZ Police has taken over a month to get a response, and often that delay has been unlawful.
In cases where you know exactly what questions you want to ask, going to their OIA team can be a good course of action. But if you have a good understanding of the data, having access to it yourself often lets you answer those questions in a fraction of the time, without taking up anyone else’s resources.
For example, in November 2019 when the Mental Health Foundation was speaking out about its opposition to Police’s “Armed Response Teams” trial, it cited figures from Police’s 2016 TOR report about the disproportionate use of tasers against mentally ill people:
(Despite my expectation at the time, I didn’t have the 2018 data released to me for over a year after this tweet.)
Though Police did report this statistic in their 2016 report, they chose not to include it in their 2017 report.
Though, when TASER was deployed, it was more likely to be discharged at subjects with perceived mental distress, irrespective of whether the subjects were armed or not, with a show to discharge ratio of 4:1 (compared to 6:1 for subjects with no perceived distress).
Because I had access to the TOR data covering 2017, I was able to quickly determine what the relevant statistic was for 2017:
In 2017, police discharged tasers in 14.4% of incidents where they used them against a person who did not have a mental illness. But they discharged tasers in 25.4% of cases where the person did have a mental illness.
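With the raw data to hand, a check like this reduces to a few lines of grouped arithmetic. Below is a minimal sketch in Python with pandas, using invented rows; the column names `taser_mode` and `perceived_mental_illness` are my assumptions for illustration, not the real TOR schema:

```python
import pandas as pd

# Toy stand-in for taser events in the TOR data. Both the column names
# and the values here are hypothetical; the real release is coded differently.
tor = pd.DataFrame({
    "taser_mode": ["Shown", "Discharged", "Shown", "Shown", "Shown", "Shown"],
    "perceived_mental_illness": [False, True, True, False, False, True],
})

# Discharge rate among taser events, split by whether the subject was
# perceived as mentally ill: flag discharges, then average within each group.
discharge_rate = (
    tor.assign(discharged=tor["taser_mode"] == "Discharged")
       .groupby("perceived_mental_illness")["discharged"]
       .mean()
)
print(discharge_rate)
```

The same pattern, run over the real data, is all it takes to reproduce a report statistic for a year the report doesn’t cover.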
But this is a comparatively minor positive from having access to the raw data. The real benefit comes from the ability to do exploratory analysis.
Simply put, you don’t know what you don’t know. The answer to one question can prompt five more, and only one of those answers might be of interest. Going through the OIA process for this, with delays of a month or more for each answer, would be a painfully slow process.
Having access to the full 2018 data, however, allowed me to do an exploratory analysis. I had intended for some time to write about the use of police dogs, ever since seeing the alarmingly high injury rate from police dogs compared with other tactical options. This injury rate has been consistently reported in Police’s summary reports, and some related statistics can be calculated from the figures presented in them. But more complex and specific figures are absent: the data set is simply too large to present in full in a summary report.
Even the cut-down data released to me under the OIA, after Police recoded and withheld many columns in consultation with the Ombudsman to sufficiently anonymise it, was still 4,324 rows by 422 columns.
Alongside my police dogs feature, I published a “codebook” containing all the code I used to process the data, and to produce each of the figures and charts used in the article. But I wasn’t able to also document the whole process of my exploratory analysis here, including all the questions and answers that didn’t make it into the final article. Perhaps the closest I came was in printing out the summary of each column to see what it contained.
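That first pass over the columns can be sketched in a few lines. This is a toy illustration only: the real release has 422 columns, and the column names below are invented, not the TOR schema:

```python
import pandas as pd

# Tiny stand-in for the 4,324-row, 422-column OIA release.
# Column names and values are hypothetical.
tor = pd.DataFrame({
    "tactic": ["Dog", "Taser", "Dog", "OC spray"],
    "injury": ["Moderate", "None", "Serious", "None"],
})

def summarise_columns(df: pd.DataFrame) -> dict:
    """Value counts per column: a crude first answer to
    'what does each column actually contain?'"""
    return {col: df[col].value_counts().to_dict() for col in df.columns}

summary = summarise_columns(tor)
for col, counts in summary.items():
    print(col, counts)
```

Skimming output like this across hundreds of columns is what surfaces the columns worth a closer look.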
Most of the interesting insights in this data come from interactions between columns. One of the two core figures in my article was one I hadn’t expected to find, and only discovered through exploratory analysis while looking at the “PCA” columns describing how police officers categorised subjects’ behaviour.
In 76% of incidents when police officers set attack dogs on people in 2018, those people were below the “assaultive” threshold. That is, they were not expressing any intent to cause harm, either verbally or through their body language or actions.
Almost every time that happened — 73% of all attack dog uses in 2018 — NZ Police recorded those people as being “active resistant”. They were pushing away, pulling away, or running away.
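For illustration, the shape of that cross-column check looks something like the following. The `pca` column name and the category labels are my assumptions based on the behaviour scale described above, not the actual TOR coding, and the rows are invented:

```python
import pandas as pd

# Hypothetical rows: one per dog deployment, with the officer's recorded
# perceived assessment of the subject's behaviour at the time.
dog_uses = pd.DataFrame({
    "pca": ["Active resistant", "Assaultive", "Active resistant",
            "Active resistant", "Passive resistant"],
})

# Categories below the "assaultive" threshold (assumed labels).
below_assaultive = ["Compliant", "Passive resistant", "Active resistant"]

# Share of deployments below the threshold, and share recorded
# specifically as "active resistant".
share_below = dog_uses["pca"].isin(below_assaultive).mean()
share_active = (dog_uses["pca"] == "Active resistant").mean()
print(f"{share_below:.0%} below assaultive; {share_active:.0%} active resistant")
```

The finding itself is just these two proportions computed over the real deployment rows; it only emerges once you cross the dog-use rows with the behaviour columns.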
When I had begun writing my article, I hadn’t yet realised that most of the moderate and serious injuries inflicted by police were caused by police dogs, but this is something I could have determined from the information in Police’s annual TOR reports.
Once I saw this, I expected it to be the central figure in my article. But the finding that, the vast majority of the time, the people police dogs were directed to bite were just trying to get away added very important context to their use, and to the harm they do. If I didn’t have access to the data, I doubt I’d ever have learned that this was the case.
Knowing the right question to ask is often the hardest part of investigating something like this. Without being able to do this sort of exploratory analysis, finding the right question can be a very difficult task.
I really hope NZ Police will continue to release their use of force data year on year. It’s too important a topic to go without independent scrutiny.