Joel’s long-term project has been to look at differences in outcomes across students and schools, using the administrative data held in the StatsNZ data lab to adjust for a rather broad assortment of things that students bring with them into the classroom.
Naive league tables will credit, or damn, schools for outcomes that are largely due to differences in the communities that those schools serve. Getting better measures on outcomes, adjusting for the differences across families that we can see in the data lab, helps.
Our measures don’t tell you what’s going on in any particular school, but they do let you know whether a school is doing about as well as expected in the current system given the kids it teaches, or whether it’s a place that the Education Review Office might want to go visit to see what’s going on. It could be that better-than-expected performance in one school has nothing to do with that school’s practices but instead has everything to do with an after-school tutoring club the parents set up – for example.
Earlier, Joel looked at differences across schools to show that most of the difference in public school performance, by decile, disappears when you account for the differences in the families those schools serve. Piles of low-decile schools showed up as top performers when you run the stats properly.
The broader project, which will take some time because lab work is onerous and Joel is the only one of us in the lab, will extend to a much broader set of outcomes going beyond NCEA. The Ministry of Education has around 3,000 staff; we have a bit over a dozen.
I’m keen to know how different schools vary in stuff like Not In Education, Employment or Training (NEET) status in years following high school completion; tertiary completion; salaries a few years after completing high school; crime rates; benefit uptake – there’s a lot that can be looked at. But it’ll take a while. We start with one set of data matches and build outward from there, adding things on as we go.
Anyway, Joel’s most recent project looks at whether there are differences in outcomes across state, state-integrated, and private schools. Not many kids go to private school in New Zealand, but private and integrated schools dominate the league tables for achieving University Entrance.
Because of the cost of private schools, they’re mostly going to be attended by kids from richer families. We’re still looking at outcomes observable in school data available in the data lab. School data includes every student’s performance on every NCEA standard they’ve sat, and whether they’ve achieved University Entrance. But it doesn’t include data on whether they took up options available in some private schools to attend International Baccalaureate classes instead, or to take the Cambridge exams instead of NCEA. So Joel looked at UE as the basis for comparison.
And remember that the broader project will eventually get to a lot more outcomes. Those take time, and we have one econometrician on the job.
Joel found that state-integrated schools outperformed state and private schools on University Entrance, adjusting for all the family background characteristics observable in the lab. You can’t adjust for everything in the lab, but the stuff you can adjust for, like parents’ education, will also be correlated with some of the things you can’t observe.
So, for example, the value parents place on education can matter a lot, but you can’t observe that in the data lab. If the parents who put the highest value on education both push their kids harder at home, helping them through, and are more likely to select out of public schools into an integrated or private school, then you could be unfairly crediting private schools for effects that really come from family background. But, at the same time, if, on average, those same parents also have high levels of education themselves, then controlling for parents’ own education will have mopped up some of the “parents value education” effect. It isn’t perfect, but so long as the unobservables correlate positively with the observables, you’ve handled some of that selection issue.
Or, at least, you’ve somewhat bounded it. Take a very different area: the persistent arguments about whether unobserved confounds drive the J-curve in alcohol and health. If adjusting for all of the observable health behaviours you can find doesn’t do much to reduce the J-curve, and those observable health behaviours are really likely to be correlated with unobservable health behaviours, then it isn’t plausible that unobserved confounds are driving the rest. Here, adjusting for the kitchen sink of family background reduced the coefficient on state-integrated schools but hardly got rid of it. You’d need the effects of the unobservables that aren’t already mopped up by the observables to be as big as the effects of the observables, and to have driven the selection of private over state schools, to knock out the effect of private schools – and you’d need huge effects of unobservables to take out the effect of state-integrated schools.
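The logic in the last couple of paragraphs is easy to see in a toy simulation. Everything below is made up purely for illustration – it isn’t Joel’s code or the IDI data, and all the variable names and effect sizes are invented – but it shows how controlling for an observable (parents’ education) that correlates with an unobservable (how much parents value education) shrinks a naive “school effect” that is really pure selection:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical data-generating process. Parents' education is observed;
# how much parents value education is not, but the two are correlated.
parent_educ = rng.normal(size=n)
value_educ = 0.7 * parent_educ + rng.normal(scale=0.7, size=n)

# Families that value education more are likelier to choose a private
# or integrated school.
private = (value_educ + rng.normal(size=n)) > 1.0

# In this toy world, the outcome (think: UE attainment) is driven entirely
# by family background; the school itself adds nothing.
ue = 1.0 * value_educ + 0.5 * parent_educ + rng.normal(size=n)

def ols_coefs(y, cols):
    """OLS via least squares; returns coefficients with intercept first."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Naive "league table" comparison: regress outcome on school type alone.
naive = ols_coefs(ue, [private.astype(float)])[1]

# Adjusted comparison: add the observable, parents' education.
adjusted = ols_coefs(ue, [private.astype(float), parent_educ])[1]

print(f"naive 'school effect':    {naive:.2f}")
print(f"adjusted 'school effect': {adjusted:.2f}")
# The adjusted coefficient shrinks because parents' education proxies for
# the unobserved "parents value education" driving both selection and
# outcomes -- but it doesn't vanish, since the proxy is imperfect.
```

The residual coefficient after adjustment is the bit the bounding argument is about: if the observables are strongly correlated with the unobservables and adjustment only moves the coefficient a little, it takes implausibly large unobserved effects to explain away what’s left.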
Anyway, here’s Auckland Uni Prof of Ed Peter O’Connor on it all:
However, University of Auckland Professor Peter O’Connor argues that the focus on UE results alone “reduces the complexity of learning to such a narrow construct that it becomes meaningless”.
“As a professor of education, I’d fail my master’s students on that. It’s a false science,” he said.
“There is nothing to suggest that going to a private school means you will be happier, lead a more purposeful life, contribute more to the world, have better relationships with your partner or your children. That in fact your life matters for having been lived.
“You might have better connections, even more money, but that isn’t much in the grand scheme of life.”
It’s kinda funny. We never said anything about happiness, leading a more purposeful life, or any of that. We were just looking at the average effects of integrated and private schools as compared to state schools on university entrance – a metric a lot of people still do care about, and the one that it’s possible to check in the lab. We will be broadening to more outcomes in future.
But I guess if you’re a student contemplating doing grad work in education that has any kind of econometrics in it, you probably shouldn’t pick O’Connor as supervisor. If he doesn’t like whatever numbers you get, he might fail you because he didn’t understand the study or the methods. Michael Johnston over at Vic’s education department would be a way better choice if you wanted to do it in an Ed department.
Or, perhaps even better, do it in economics.
Either way, you can even start with all the code Joel’s used to run the data matches – we’ve got it all up there, freely available for anyone else to build on. There are years’ worth of studies to be done, and we’ve only got Joel. It’s basically the best administrative dataset in the world for linking high school students’ grades, their family backgrounds, and their later life outcomes. Drop us a line if you’re considering picking up on any of this in your thesis work – we’re always happy to provide a bit of advice.