Closing the Credibility Gap

Why Act’s race-based welfare statistics are worthless

Listener: 7 February, 2004.

Keywords: Maori; Statistics;

Early in January the Act Party released a paper which calculated that the tax collected from Maori was $2.3 billion a year, while government spending on Maori was $7.3 billion a year. Whatever the factual situation (below I suggest that the figures are misleading), different political flavours will draw different conclusions.

Racists may surmise that the gap shows Maori are not pulling their weight. Demographers will say that the Maori population is younger than average and children don’t pay tax. Welfarists might say that Maori are poorer and the fiscal system is (mildly) redistributive. Act argues that the data proves the current policies are not working. That lacks logic, since the gap may have been larger a decade ago. In any case, the data certainly does not prove that Act’s alternative policies will work, any more than the Rogernomes’ drawing attention to poor economic performance proved that their policies would. (They made things worse.)

However, Act got the figures wrong. I do not receive the generous state subsidies for such investigations that the Act parliamentary research unit does, but even a couple of hours on a Sunday afternoon (I should’ve been at the beach) showed that their figures are riddled with problems, and probably overstate the gap. Fortunately, the paper that Act released is sufficiently explicit for the errors to be traced. To be fair to the unit, this is the work of amateurs, not crooks. Perhaps the major mistake is that they grossly underestimated the tax paid by Maori: they used 2000/01 March-year data as if it applied to the 2002/03 June year, they did not reconcile the database they used (self-reported incomes) with more precise estimates of recorded income, and they omitted over one-third of the tax base.

Suppose that we were to apply exactly the same method to non-Maori. On the Act method, the rest of the country pays about $23.4b in tax while benefiting from $34.5b of government spending, so non-Maori are also in a major deficit. Who, then, is covering the deficit (and contributing to the budget surplus)? The nonsense arises because the Act calculations ignore over $14b of taxation.
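A rough back-of-the-envelope check makes the point. The sketch below uses only the figures quoted above (the variable names are mine, and the exact totals in the Act paper may differ slightly); it simply adds up what the method implies for the country as a whole.

```python
# Minimal arithmetic check, using only the figures quoted in this column.
# All dollar amounts are in billions, on the Act paper's basis.
maori_tax, maori_spend = 2.3, 7.3       # Act's figures for Maori
other_tax, other_spend = 23.4, 34.5     # the same method applied to non-Maori

implied_tax = maori_tax + other_tax         # total tax the method "finds"
implied_spend = maori_spend + other_spend   # total spending it allocates

print(f"Implied total tax:      ${implied_tax:.1f}b")    # about $25.7b
print(f"Implied total spending: ${implied_spend:.1f}b")  # about $41.8b
print(f"Implied fiscal deficit: ${implied_spend - implied_tax:.1f}b")
# The government was in fact running a surplus, so the method must be
# mislaying well over $14b of tax revenue somewhere.
```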

I leave it to others to ask what credibility Act’s economic policy has when the party can mislay $14b of tax revenue. Instead, I want to use the episode to illustrate an amateurism endemic throughout the public debate.

The first rule in professional research is to know your literature. Early New Zealand exponents of fiscal incidence research included Cornelis Weststrate, Wolf Rosenberg and Les Castle. Suzanne Snively made enormous progress in the 1980s, leading to Statistics New Zealand’s The Fiscal Incidence on Income Distribution 1987/88, published in 1990. More recently, Ron Crawford updated that study for the Treasury. Not to know the literature is to repeat past mistakes. Even the Treasury, I am afraid, makes them. Professional researchers have been appalled by just how wasteful some of its work has been because it does not know the territory. (Not Crawford’s, though.)

The Act research unit extended the standard work to fiscal incidence by ethnicity, following some equally literature-blind work by the NZIER. Ethnic data requires great care, because definitions jump around from database to database. The Act report does not discuss this problem, so I don’t know if it is sensitive to it. A professional would not dare publish work on one ethnic group without checking the method worked on others.

There is no indication that the Act report went to independent referees. (The NZIER uses internal referees, and the Treasury often uses overseas ones who know little about the New Zealand context. Better than nothing, but neither strategy is ideal. Instead, hire independent competent New Zealand referees.) The professional has the advantage of experience. Is the result consistent with past research findings? If it is not, what assumptions generate the unusual outcome? (The smell test told me there was something wrong with the Act data. Although it is likely that there is a gap, the reported size suggested a miscalculation.)

None of this is rocket science. Yet far too much New Zealand economic (and other social science) research ignores the elementary. A Gresham’s law predicts the debasing of research quality under commercialisation. In the 1970s and early 1980s the social science professions struggled against it. But the Rogernomes, having little respect for intellectual quality inconsistent with their ideology, penalised those who identified the flaws in their approach; the struggle collapsed, and we have ended up with shallow, meaningless “research” such as that released by Act.