A troubling attitude to statistics

Last week, a government press release trumpeting the success of the “Troubled Families Programme” (TFP) claimed:

More than 105,000 troubled families turned around saving taxpayers an estimated £1.2 billion

Much of the press was too supine and lazy to bother to question this, so many stories simply copied the press release.  Sadly, that’s pretty typical, although the Mirror was a commendable exception.

But the headline is untrue. We have, as of now, absolutely no idea whether the TFP has saved taxpayers anything at all; and if it has, how much.  The £1.2 billion is pure, unadulterated fiction.

The reasons why we don’t know yet are pretty simple and entirely justifiable. TFP is a complex, multi-agency intervention with a fuzzily defined target group and very diffuse costs and benefits. Evaluating its impact is a tough job. That’s why CLG, commendably, commissioned an independent evaluation of its impact from a consortium led by Ecorys. [Full disclosure: NIESR is part of this consortium, although I personally am not involved.] The evaluation has not yet, again for very good reasons, produced any estimates of impact, even preliminary ones.

So where did CLG get the £1.2 billion number? The answer appears to be that they compared very rough, self-reported estimates from seven local authorities of the amount spent by some local services on those families supposedly “turned around” by the programme, before and after the intervention.

And why is that meaningless as an estimate of savings? Well, there are obvious problems with sample size, selection bias, and the fact that the local authorities get paid according to the number of families they have supposedly “turned around”. But even if we leave those issues to one side – even if we take a giant leap of faith and assume that the figures are somehow accurate – they tell us almost nothing about what would otherwise have been spent on those families, and hence about the savings. We do not know what would have happened without the programme. There is no counterfactual, and hence there can, by definition, be no estimate of “taxpayer savings”.

To see this, think about someone who gets flu and is running a temperature of 38.5°C. We give her drugs, and a week later she’s running a temperature of 37.5°C. Did the drugs help? Maybe. Or maybe she would have got better anyway. Maybe the drugs delayed her recovery. Who knows? We don’t know the counterfactual. And on the basis of the information we have so far, we can’t know it.

The only way we’ll know is by comparing her with someone else, also sick, and roughly similar, who didn’t get the drugs. That is the counterfactual. And that is what – using complex econometric methodologies and lots of data – the evaluation will try to do. But at the moment we have no idea what the answer will be.
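To make the logic concrete, here is a minimal sketch of the before-and-after fallacy, using a simple difference-in-differences calculation. All the numbers are invented for illustration; they are not TFP data, and the real evaluation will use far more sophisticated methods:

```python
# Illustrative figures only -- not actual TFP data.
# Annual cost of local services per family, in pounds.

treated_before = 10_000   # families in the programme, before intervention
treated_after = 7_000     # the same families, after intervention

control_before = 10_000   # similar families NOT in the programme, before
control_after = 8_500     # they improved somewhat without any programme

# The "gross" saving a naive before/after comparison implies:
gross_saving = treated_before - treated_after

# What the comparison group shows would have happened anyway:
would_have_happened_anyway = control_before - control_after

# The saving actually attributable to the programme:
net_saving = gross_saving - would_have_happened_anyway

print(f"Gross saving per family:      £{gross_saving:,}")
print(f"Would have happened anyway:   £{would_have_happened_anyway:,}")
print(f"Net (attributable) saving:    £{net_saving:,}")
```

In this made-up case the naive method claims £3,000 of savings per family, but half of that improvement happened in the comparison group too, so only £1,500 can be credited to the programme. Without a comparison group, the split between those two numbers is simply unknowable.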

So are CLG officials stupid or ignorant? Of course they’re not.  Do they not understand the basic principles of programme evaluation – that you just can’t use figures like this? Of course they do. Buried in a “methodology report” on the CLG website is the following statement – not even referenced, let alone reflected in the press release:

The Cost Savings Calculator (CSC) estimates the fiscal savings expected from the change that families experienced following support from the Troubled Families Programme. However, it is likely that some improvements in outcomes would have happened in the absence of any intervention. The improvement that would have happened anyway is commonly referred to as ‘deadweight’. As part of the independent national evaluation of the programme, data on a representative comparison group is expected to be used to estimate the outcomes that would have occurred without the programme. This will allow the evaluation of the additional impact of the programme at a national level, and produce estimates of ‘deadweight’ to be used in the CSC in the future. In the interim, the CSC uses the best available ‘deadweight’ estimates, based on whole population changes. However, in the report, all the figures provided are gross.

“All the figures provided are gross”: in other words, exactly as I explain above, entirely meaningless, at least as estimates of “savings to taxpayers”. Until the results of the evaluation are in, we know nothing about whether there are any savings, let alone what they are. In other words, civil servants knew the truth. And still they allowed the publication of a press release which, in a bold type headline, deliberately and successfully sought to mislead the press and the public.
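The methodology report’s own logic can be stated in one line: a net saving is the gross figure minus the deadweight, i.e. gross × (1 − deadweight rate). Since the deadweight rate is unknown until the evaluation reports, the £1.2 billion headline is consistent with almost any level of real savings. A hypothetical sketch (the deadweight rates below are invented, not CLG’s):

```python
# The £1.2 billion headline figure (gross, per the methodology report).
GROSS_CLAIM = 1.2e9

def net_saving(gross: float, deadweight_rate: float) -> float:
    """Saving attributable to the programme once improvements that
    'would have happened anyway' (deadweight) are netted off."""
    return gross * (1 - deadweight_rate)

# Illustrative deadweight rates -- the true rate is not yet known.
for rate in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"deadweight {rate:4.0%}: "
          f"net saving £{net_saving(GROSS_CLAIM, rate) / 1e9:.2f}bn")
```

The point is not which rate is right, but that the press release picked the top of the range (zero deadweight) and presented it as fact.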

There is lots more to criticise about the CLG publication. In particular, as Stephen Crossley points out, a close reading of the CLG report reveals that:

Manchester (for example) have identified, worked with and turned around a staggering 2385 ‘troubled families’. Not one has ‘slipped through the net’ or refused to engage with the programme. Leeds and Liverpool have a perfect success rate in each ‘turning around’ over 2000 ‘troubled families’. By my reckoning, over 50 other local authorities across the country have been similarly ‘perfect’ in their TF work. Not one single case amongst those 50 odd councils where more ‘troubled families’ were identified or where a ‘troubled family’ has failed to have been turned around.

In other words, CLG told Manchester that it had precisely 2,385 troubled families, and that it was expected to find them and “turn them around”; in return, it would be paid £4,000 per family for doing so. Amazingly, Manchester did precisely that. Ditto Leeds. And Liverpool. And so on.  And CLG is publishing these figures as fact.  I doubt the North Korean Statistical Office would have the cheek. For those interested in this topic, Stephen has much, much more on the inconsistencies in the CLG analysis here.

Frankly, this whole episode is disgraceful.  Of course, it reflects badly on Ministers – and not just Eric Pickles, but Danny Alexander, also quoted in the press release. They are looking for positive stories about a programme for which it is simply too early to give any sort of verdict. So they are making claims that are not true. That’s politics, although I don’t much like it and I don’t think we should stand for it. But it reflects far worse on the civil servants whose professional duty it was to stop them.  Deliberately misleading the public is not public service. 
