The trouble with the Troubled Families Programme

NIESR has been in the news quite a bit in the last 24 hours following the publication of a set of reports on the Troubled Families Programme. The Programme was launched in 2012 with a budget of £448m to tackle around 120,000 problem families who were said by the Government to cost taxpayers £9bn a year.

Post date: 18 October 2016

NIESR was commissioned by the Department for Communities and Local Government to evaluate the effectiveness of the programme, as part of a consortium led by Ecorys.

The key finding in our report is that across a wide range of outcomes, covering the key objectives of the Troubled Families Programme – employment, benefit receipt, school attendance, safeguarding and child welfare – we were unable to find consistent evidence that the programme had any significant or systematic impact.

The vast majority of impact estimates were statistically insignificant, with only a very small number of significant results, whether positive or negative. These results are consistent with those found by the separate and independent impact analysis using survey data, also published today, which also found no significant or systematic impact on outcomes related to employment, job seeking, school attendance, or anti-social behaviour.

The purpose of the analysis was to estimate the impact of participation in the initial phase of the Troubled Families Programme on a range of outcomes encompassing benefit receipt, employment, educational participation, child welfare and offending. The data used were compiled from information provided by local authorities and from national administrative datasets covering tax and benefit receipt, offending, educational attainment, schooling and child social care.
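To make the shape of such an impact estimate concrete, here is a minimal, purely illustrative sketch in Python. It is not the evaluation's actual estimator: the numbers are invented, the comparison group is taken as given, and the test used is a simple two-proportion z-test on an outcome rate such as benefit receipt.

```python
# Illustrative sketch only: an impact estimate of the kind described above,
# computed as the difference in an outcome rate between programme families
# and a comparison group, with a two-proportion z-test. All figures are
# invented; the published evaluation's estimator and matching are more involved.

import math

def impact_estimate(treated_positive, treated_n, comparison_positive, comparison_n):
    """Return (impact, z, two-sided p-value) for a difference in outcome rates."""
    p_t = treated_positive / treated_n
    p_c = comparison_positive / comparison_n
    impact = p_t - p_c

    # Pooled standard error under the null hypothesis of no difference.
    p_pool = (treated_positive + comparison_positive) / (treated_n + comparison_n)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / treated_n + 1 / comparison_n))
    z = impact / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return impact, z, p_value

# Invented numbers: share of adults receiving an out-of-work benefit at 52 weeks.
impact, z, p = impact_estimate(treated_positive=4100, treated_n=10000,
                               comparison_positive=4050, comparison_n=10000)
print(f"Estimated impact: {impact:+.3f} (z = {z:.2f}, p = {p:.3f})")
# A small point estimate with a large p-value would be reported as
# statistically insignificant, as was the case for most outcomes.
```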

Fifty-six local authorities provided the data used in this study between October and November 2014. These data were then matched to national-level administrative datasets. Data were obtained on approximately 25 percent of the 120,000 families that participated in the programme, representing a large sample and enabling us to undertake detailed analysis. However, the data supplied were of variable quality. As a result, some important data were missing, and it was necessary to make certain assumptions in assigning individuals to treatment and control groups. In addition, a significant number of individuals were not matched to some of the administrative datasets, and this necessitated further assumptions – for example, we assumed that individuals not matched to employment records were not employed.
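The following is a minimal sketch of the kind of assumption just described: individuals from local authority returns are looked up in a national employment dataset, and anyone who cannot be matched is coded as not employed. The field names and matching key are hypothetical; the real data linkage used administrative identifiers and was considerably more involved.

```python
# Hypothetical illustration of coding employment status after record matching,
# applying the assumption that an unmatched individual is not employed.

from typing import Dict, List

def code_employment(individuals: List[dict],
                    employment_records: Dict[str, bool]) -> List[dict]:
    """Attach an employment flag, treating unmatched individuals as not employed."""
    coded = []
    for person in individuals:
        key = person["id"]                      # hypothetical matching key
        coded.append({
            **person,
            "matched": key in employment_records,
            # The assumption described above: no match => not employed.
            "employed": employment_records.get(key, False),
        })
    return coded

# Invented inputs.
la_return = [{"id": "A1", "group": "treatment"},
             {"id": "B2", "group": "control"}]
national_employment = {"A1": True}              # B2 has no employment record

for row in code_employment(la_return, national_employment):
    print(row)
```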

A number of factors appear to have contributed to this result. It’s possible, for instance, that outcomes were measured too soon in some areas to provide a full account of the Programme’s potential impact. But even in those cases we would still have expected to observe some positive trends in the impacts, and yet very few materialised. In particular, as we wrote in the report, “the lack of impact observed with respect to employability – where successful programmes generally have a short-term impact – does not suggest that longer-term impacts on employment outcomes are very likely. There was, in fact, no evidence of even a modest reduction in benefit receipt over 52 or 72 weeks, or any other progress towards work in the form of adults moving from JSA onto other types of out of work benefit.”

The real answer might lie in the targeting methods. The local share of the 120,000-family target for the programme was essentially a notional one, and it is unclear how far the families who went on to receive the intervention were representative of this target group, beyond the fact that they met the eligibility criteria at a given point in time.

It’s also important to understand why the Payment by Results (PbR) progress data showed an apparently different picture from that described by our evaluation. The PbR data showed that the overall target of 120,000 families was almost achieved, with most local authorities having identified and achieved outcomes-based payments for families in numbers that closely represented their local share of this target. By contrast, the impact findings, as set out above, did not provide any systematic evidence that the programme had any impact on any of the outcomes related to employment, benefit receipt, offending or child welfare.

The key point is that the PbR progress data counts the number of positive outcomes observed for families on the Programme. The impact estimates, in contrast, measure how many net positive outcomes there are over and above any positive outcomes that would have occurred in the absence of the programme. There is therefore no necessary contradiction between the PbR data suggesting that 120,000 families had been turned around and the findings of the evaluation that no significant or systematic impact could be attributed to the programme. It is simply that different things are being measured, and what we have uncovered is that there is no evidence of any systematic or significant impact of this policy on families that do need help.
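A toy example, with entirely invented numbers, shows why the two measures can diverge: the PbR count records every family that achieved a payment-triggering outcome, while the impact estimate nets off the outcomes that a comparable group of non-participating families achieved anyway.

```python
# Invented figures for illustration only.
programme_families = 1000
comparison_families = 1000

positive_outcomes_programme = 400     # counted by Payment by Results
positive_outcomes_comparison = 395    # what a comparable group achieved anyway

pbr_count = positive_outcomes_programme
net_impact = (positive_outcomes_programme / programme_families
              - positive_outcomes_comparison / comparison_families)

print(f"PbR would report {pbr_count} families with positive outcomes")
print(f"Estimated net impact: {net_impact:+.1%} of families")
# 400 "turned around" families is perfectly consistent with a net impact close
# to zero, because most of those outcomes would have occurred in any case.
```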