Friday, 10 December 2010

Response to Anthony Painter

I wanted to put this in the comments page here but it was too long.
In summary:
Essentially, for all your quibbles, you accept my statistics but take issue with the analysis. You agree that EMA positively impacts 5-10% of students and is wasted on the rest. I hope the NUS are reading this.

I have never claimed that EMA doesn’t benefit anyone. It would be odd indeed if a £550million programme didn’t have some benefits. My claim has always been that the benefits EMA brings are out of all proportion to its costs. My claim has always been that, based on my experience and all the data in these papers:
  • About 90% of EMA goes to kids who don’t need it and who would be at college anyway.
  • About 5% goes to kids who would drop out without it, but who don’t get any significant benefit from being at college – either dropping out later or not getting any qualifications.
  • The other 5% goes to kids who do benefit from it.

That’s the argument I have always made and I see nothing from any of your quibbles to change this.

Where we differ is that you think it’s a) impossible to improve that waste rate and b) that the programme is still worth it. I differ on both counts. The IFS – and indeed the CfBT, in their pro-EMA report – suggest a number of ways you could more tightly target the allowance, and the government's plan to give the money to schools will clearly eliminate a lot of waste. If it truly proved impossible to eliminate the waste, then I am afraid EMA would not be worth it.

OK, now the detail. Let’s take this bit by bit.

The difference between the area and individual impact of EMA.

The figures you mainly quote above are pilot area averages (though you do jump about a bit).

If by jumping about a bit you mean I have read the entire paper...

It is important to read pp 6-7 of the IFS report to understand why this is problematic. The key point is:

"We therefore suggest as a rule of thumb that, in order to obtain the effect of the EMA across those who received it, the estimates above be multiplied by a factor of 2½ (for outcomes at age 16) or 3 (for participation at age 17 and attainment outcomes)."

So the participation and attainment impacts are in the main much greater than you suggest.

It is important to read Appendix D of the report to understand that they’ve already grossed up the figures. The stats I take from the tables are ones that have already had this adjustment made to them: ‘The headline results (presented in Tables 1a and 1b) can therefore be ‘grossed up’ to give the effect of the EMA on those who received it by dividing the estimated impacts by the proportions above... doing so yields the notional figures that are presented in Table 1 alongside the actual estimated impacts.’ The 7.3 percentage point figure for participation I have been quoting all along is the one that has already been ‘grossed up’. Not only that, but when they broke the results down by age, sex, method, etc., I picked the largest figure – the one most damaging to my case. Also, having reconsidered this, I realise that in one of my previous posts I assumed that 7.3% figure wasn’t grossed up, and so I grossed it up again! So in this post, and in the letter to the NUS here, I am actually wrong – EMA has even LESS of an impact than I said! See here for details.
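Mechanically, the grossing-up the IFS describe is nothing more than division by the take-up proportion. A minimal sketch, using illustrative numbers rather than the IFS's own figures:

```python
# Grossing up an area-average impact to the notional impact on EMA
# recipients only, as Appendix D describes: divide the estimated
# area-level impact by the proportion of the cohort receiving EMA.
# The example numbers below are illustrative, not the IFS's own.

def gross_up(area_impact_pp, takeup_proportion):
    """Convert an area-average impact (in percentage points) into the
    notional impact on recipients only."""
    return area_impact_pp / takeup_proportion

# A 'multiply by 2.5' rule of thumb corresponds to a take-up
# proportion of 0.4: e.g. a 2.9pp area-level effect implies a
# ~7.25pp effect on recipients.
print(gross_up(2.9, 0.4))  # ~7.25
```

The point being that the 7.3 percentage point participation figure quoted above is already the output of this step, not the input.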

As for the impact on attainment, you are right that neither I nor the IFS gross up the impacts on attainment. This is for two reasons. Firstly, with attainment more than with participation, you want a picture of the impact the policy has had on the entire cohort, because the base points rate is also measured over the entire cohort. It’s practically impossible to pick a base which is the average of kids on EMA who would have been doing their courses without it: you don’t know which kids and which results belong to those who would have been there anyway, and which to those who were incentivised to stay on. As I pointed out in my original post, the IFS are aware of this problem with the base rates, and that's why I think they gross up participation but don't gross up attainment. I do think that the attainment results are probably the most unreliable part of this survey – there is this problem with the base rate, and the statistical significances aren't very good either. When you factor in the problem of real-world significance I talked about last time, I would not be inclined to draw anything meaningful from them.

Secondly, if you try to gross up the attainment stats, you inevitably have to pick stats which are self-selecting. Let me explain: you can obviously only calculate attainment by looking at 18/19 year olds. But that means you exclude all the 16 year olds who drop out before taking any qualifications. The only way you can measure the attainment of EMA recipients is to exclude all the EMA recipients who didn’t attain anything – which is quite a lot, and so will distort your figures. 25% of 17 year olds getting EMA don’t progress to getting it at 18 – that’s very significant.

Also, so paltry are the attainment rates that you actually could gross them up by a factor of three and they’d still be educationally insignificant – still well within the margin of error of the exam boards.

2. The results of the EMA on black and Asian students (that you applaud and accept to be significant)
I didn’t go as far as applauding them – they’re actually extremely paltry, just slightly less paltry than the ones for white kids. I said they were the only ones that were educationally significant – they are, just.

are not just an anomaly- they indicate the real impact of the policy. These communities are more likely to be recipients of EMA.
Ah, classic sleight of hand. Yes, a greater percentage of the ethnic minority community are likely to be recipients of EMA than the percentage of the white community who are recipients; but most EMA recipients are still white, because most of the UK is white! Ethnic minorities are still a small minority of overall recipients (about 11% based on the figures in this survey); they’re just a slightly bigger minority of EMA recipients than they are of the entire population. So again, most of EMA is wasted. Also, I would suggest that the fact that we can so clearly tell it’s ethnic minorities who benefit from it makes a targeted scheme like the one outlined below easier to administer – you’d target the money at areas with large ethnic minority populations, and/or make principals aware of this – although this raises the problem of ‘reverse racism’. Do you really think it’s insignificant that EMA has no impact on the attainment of the largest ethnic group?
It should be emphasised however that even here the impacts are underestimates- you have to be enrolled and participating in order to receive EMA (as well as meeting the income criteria.)
I’ve dealt with this above. And it should also be emphasised that the statistical significance of a lot of these results isn’t very high either – some are only significant at the 10% level.

3. You talk about A Levels but largely neglect level 3 qualifications where the impact is greater. There is a reason for this. Colleges use EMA as a 'nudge'- receipt is dependent on attendance and punctuality. Without EMA, there's no nudge and grades decline. Vocational level 3 qualifications are entry level in a whole range of careers....
If you’d read the post properly, you’d see that I deal with this issue. The IFS themselves are extremely unclear with the terms. The bit you’ve obviously missed is this: there seems to be quite a bit of uncertainty about the points tariff the researchers talk about. At one point they refer to it as the 'Key Stage Five tariff'. It's a common misconception even amongst educators, but there is actually no such thing as Key Stage Five – at least not in the official sense that there is with Key Stages Three and Four. Then, at other times, whilst appearing to refer to the same thing, they talk about the 'A-level tariff'. Of course, there is such a thing as A-levels, but I don’t think there is such a thing as an exclusive 'A-level tariff'. I have assumed what they are talking about here is the UCAS tariff, especially as they use the phrases ‘Key Stage Five tariff’ and ‘A-level tariff’ interchangeably. The UCAS tariff includes a whole range of different qualifications, academic and vocational. So when I have been referring here to A-levels, don't assume that I am excluding all the kids doing vocational courses. The tariff includes them, and believe me, an increase of 2.5 points is just as unimpressive for BTec Nationals as it is for A-levels – arguably even more unimpressive.
An improvement of 2.5 tariff points in a BTec national diploma is even less educationally significant than the same across three A-levels.

4. Your deadweight argument fails as you do not subject alternative schemes to the same level of scrutiny. The coalition's scheme has a fundamental weakness. You can't identify those who would drop out without EMA.
Given that it seems to impact ethnic minorities the most, then you very clearly can – although I accept that this brings up the issue of race discrimination. I’d be interested to see what you think about this. Furthermore, the IFS themselves saw their survey not just as an overall impact evaluation, but as ‘being able to provide more breakdowns by subgroups which will shed light on which groups the EMA is having a relatively large and a relatively small impact upon.’ There’s a mine of information in there about the types of kids EMA benefits, not just according to race but according to gender, previous performance and wealth. So to say ‘you can’t identify those who would drop out without EMA’ also suggests you stopped reading the IFS report at page 7. Even the CfBT, who wrote a report defending EMA, acknowledge that you could probably cut the £10 and £20 bands. But you won’t even concede that!
This creates a moral hazard. Students have an incentive to threaten to drop out or to threaten to not enrol in order to force the institution's hands. How do you judge the genuine case? So the bureaucracy argument is neither here nor there- a large number of people who are hit by EMA will inevitably be missed by a discretionary scheme. The choice is between a £500 million policy which meets its objectives or a £50 million scheme that doesn't with the knock-on social, economic and fiscal impacts of swelling the realms of the NEETs.
I think this is the area where I disagree with you most fundamentally, and not for reasons of statistics. Perhaps some kids will drop out even though they could stay on, just in an attempt to ‘bluff’ the institution. That much is apparent from the comments on the EMA website and this lovely chap here – but I would suggest any kid going through life with that attitude is going to need a lot more than EMA to be successful. We need to remind kids that six years ago EMA didn't exist, and poor kids still went to college then. In the end I think most kids will get this – when you talk to most of mine, they do realise this and admit that, as much as they like it, they would stay on without EMA. Plus, I have greater faith in principals and schools to know who really needs the money, as opposed to a more distant means test which we all know was open to tons of abuses. Kids are actually far less likely to game the system when they see the people responsible for giving them the money every day. Plus, the current £500 million scheme does not meet its objectives – one of its objectives was to reduce the NEET rate, which it hasn’t done. The IFS in another paper note that much of what increase there was in the staying-on rate came from kids who would otherwise have been in jobs, not kids who would otherwise have been NEETs. That’s why bringing it in didn’t have much of an impact on NEETs, and getting rid of it won’t either.

So for all the statistical dancing around here: your numbers are flawed and misinterpret the evidence;
Not at all. Unlike you, I read the whole report and got to Appendix D, where they explain they’ve already grossed up the figures.
you fail to appreciate the nudge aspect of the policy and how this is most particularly useful in a College environment; and fails to properly scrutinise the alternative scenario next to the default. And if it's abolished 10,000s more students will be condemned to failure.
Again, this is another attitude I have a problem with. Students will not be ‘condemned to failure’ by this. That suggests students are passive victims with no agency of their own. Do you really want to reduce poor kids to this level? Ed Miliband himself has said one of the problems with markets and the state is that they both tend to view people not as individuals but as statistics. I think that is exactly what you are doing here.

Also, you basically seem to accept my point that there is a lot of waste. Your argument is that there is absolutely no way we could cut that waste. I find that very strange. I can’t think of any area of life where we couldn’t improve a 90% waste rate. What would you say if we equipped every new classroom with interactive whiteboards that only worked 10% of the time? What would you say if we employed teachers who only worked for 10% of the school day? What if we bought tables and chairs that could only be used for 10% of the school day? However, even if we accept your point that there is no way we could target the scheme better, you seem to think it is fine that we give £450 million to kids who don’t need it as long as £50 million reaches the kids who do. If that were the case – and it isn’t – I am afraid I would have to abolish the scheme entirely. There are so very, very, very many more ways you could spend £500 million in education and get a better return on investment. Even in an era without tight budget constraints, I think the waste of £450 million would be unjustifiable. In an era with them, it’s ludicrous.

Oh, and finally...the numbers of those affected are far greater than you argue. You state that the policy helps ’12,000-18,000’ kids. This is completely wrong. In 2009-10 there were 643,000 recipients. According to your statistic of 83% deadweight loss, that means 109,310 will be positively impacted. Using the survey figure of 10% retention, it’s 64,300. This is a two-year cohort so very quickly over say a ten year period you are looking at 300,000 or so to 550,000 impacted. In anybody’s world that’s significant.

Thanks to you kindly pointing out the error in the double grossing-up done by CfBT, the stat is actually 95% deadweight (see here). That also chimes with the RCU survey. So that’s two surveys, including the most rigorous, which suggest 5-6%, and a couple – with less rigorous methodology – which suggest 10-15%. Let’s split the difference at 8%. So that’s 51,440 potential drop-outs. Bear in mind, as I have pointed out, that not all of them would have gone on to complete their courses or get any qualifications anyway – that is, far from preventing drop-outs, EMA probably just delayed some and may even have caused others. We can probably say on the evidence that about a quarter of those 51k would have dropped out anyway, so now we are down to 38k affected. And as much as you may have doubts about the £50m fund, you have to concede that it will make some difference, don’t you? It will probably have some deadweight too, but it’ll be miles less than the EMA – in fact, were I a head teacher, I would feel pretty confident about getting a 10% deadweight rate. So we can assume £45m will be spread across the most needy 38k students – about £1,185 a year, much the same as the current maximum EMA of c. £1,170. Easy. Of course, you can never guarantee that absolutely no individual will suffer as a result of something, but according to all the evidence, we can feel pretty certain it won’t cause significant problems.
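For anyone who wants to check it, the arithmetic above can be laid out step by step (Python used purely as a calculator; every input figure is one already quoted in the post):

```python
# Laying out the back-of-envelope arithmetic from the paragraph above.
recipients = 643_000                  # EMA recipients, 2009-10
genuine_rate = 0.08                   # 'split the difference' between the surveys
potential_dropouts = round(recipients * genuine_rate)
print(potential_dropouts)             # 51440 potential drop-outs

# About a quarter of those would have dropped out anyway
affected = round(potential_dropouts * 0.75)
print(affected)                       # 38580 - the '38k' in the text

# Spread an assumed £45m discretionary fund across roughly 38,000 students
per_head = 45_000_000 / 38_000
print(per_head)                       # ~1184.2 - roughly the £1,185 a year
                                      # quoted above, about the current
                                      # maximum EMA of c. £1,170
```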

I also think it's very interesting that, based on a clearly unfinished reading of the IFS report, you saw fit to take me to task over what was a fairly complex statistical issue – one that I was right about anyway, and that, had I been wrong, would have made a fairly small difference. I notice you claim on Twitter that this makes my post 'deeply flawed'. Yet you don't feel inclined to comment on my post about the NUS's misrepresentation of EMA, which is based on a misunderstanding of percentages and percentage points – fairly elementary, I would like to think. This error means they inflate the drop-out rate from 17% (which, as I now show here, was already incorrect) to 65%.

If you thought my alleged error was deeply flawed, what would you call that?  Very deeply flawed? Deeply deeply deeply flawed? As, unlike the NUS, you clearly are numerate, I'm interested to know.
Even if you cannot agree that spending £500 million to get a £50 million impact is terrible value for money, can you at least agree that supporters of EMA should base their campaigns on the facts? Will you be joining me in asking the NUS to correct these errors?


  1. I'm going to post the response in parts:

    PART 1

    Thank you for the response. And you are right to say that my major issue is with your analysis rather than your statistics. The problem I have with your statistical ‘dancing around’ is that in previous posts you have flipped around between pilot area averages and EMA recipient averages without making it clear what you were quoting and why. That weakens your case considerably.

    The problem I think you are encountering is that you are trying to do two things at the same time and, as a consequence, you end up achieving neither. The first thing you are trying to do is administer a statistics lesson (to the NUS presumably.) But your posts have demonstrated how easy it is – through ill-discipline – to give a distorted view. Your clear statistical capability is masked by unclear presentation, which devalues that element of your case. (And thanks for the patronising aside on Appendix D! I did, as it happens, read Appendix D, though Table 1 is pretty self-explanatory anyway and the accompanying text that I quoted makes it equally clear. It should be noted that Table 1 (as opposed to 1a and 1b) deals with participation only and not attainment.)

    In particular, you criticise the NUS (rightly in fairness) on their confusion of percentages v percentage points. Yet you flip between the two over the posts here without making it clear what you are doing and why. In your case, it’s not confusion; it’s lack of clarity.

    Final point on the analysis: the point you make about the impact of EMA on different ethnic groups is, well, ill-considered. A greater proportion of certain groups receive it, and so by looking at these sub-groups we can get a better sense of its real impact (though there may be issues specific to certain groups, it is highly likely their socio-economic status is the major influence on likelihood to participate in education.) So yes, a majority of people who receive EMA are white British, but a majority (probably) of, say, black African British receive it. My point was quite simply that if you want to understand the real impact of the policy, you’d be better to look at the impacts on the latter than the former group. So it’s nothing to do with racial targeting etc. Quite why you went down that route is difficult to fathom.

  2. Part 2:

    Anyway, the major issue with the analysis is not actually to do with statistics – though it should be emphasised that the issues presented by your use of statistics are not insignificant. It is your reliance on the ‘deadweight’ argument.

    All public programmes have a degree of ‘deadweight.’ The example I gave in my piece on Left Foot Forward was on GP check-ups. Only a tiny proportion lead to the identification of a serious illness, so the vast majority is deadweight – so would you cut this expenditure?

    You could say the same about education – more than half don’t leave school with 5 GCSEs A-C (including mathematics and English), so why bother with the rest? Just put them on some cheap literacy and numeracy programme with some basic subject matter, teach them two hours a day, and reduce costs by over 50%? There is nothing intrinsic in your statistical analysis to suggest there is too much ‘deadweight’ in the programme. It’s just a number, and you’ve arbitrarily said ‘that’s too much.’ But the logic of your position is the culling of many critical programmes that I’m sure do work in your view. For some reason, you’ve just decided to single out EMA (perhaps because it’s a programme that the coalition wants to cut?)

  3. PART 3

    My final point is an observation. Throughout your analysis you have taken a very school-centric view. This is problematic in a number of ways. Though I haven’t got access to the numbers, given the nature of the cohort we are talking about, a larger proportion will actually be in college rather than school. This creates several issues for your argument.

    Firstly, college qualifications are measured by percentage achievement at various levels rather than UCAS tariff points. If, say, an intervention increases success by 7%, as seems plausible from the IFS analysis, this is enormous. A college that increases its achievement rates by that amount could go from being in the bottom to the top quartile! (Incidentally, had the IFS used success rates rather than achievement rates, drop-outs could have been factored in: the rate is calculated by achievement x retention. This doesn’t mean that scaling up of achievements is a problem that can’t be surmounted, which you seem to suggest – in fact, p.7 of the IFS report states explicitly that they can be, albeit in a rough and ready fashion.) I would encourage looking at the by-college evidence presented in this article in TES:

    Secondly, your assertion – and that is what it is – that headmasters can allocate the fund efficiently may be so (though I have some scepticism: it opens scope for gaming of the system, where students claim that they would not continue or would drop out unless they receive the payment, though the EMA suffers from gaming also, in fairness.) However, move into a college scenario where you may have 2000 students (Lambeth has 2,078.) I’m afraid your optimism re efficient allocation becomes impossible in such a scenario (Lambeth College will have 1000 students on EMA!) I should state that in the LFF piece I stated clearly that £50 million was better than nothing and £100 million would be even better still!

    What all this means in practice is that this policy is not a matter of ‘spending £500 million to get £50 million benefit.’ The only thing to back this up is your ‘deadweight’ argument which, as I have shown, is a nonsense. Actually, the overall benefit is significantly greater than that. The cost to the taxpayer of a NEET is £56,000 over their lifetime. So if just 9,000 end up as NEETs as a result of the cut (hands up, I doubled the expenditure over two years in the LFF piece but should also have divided it by two, realistically, given it’s a two-year cohort, which meant I quoted 18,000 instead of a more realistic 9,000 – silly me to hinder my case in that way!) then the policy would pay for itself in fiscal terms (and we haven’t even begun to discuss the individual and social benefits….) Given a youth unemployment rate of somewhere in the region of 20%, it will be more than 9,000 who become NEET.

    So if your argument is that we don’t want to pay now to save later, and are willing to take the social costs etc. on the chin over and above that, then fine. But why all this statistical dancing to get to that point? I say invest now, save later, unleash individual opportunity, and minimise the negative social impacts of low or no qualifications and unemployment. Or don’t. That’s the choice.