I was wrong – EMA is even more of a waste than I thought.
In this post on the issue of EMA deadweight, I analysed a report by the CfBT Education Trust which the NUS (National Union of Students) had cited wrongly.
What I didn’t notice at the time was that the CfBT had themselves cited research from the IFS (Institute for Fiscal Studies) wrongly.
Essentially, what happened is this: the IFS calculated the impact EMA had on participation among all kids living in a pilot area. Then, to get the impact on kids who were actually receiving EMA, as opposed to merely living in the area, they grossed up that original figure to account for the fact that not every kid living in an area got EMA. It is these grossed-up figures that appear in the main body of the report.
The CfBT made the mistake of assuming these were the ungrossed figures, and therefore grossed up again. When reading the CfBT report I was a bit baffled by this, and it’s only now that I’ve worked it out. Essentially it means the CfBT overestimate the impact of EMA by a factor of about 2.75. Hilariously, even with this double grossing up, EMA still wasn’t that effective, and the CfBT, who defend EMA, still had to admit that the deadweight cost ‘may be thought to be too high.’
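To see the mechanics, here is a minimal Python sketch of grossing up once versus twice. The numbers are illustrative (a round 43% take-up rate and a 2.9-point area-level impact); the factor of about 2.75 presumably reflects the exact IFS take-up rates rather than the round figures used here:

```python
# Grossing up: convert an impact measured across everyone in an area
# into an impact per EMA recipient by dividing by the take-up rate.
take_up = 0.43        # illustrative share of students receiving EMA
impact_on_area = 2.9  # illustrative percentage-point impact on the whole cohort

# Correct: gross up once to get the per-recipient impact.
impact_on_recipients = impact_on_area / take_up   # ~6.7 points

# The CfBT's slip: treat the already-grossed figure as ungrossed
# and divide by the take-up rate a second time.
double_grossed = impact_on_recipients / take_up   # ~15.7 points

# A double gross-up always inflates the figure by 1 / take_up.
print(round(double_grossed / impact_on_recipients, 2))  # 2.33 with these numbers
```

Whatever take-up rate you plug in, the second division inflates the impact figure by the reciprocal of that rate.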
Let’s just get this clear, because it is quite funny. The CfBT are defending EMA. They do their stats, slip up, gross up twice, and overestimate the impact of EMA by a factor of about 2.75.
Even after overestimating the impact of EMA in this way, they still have to admit the deadweight is huge.
And they are supporters of EMA!!
Here’s the full stat breakdown.
The CfBT report, written by Mick Fletcher, used the IFS research to claim:
The most recent figures show that 43% of full-time students aged 17-18 receive EMAs. Taking an average of recent evaluations EMA may have increased participation by up to 7 percentage points, leaving some 36 percentage points as deadweight.
As I showed, those 36 percentage points equal a deadweight of 83 per cent. However, to get this you have to accept the assertion that ‘taking an average of recent evaluations EMA may have increased participation by up to 7 percentage points.’ I assumed this 7-point figure was taken from the IFS study which Fletcher cites earlier on. But when I reread Fletcher’s rationale for using the 7-point figure instead of some of the other ones in the IFS report, I realised he’d used the wrong figure for this particular calculation.
The IFS report comes up with sixteen different figures for EMA’s impact on participation. It calculates impact using two methods: comparing pilot areas to control areas, and comparing all pilot areas to the rest of England. For each method, it calculates the effect on the entire cohort and the effect on just those who actually took up EMA. It then does this separately for 16 year olds and 17 year olds, and for each sex. It would have been nice if they had also produced a single figure combining both sexes and both ages, as that might have given a better overview. Anyway, this is what we have.
Now, I am not particularly concerned about which method, sex or age you choose. What matters for this particular calculation is the distinction in the left-hand column: whether we look at the impact across the pilot areas as a whole, or the impact across EMA recipients only. The former produces much lower figures, because the increase in participation is measured against the entire cohort; the latter measures it only among EMA recipients, so the impact rates are obviously higher. What the IFS mean, I take it, is that EMA is responsible for x percentage points of participation among those EMA recipients.
So this second figure is exactly the one we want. We don’t have to reduce it down by the method I show above, because they’ve already done that. If we were going to go through that method, we should start from the figure for the impact across the pilot areas as a whole; and if we do, we obviously end up with the same figure as in the ‘impact across EMA recipients’ column, because the whole exercise is based on the same data.
So, when I wrote this in my previous post:
We have 100 sixth form students. 35.7 get EMA but would be at sixth form anyway. 7.3 get EMA and wouldn’t be in the sixth form without it. The other 57 are at sixth form and don’t get the cash. Of the 43 kids who get EMA, it’s only 7.3 kids who are actually being incentivised by it to stay on at 6th form. The other 35.7 would be there anyway. Thus, the deadweight cost is 35.7/43, or 83 %
Actually I should have used the following figures:
We have 100 sixth form students. 40.1 (43-2.9) get EMA but would be at sixth form anyway. 2.9 get EMA and wouldn’t be in the sixth form without it. The other 57 are at sixth form and don’t get the cash. Of the 43 kids who get EMA, it’s only 2.9 kids who are actually being incentivised by it to stay on at 6th form. The other 40.1 would be there anyway. Thus, the deadweight cost is 40.1/43, or 93.3%.
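The deadweight arithmetic in both versions can be checked in a couple of lines, using the figures straight from the two worked examples above:

```python
# Deadweight: the share of EMA recipients who would have stayed on anyway.
def deadweight(recipients, incentivised):
    return (recipients - incentivised) / recipients

# My original (wrong) figures: 7.3 of the 43 EMA kids incentivised.
print(round(deadweight(43, 7.3) * 100, 1))  # 83.0
# Corrected figures: only 2.9 of the 43 incentivised.
print(round(deadweight(43, 2.9) * 100, 1))  # 93.3
```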
Thus, on this data only 6.7 per cent of pupils receiving EMA were actually encouraged to stay on by it. The reason this isn’t exactly the 7.3 from the table above is that Fletcher uses a figure of 43% for EMA take-up, whereas the IFS use the exact rates for the cohorts they are discussing: 40% for 16 year olds and 30% for 17 year olds. It’s all in appendix D.
So, if we want to calculate the number of people on EMA who are only carrying on in further ed because of EMA, then it doesn’t matter which method we use or whether we look at 16 year olds, 17 year olds, males or females. What matters is that we use the SECOND row for each of these segments.
That gives us 5.5%, 7.3%, 2%, 5.5%, 6.3%, 8.1%, 0.6% and 4.5%. It isn’t strictly accurate to take a simple average of these, because there will probably be more females than males and definitely more 16 year olds than 17 year olds, but let’s do so anyway to get a rough idea: 4.975%. So, according to the IFS, the most reputable organisation out there, about 5% of EMA recipients would not be there if they weren’t getting the money. That means 30,000 kids are in college who wouldn’t have been otherwise (and this tells us nothing about their drop-out or attainment rates), and we have given 570,000 kids £500 million they didn’t need.
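A quick sketch of that averaging and the headline totals (the 600,000 recipient total is my assumption, implied by the 30,000 + 570,000 split above):

```python
# Per-recipient impact figures quoted above, in percentage points.
impacts = [5.5, 7.3, 2.0, 5.5, 6.3, 8.1, 0.6, 4.5]

# Crude unweighted average; a proper one would weight by cohort size.
avg = sum(impacts) / len(impacts)
print(round(avg, 3))  # 4.975

# Back-of-envelope totals, assuming roughly 600,000 EMA recipients.
recipients = 600_000
incentivised = recipients * avg / 100          # roughly 30,000 stay on because of EMA
would_stay_anyway = recipients - incentivised  # roughly 570,000 would be there anyway
print(round(incentivised), round(would_stay_anyway))
```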
But apparently, making savings would be impossible.
I'd hate to see what Labour thought a waste of money was.