Monday 6 December 2010

Does EMA improve attainment?

Yesterday I published a rather long statistical analysis showing that most EMA money was going to kids who would have stayed on at college without it. That is, EMA was having very little impact on the participation rate, because it was mostly subsidising kids who would have participated anyway.

This is an important issue, because the stated main aim of EMA was to increase participation. Most of the analyses of its effectiveness have revolved around this issue, and it's often discussed as removing one of the 'barriers to participation'. So the fact that it doesn't increase participation very much, and that most of it goes to kids who would have participated anyway, are very significant facts.

However, despite the fact that I have just spent an inordinately long time examining the stats on this, it is not the only issue. In fact, to me, it is not even the main issue. It was always fairly obvious to me that if you pay not-very-well-off 16-year-olds to do something not very arduous that a lot of their mates are doing, you will get a lot more of them doing said thing. If you pay people to 'participate' in something, you will raise participation rates; the fact that large sums of taxpayers' money have been spent to prove what I thought was a fairly elementary economic proposition is slightly baffling. The one useful thing this research can tell us is the size of the increase in participation, which as I have shown is fairly paltry.

However, to my mind participation is not the only point of EMA, and it's not the only point for two reasons.

First, EMA could have improved participation rates by 300% and that would not necessarily make it a success. Even if it had improved participation rates and you could prove it had, even if you could definitively prove that everyone receiving it would drop out without it, that would still, to my mind, prove nothing at all about EMA. To prove the success of EMA, proof that it has increased participation is necessary, but not sufficient. You also need to prove it improves attainment. This is because, funnily enough, education is about more than just 'participation', just as jobs are about more than presenteeism. Does EMA improve attainment? Does it increase what I think we have to wincingly call 'life chances'? Are the kids receiving it going on to university? To better jobs? To not be on the dole? If all EMA does is get kids to sit quietly at the back of the class for two years before emerging with three Us at A-level, then it may have increased participation and got the unemployment numbers down, but it hasn't added anything to the lives or economic potential of the kids involved; it has merely delayed them either getting the same job they would have got anyway, or becoming a NEET.

This may sound an unduly pessimistic point, but it is what I see a lot of when I am teaching. I have certainly taught kids on EMA who would have dropped out if they hadn't received it - and an awful lot of them really should have dropped out. They were only there for the money. They did poor work, only just made the attendance threshold to get their money, rarely did homework, rarely contributed in class, never met coursework deadlines and in general generated an awful lot of work for their teachers - work that would have been better spent on the kids who actually wanted to be there. So what we really need the research to show is this: how many kids getting EMA should be at college, want to be at college, will do well at college, and would not be there without their EMA?

Second, the reverse of this holds true. Even if EMA hadn't improved participation rates at all, even if all of it was going to pupils who would otherwise have stayed on in education, it could still be a success if it had improved those students' grades. The main way it would have achieved this would have been in allowing students to stop paid work and devote more time to their studies. If this were the case, then I would have to concede that despite the 'deadweight' issue I outlined previously, it was still having some benefits. However, again, my anecdotal evidence suggested that pupils were not doing this. Lots of pupils on EMA continue in paid work. Even the NUS don't expect that students receiving it will give up paid work, merely that they will reduce their hours. And of course, it isn't necessarily the case that giving up or reducing paid work results in an increase in the time you devote to your studies.

Anyway, my anecdotes aside, let's consider the evidence. What impact does EMA have on attainment? Because the main aim of EMA was to increase participation, most of the surveys on it measure that. Only the most recent IFS survey, from what I can see, analyses attainment. This has been reported very positively, as improving attainment rates by 5-7%. Let's have a look at what it says.
 "Both males and females saw improvements in average A Level tariffs of roughly 4.5 percent of the base level." 
What this means depends heavily on what the base level is, and the base level differs between the two methods the IFS researchers use, and again depending on how they break the figures down - by sex and by age. This is not just hairsplitting. One of their methods defines the base level as "56.7, which we measure as the average Key Stage 5 tariff across all individuals in the pilot and control areas." 4.5% of 56.7 is an improvement of 2.5515 points. That would take you up to 59.2515, which is very close to one D grade but is unfortunately still just one E grade at A-level. An improvement of 2.5515 points equates to ONE EIGHTH of ONE A-level grade. All of the IFS's findings on average A-level tariffs are a classic example of the difference between statistical significance and real-world significance. This is not something I have made up; it is acknowledged by statisticians - here is how an introductory guide to statistics explains the difference:

"Many researchers get very excited when they have discovered a "statistically significant" finding, without really understanding what it means. When a statistic is significant, it simply means that you are very sure that the statistic is reliable. It doesn't mean the finding is important or that it has any decision-making utility. 
For example, suppose we give 1,000 people an IQ test, and we ask if there is a significant difference between male and female scores. The mean score for males is 98 and the mean score for females is 100. We use an independent groups t-test and find that the difference is significant at the .001 level. The big question is, "So what?". The difference between 98 and 100 on an IQ test is a very small difference...so small, in fact, that it’s not even important." 
Precisely. The one-eighth-of-a-grade finding is statistically significant, but it isn't significant or important in the way we use those terms every day. One eighth of an A-level grade is not at all educationally significant. It's the difference between getting an E and getting a slightly better E. I would think that an eighth of one A-level grade may well be within the margin of error of most examination boards: if you sent off the exam papers of every kid in the cohort to be remarked, you'd probably get a greater fluctuation than one eighth of a grade. It's the equivalent of 5 marks out of 400 on the uniform mark scheme - that is, 1.25% of an entire A-level. And of course, most kids do three A-levels, so the impact over their entire A-level studies is about 0.4%.
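For anyone who wants to check that arithmetic, here is a quick sketch in Python. The grade widths are my assumptions, not the IFS's: 20 UCAS tariff points per A-level grade step (E=40 up to A=120 on the old tariff) and 400 UMS marks per A-level with 40-mark grade bands.

```python
# Sketch of the arithmetic above. The grade widths are my assumptions,
# not figures from the IFS report.
BASE_TARIFF = 56.7      # IFS average 'Key Stage 5' tariff (pilot + control)
IMPROVEMENT = 0.045     # reported improvement: ~4.5% of the base level
POINTS_PER_GRADE = 20   # tariff points per A-level grade step (assumed)
UMS_PER_GRADE = 40      # UMS marks per grade band (assumed)
UMS_PER_ALEVEL = 400    # UMS marks in one A-level (assumed)

gain_points = BASE_TARIFF * IMPROVEMENT        # ~2.55 tariff points
gain_grades = gain_points / POINTS_PER_GRADE   # ~0.13: one eighth of a grade
gain_ums = gain_grades * UMS_PER_GRADE         # ~5 UMS marks
one_alevel = gain_ums / UMS_PER_ALEVEL         # ~1.25% of one A-level
three_alevels = one_alevel / 3                 # ~0.4% across three A-levels

print(f"{gain_points:.2f} points = {gain_grades:.2f} of a grade "
      f"= {gain_ums:.1f} UMS marks = {one_alevel:.2%} of one A-level "
      f"= {three_alevels:.2%} of three")
```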

Now, there is a question about what the correct average base level is. The IFS express some confusion about this themselves, saying 
'Strictly speaking, this calculation does not compare like with like since the estimated impact is the average effect across pilot areas whereas the base reported is the average level across the whole sample (pilot and control areas). However the choice of an appropriate base is not clear and this calculation provides a reasonable approximation.'
When they break it down by age, sex and race, there is a huge fluctuation in base levels. I can't definitively find the correct average base level - obviously it isn't as simple as averaging all the figures given in a table, because there are more females than males, a lot more whites than ethnic minorities, and so on: you would need a weighted average. So I cannot find the definitive average base level for all pupils, but the IFS do repeat and reuse the figure of 56.7 a couple of times, so that might be it. In which case, improvements of 5% are educationally insignificant. Even on some of the higher base levels, where improvements of a few percent mean more, my analysis above suggests we are perhaps talking about an impact of 1 or 2 per cent of the total grades achieved over three years.
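To see why a simple average of the subgroup base levels would mislead, here is a toy weighted-average sketch; the subgroup sizes below are invented purely for illustration, not taken from the IFS tables.

```python
# Toy example: a simple mean of subgroup base levels is not the cohort
# average unless the subgroups are the same size. All numbers invented.
subgroups = [
    # (label, base tariff, cohort size) - illustrative only
    ("largest group",  60.0, 10_000),
    ("smaller group",  50.0,  1_500),
    ("smallest group", 45.0,    800),
]

simple_mean = sum(base for _, base, _ in subgroups) / len(subgroups)
weighted_mean = (sum(base * n for _, base, n in subgroups)
                 / sum(n for _, _, n in subgroups))

print(f"simple mean:   {simple_mean:.1f}")    # 51.7 - misleading
print(f"weighted mean: {weighted_mean:.1f}")  # 57.8 - dominated by the big group
```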

The reason why all these figures are so statistically significant yet so insignificant in the real world is that they start from such a low base. It's like a school of 1,000 where one kid gets an A grade: if the next year two kids get an A grade, you have a 100% improvement! The headline figure is completely true and looks fantastic, but dig deeper and you see the problem.

If we analyse some of the other 'significant' improvements the IFS talk about, we can see this.  In the news reports on this paper, the impact of EMA on ethnic minorities was trumpeted.  Here's what the impact was:
"by age 19, the EMA increased the average A Level scores of these two groups [black females; Asian females] by around 8.6 points and 16.1 points respectively." I don't quite know how they can guarantee this improvement was in A level scores - I discuss this further below.  That aside, 8.6 points equates to about two fifths of one A-level grade, and 16 points to about four fifths. Be clear, it's not two fifths of one A-level, but two fifths of one A-level grade. It's actually the difference between just getting a D grade, and getting a middle D grade, or just getting a C grade, and getting a middle C grade.  To go back to my uniform mark scheme difference, it's the equivalent of 16 and 32 marks respectively out of the 400 available for one A-level, which works out as a 1.3 and 2.6% impact on a two year three A level course. I think these stats do, just, have some 'real world' educational significance, but still not massively so.  I would also point out that for by far the largest segment of females - white females - there was no statistically significant impact on results. There are similar results for males. So what we are seeing here is - again - that where there EMA does have an impact on attainment, it is concentrated on a very small sector of those receiving it. That is, again, a large amount of EMA is wasted. On balance, I would want to preserve the small but, I think, ‘real-world’ significance of the gains to ethnic minority females. A system that targeted money at those in need rather than giving it to all those beneath a crude income level would get most of the gains of the current system for much less money. Funny enough, that is exactly what the government have proposed.

Two other minor things I would point out. First, a lot of the findings on attainment are less statistically significant than they might be - some only at the 10% level. Second, there seems to be quite a bit of uncertainty about the points tariff the researchers talk about. At one point they refer to it as the 'Key Stage Five tariff'. It's a common misconception, even amongst educators, but there is actually no such thing as Key Stage Five - at least not in the official sense that there is with Key Stages Three and Four. Then, at other times, whilst appearing to refer to the same thing, they talk about the 'A-level tariff'. Of course, there is such a thing as A-levels, but I don't think there is such a thing as an exclusive 'A-level tariff'. I have assumed what they are talking about here is the UCAS tariff, especially as they use the phrases 'Key Stage Five tariff' and 'A-level tariff' interchangeably. The UCAS tariff includes a whole range of different qualifications, academic and vocational. So when I refer here to A-levels, don't assume that I am excluding all the kids doing vocational courses. The tariff includes them, and believe me, an increase of 2.5 points is just as unimpressive for BTEC Nationals as it is for A-levels - arguably even more unimpressive.

Overall, what the findings do here is confirm my anecdotal suspicions, and those of everyone I know who has taught in a school with EMA. You can split the kids getting EMA into three groups:
  • Group 1 - Most EMA goes to kids who would have stayed on anyway. It has a limited impact on their decisions about paid work, and even where it does affect that decision, it has a very limited impact on whether they reallocate the freed-up time to schoolwork. Hence, it has a limited impact on their eventual grades.
  • Group 2 - Of the EMA that does go to kids who would drop out without it, much goes to kids who are only there for the money: their eventual qualifications are poor, perhaps non-existent, and unlikely to improve their lives.
  • Group 3 - The remainder of pupils on EMA - I'd make an educated guess of about 2-3% of the entire number of kids on EMA, a guess informed by although not based on the IFS report - would drop out without it, or would struggle on with reduced grades. These kids deserve the money.

If there were no possible other way to get these kids the money, then I guess you could have an agonised debate about whether it was worth spending £535-540 million in order to help 12-18,000 kids stay on at school or get better grades. On the 2% scenario, each kid who has been genuinely helped is costing you around £45k a year. That seems fairly awful value for money to me. I am sure someone will argue that the sum is worth it if it stops those 12,000 ending up on the dole - but then you'd have to prove those kids would otherwise have ended up on the dole, and there is no guarantee they would. I would suspect that the kids who end up on the dole are more likely to be those from Group 2, for whom EMA merely delays their entry into NEETdom. To prove it either way, you would need another complicated survey of the future careers of EMA recipients, and a way of statistically justifying the conclusion. If - and it's an important if - there were no other way to get these kids the money, then perhaps, at a pinch, you could justify spending half a billion on kids who didn't need it. I probably wouldn't buy that argument, but I might concede there was an argument.
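As a back-of-envelope check on that cost-per-pupil figure, using the spending and headcount figures above:

```python
# Rough cost per genuinely-helped pupil, using the figures in the text:
# ~£535-540m annual spend, 12,000-18,000 kids genuinely helped.
TOTAL_COST = 540_000_000

for helped in (12_000, 18_000):
    print(f"{helped:,} kids helped: ~£{TOTAL_COST / helped:,.0f} each per year")
# 12,000 kids -> ~£45,000 each; 18,000 kids -> ~£30,000 each
```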

Fortunately, there is no need for such an agonised debate. There are other ways of targeting the money at the kids who will actually make use of it, so the above paragraph is completely academic. Of the several possible ways of targeting money at the kids who really need it, probably the most effective is the one the government are planning - give the fund to schools and colleges to distribute in cases of genuine need. Hilariously, some people have said that this will increase the burden of bureaucracy on schools. Those people have clearly not seen the burden of bureaucracy the current EMA system brings, which just reinforces my general suspicion that very few of the people defending EMA have ever worked in a school.

I will discuss what this fund should look like in my next post.

4 comments:

  1. From Anthony Painter:

    OK. There seem to be a number of flaws in this analysis.

    1. The difference between the area and individual impact of EMA.

    The figures you mainly quote above are pilot area averages (though you do jump about a bit). It is important to read pp. 6-7 of the IFS report to understand why this is problematic. The key point is:

    "We therefore suggest as a rule of thumb that, in order to obtain the effect of the EMA across those who received it, the estimates above be multiplied by a factor of 2½ (for outcomes at age 16) or 3 (for participation at age 17 and attainment outcomes)."

    So the participation and attainment impacts are in the main much greater than you suggest.

    2. The results of the EMA on black and Asian students (which you applaud and accept to be significant) are not just an anomaly - they indicate the real impact of the policy. These communities are more likely to be recipients of EMA. It should be emphasised, however, that even here the impacts are underestimates - you have to be enrolled and participating in order to receive EMA (as well as meeting the income criteria).

    3. You talk about A-levels but largely neglect level 3 qualifications, where the impact is greater. There is a reason for this. Colleges use EMA as a 'nudge' - receipt is dependent on attendance and punctuality. Without EMA, there's no nudge and grades decline. Vocational level 3 qualifications are entry level in a whole range of careers....

    4. Your deadweight argument fails because you do not subject alternative schemes to the same level of scrutiny. The coalition's scheme has a fundamental weakness: you can't identify those who would drop out without EMA. This creates a moral hazard. Students have an incentive to threaten to drop out, or to threaten not to enrol, in order to force the institution's hand. How do you judge the genuine case? So the bureaucracy argument is neither here nor there - a large number of people who are helped by EMA will inevitably be missed by a discretionary scheme. The choice is between a £500 million policy which meets its objectives and a £50 million scheme that doesn't, with the knock-on social, economic and fiscal impacts of swelling the ranks of the NEETs.

    So, for all the statistical dancing around here: your numbers are flawed and misinterpret the evidence; you fail to appreciate the nudge aspect of the policy and how this is particularly useful in a college environment; and you fail to properly scrutinise the alternative scenario next to the default. And if it's abolished, tens of thousands more students will be condemned to failure.

    For these reasons I'm 100% sticking with EMA. It works compared with the alternative.

  2. Oh, and finally... the numbers of those affected are far greater than you argue. You state that the policy helps '12,000-18,000' kids. This is completely wrong. In 2009-10 there were 643,000 recipients. According to your statistic of 83% deadweight loss, that means 109,310 will be positively impacted. Using the survey figure of 10% retention, it's 64,300. This is a two-year cohort, so very quickly, over say a ten-year period, you are looking at 300,000 or so to 550,000 impacted. In anybody's world that's significant.

  3. My reply is too long - see here http://tinyurl.com/37c7von

    and here http://tinyurl.com/3ywm5dv

    and here http://tinyurl.com/37russg

  4. I must admit your statistics are interesting, although I've always believed that there are an awful lot of things which are difficult at best and impossible at worst to quantify statistically, including much of the above analysis. However, as someone who receives EMA and knows a lot of peers who also receive it, a 5% figure for those who genuinely need/deserve it almost exceeds my expectations. Personally, apart from the possible £50-£100 at the start of term for textbooks and supplies, the only real cost I have ever seen act as a barrier to education for some people is transport - and that, I would have thought, would be quite easy for schools to identify and calculate the necessary benefit for. So I would broadly agree with your conclusion, but don't place too much significance on incredibly detailed and ambiguous statistics; they're too easy for people to shoot down.
