How can we learn if Teach First is working?

September 8, 2013

Last week I published a paper I wrote with Jay Allnutt about the impact of Teach First on GCSE attainment. We received a large amount of feedback on the paper, via a seminar presentation at the BERA conference, comments on a blog I wrote, twitter and email. Rather than simply present these research findings at researchED 2013, I showed the audience some of the feedback we received on the paper from the education community, to reflect on how research can move forward our understanding of major education policies. This blog gives an approximate overview of my presentation at researchED 2013, following on from Joe Kirby's (@joe__kirby) blog of his talk.

The paper I published with Jay Allnutt showed that schools participating in Teach First improved their GCSE results. In their second year of participation, the improvements were in the order of one grade in one of a pupil's best eight subjects (about 5% of a pupil-level standard deviation), or a two percentage point gain in the proportion achieving five or more GCSEs at A*-C (including English and maths).

Making these sorts of claims was not straightforward because…

…Schools were not selected at random to join Teach First

I think all new education policies should be randomised in their implementation, provided it is possible. Teach First received substantial government funds, so it has a duty to taxpayers to demonstrate its effectiveness. This was a major failing on the part of the civil servants who agreed to fund it (Sam Freedman @samfr spoke eloquently about this issue). Rather than recruit schools from 20 local authorities in London in the first year of the programme, recruitment should have taken place across a randomly drawn set of deprived schools, or alternatively local authorities should have been randomised into the programme.

This did not happen, producing a very serious identification problem because Teach First schools look very different to others – they are concentrated in London, are relatively deprived, were likely to have had particularly severe teacher recruitment problems and may have had headteachers who were particularly dynamic or risk-taking.

We try to deal with non-random selection in our estimation by finding a set of schools that look identical to the Teach First schools, except that they were not participating in the early years:

[Slide: how the matched comparison group of schools was constructed]
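
To make the matching step concrete, here is a deliberately simplified sketch of the idea in Python. The column names and the nearest-neighbour rule are hypothetical illustrations, not the paper's actual (more sophisticated) matching procedure; the key features it tries to capture are that comparison schools come from the same region, have similar pre-programme GCSE trends, and are themselves future Teach First joiners.

```python
# Illustrative sketch only: match each school joining Teach First in a given year
# to the closest not-yet-participating future joiner in the same region,
# based on its pre-programme change in GCSE scores.
# All column names (school_id, region, gcse_change_pre, tf_join_year) are hypothetical.
import pandas as pd

def match_controls(schools: pd.DataFrame, join_year: int) -> pd.DataFrame:
    treated = schools[schools["tf_join_year"] == join_year]
    # Candidate controls: schools that will join Teach First in a later year,
    # so they resemble joiners but are not yet participating.
    pool = schools[schools["tf_join_year"] > join_year]

    matches = []
    for _, t in treated.iterrows():
        candidates = pool[pool["region"] == t["region"]].copy()
        if candidates.empty:
            continue
        # Distance on the pre-programme change in GCSE scores
        # (demographics such as FSM share could be added to the distance).
        candidates["dist"] = (candidates["gcse_change_pre"] - t["gcse_change_pre"]).abs()
        best = candidates.nsmallest(1, "dist").iloc[0]
        matches.append({"treated_id": t["school_id"], "control_id": best["school_id"]})
    return pd.DataFrame(matches)
```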

And rather than simply comparing GCSE performance between our Teach First schools and a matched control group of schools, we run regressions that model changes taking place year-by-year at every school:

[Slide: the year-by-year difference-in-differences regressions]
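
Purely as an illustration of the general approach (the exact outcome variables and controls are set out in the paper), a matched difference-in-differences specification of this kind looks something like

$$ y_{ist} = \alpha_s + \lambda_t + \sum_{k=1}^{3} \beta_k \, D^{k}_{st} + X_{ist}'\gamma + \varepsilon_{ist}, $$

where $y_{ist}$ is the GCSE outcome of pupil $i$ in school $s$ in year $t$, $\alpha_s$ and $\lambda_t$ are school and year effects, $D^{k}_{st}$ indicates that school $s$ is in its $k$-th year of Teach First participation, and $X_{ist}$ are pupil characteristics. The $\beta_k$ are the year-by-year programme effects; the falsification tests mentioned below amount to adding 'placebo' indicators for the years before a school joins and checking that their coefficients are zero.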

The first set of responses we received to our paper asked…

Are there confounding factors?

“it seems possible that managerial teams that are ‘early adopters’ [of Teach First] are a different calibre to ‘followers’ who catch up with trends after it starts to look cool … [so it might be] … higher managerial chutzpah of the early adopters doing other things, nothing to do with TF, that cause results to rise”

[blog comment]

For this type of criticism to be valid, the timing of the adoption of ‘higher managerial chutzpah’ must exactly coincide with participation in Teach First. (Why? Well, we match on the change in GCSE scores, so any superior performance of the Teach First schools cannot have preceded the programme, and we perform falsification tests, which demonstrate that Teach First did not have an impact in the years before the school joined the programme.) But more importantly, if our findings were entirely due to ‘higher managerial chutzpah’ then we would not witness positive effects of Teach First in those departments that received TeachFirsters, compared to those in the same school that did not.

[Slide: comparing departments within the same school that did and did not receive Teach First participants]
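
Again only as a sketch of the logic rather than the paper's exact specification, the department-level check amounts to adding school-by-year effects and comparing departments within the same school:

$$ y_{idst} = \mu_{st} + \delta \, TF_{dst} + X_{idst}'\gamma + \varepsilon_{idst}, $$

where $\mu_{st}$ absorbs anything that changes for the whole school in year $t$ (including any new-found managerial chutzpah) and $TF_{dst}$ indicates whether department $d$ in school $s$ received a Teach First participant in year $t$. A school-wide confounder cannot, by itself, generate a positive $\delta$ in this comparison.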

The next set of feedback we received claimed that…

The effects are too large (or too small)

It seems most unlikely four [Teach First] teachers can raise the attainment of a school or department single handed by teaching their pupils better, not least as they simply do not teach enough pupils.

[blog comment]

Does ‘5% of a standard deviation’ represent significant and positive impact?

[@David_Cameron76 on twitter and in the audience at researchED!]

The next poster asked the far more important question…

The effects are too small, given the costs

Are these estimated marginal benefits worth the massive public subsidy that @TeachFirst receives? I don’t think so

[@jpjsavage on twitter]

We don’t answer this question in our research paper, but I hope we’ll be able to in a new Nuffield-funded project I’m working on, which is led by Ellen Greaves at IFS. We’ve been asked to look at the relative costs and benefits of all the different teacher-training routes. There are two major impediments we face: no national database of school and departmental participation in PGCE programmes exists, and SCHOOL DIRECT WAS NOT RANDOMISED (@samfr – this happened on your watch, did it not? Why no randomisation to stagger the roll-out?).

Some commenters on our research did not believe our findings because…

Your research is clearly biased

…Clearly you have identified what you see as a benefit of this approach (albeit based on the work of someone who clearly has such a positive view of what Teach First are doing that he has gone to work for them)…

[Jonathan Savage as blog comment]

“You’ve seen improvements in these schools and have just gone looking for explanations…”

[seminar audience member]

I have no interest in Teach First in the sense that I wasn’t paid by them to do the research, I trained as a teacher via a traditional PGCE, and the wider ‘narrative’ of the research I do isn’t underpinned by a theoretical view that programmes such as Teach First are likely to be successful. My co-author was trained by Teach First, worked in a secondary school whilst he conducted this analysis, and now works for Teach First. If people believe our research is biased it is hard for us to persuade them otherwise. It isn’t enough to claim that scientific and quantitative methods are less susceptible to bias, because this isn’t true (see this article on how doing maths is influenced by political beliefs). We should have been required to declare our methods – our outcome variables and the regressions we planned to run – before we conducted the research, as EEF evaluators are required to, but we didn’t.

We also had the usual comments from people who ask…

Can we ever ‘know’ anything?

You identify so many potential caveats, influencing factors, limitations to the validity of the methodology and other potential problems it is hard to take your findings seriously.

[Jonathan Savage as blog comment]

Paper ‘estimates’ benefits. Others claim them as facts.

[@jpjsavage on twitter]

Can you really claim direct causal links? Your paper is rightly tentative; the headlines are not.

[@egwilson on twitter]

There are philosophical questions about what knowledge and causality are; I’m not the right person to blog on these things. But my problem with the commenters above is that by invoking the ‘it is not causality’ argument, they are taking a value position that there is still no evidence that Teach First works. For me, research on policies should simply aim to shift the balance of probabilities on whether they work or not. Does our research alter the balance of probabilities that Teach First works? I think it does, though we can argue about how much. Am I convinced enough of my findings that I’d be happy to suggest to a headteacher that they join the programme? I think I would (though if they were really reluctant I definitely wouldn’t push it!).

Some commenters rightly ask what the ‘thing’ is that we’ve identified, or…

How should we interpret the effect?

Could you account for the vacancy-filling nature of the programme? That is, a mathematics teacher is better than no mathematics teacher?

[comment by email]

(My response: This is just part of the Teach First effect so we don’t want to ‘account’ for it in our estimation!)

I don’t think you can compare to other ITE methods, as not part of study?

[@BeckyFrancis75 on twitter]

(My response: Correct)

What if the ‘cause’ is involvement in “ITE” for a school/department & not “TF”?

[@JohnClarke1960 on twitter]

(My response: Great question! And we’ll be able to say more about this once we’ve completed our Nuffield project)

Have you proved that Teach First impacts on poor kids within these schools, as opposed to other kids?

[a question at researched2013]

(My response: Great question! Let me crunch the data and get back to you)

Now we start getting onto the really important questions, as raised by @samfr in a blogpost…

How might Teach First plausibly change the pupil experience?

[Slides: possible mechanisms through which Teach First might change the pupil experience]

And we had a few more great questions from the audience at researched2013:

  • Have PGCE courses seen a deterioration in applicants due to Teach First?
  • Why is Teach First more appealing than a PGCE, and what role does salary versus student loans play in this?
  • How can we know whether Teach First or PGCE applicants have qualitatively superior characteristics?
  • Is a major advantage of Teach First the two-year length of the programme, compared to just one year for a PGCE?


Yes, they’re young and inexperienced. But Teach First participants have the right stuff

September 6, 2013

IOE LONDON BLOG

Rebecca Allen

Today, Jay Allnutt and I published a new piece of analysis (PDF) showing that schools taking on Teach First participants have achieved gains in their GCSE results as a result of the programme. Our analysis tracks the performance of these schools in the first three years after they join the programme and compares them to changes in progress at a set of schools that look identical, except for their Teach First participation in that year.

We make sure this comparison set of schools have the same pupil demographic profile, the same prior levels and trends in GCSE performance, are in the same region of England and are all schools who will choose to join Teach First at some point in the future (formally this is known as a matched difference-in-differences panel estimation). Overall, school-wide gains in GCSE results are in the order of an improvement of about…

View original post 629 more words


Looking for research-curious teachers

May 7, 2013

Are you a teacher who is interested in how we should train our new teachers?

Would you like to observe how (fairly) large scale research is carried out?

I am looking for some research-curious teachers and headteachers who might be willing to sit on the advisory board for one of the research projects I am currently involved with. The purpose of the advisory boards is to help inform the research questions and methods, and to help interpret the findings of large projects.

The projects are:

  • A Nuffield-funded investigation into initial teacher training (this is a joint Institute for Fiscal Studies, National Foundation for Educational Research and Institute of Education project). This project investigates the costs and benefits of different teacher training routes. In particular, we would like to discover whether certain routes are more effective than others, in terms of recruitment, the costs (particularly time costs) and benefits to schools associated with training, and trainees' subsequent retention in state schools in England. It would be particularly useful to include teachers on our advisory board who have experience of mentoring PGCE students or GTP/SCITT/School Direct teachers, or who have recently trained themselves. The project is led by Ellen Greaves at IFS.
  • An ESRC-funded investigation of the early careers of teachers. This project investigates how the early experiences of teachers in training placement schools and first posts affect their subsequent likelihood of moving into particular types of school or exiting the profession altogether. Understanding how to create a schooling environment that retains the best teachers within the state maintained system is critical because we know that relatively large numbers of high quality teachers leave the profession every year, and this turnover is damaging to pupil achievement. The project will conduct a large survey of PGCE students and will match this data to institutional records and the School Workforce Census to track the careers of these teachers. It would be particularly useful to have early career teachers on our advisory board for this project.

The advisory board for each of these projects meets about twice a year for the two-year life of the project. The advisory boards would be held at teacher-friendly times, e.g. after 4:30pm during term time or possibly during half-term holidays. Obviously 100% attendance of advisory board members is difficult to achieve, so we will recruit more teachers than we need in the hope of achieving representation at each meeting. The meetings will be held at IOE or IFS (i.e. in Bloomsbury, close to Euston station), so teachers will need to think about whether it is feasible to travel into this part of London.

If you would like to get involved or would like to know more, please do contact me at r.allen@ioe.ac.uk. In your email it would be helpful if you could tell me how many years you have been teaching, what training route you originally took and what school you now teach in.

EDIT: Thanks for all the interest in these projects. We’ve filled the spaces on our Advisory Boards, but do still send an email if you’d like to give feedback on our proposed surveys.


Evidence-based practice: why number-crunching tells only part of the story

March 26, 2013

IOE LONDON BLOG

Rebecca Allen

As a quantitative researcher in education I am delighted that Ben Goldacre – whose report  Building Evidence into Education was published today – has lent his very public voice to the call for greater use of randomised controlled trials (RCTs) to inform educational policy-making and teaching practice.

I admit that I am a direct beneficiary of this groundswell of support. I am part of an international team running a large RCT to study motivation and engagement in 16-year-old students, funded by the Education Endowment Foundation. And we are at the design stage for a new RCT testing a programme to improve secondary school departmental practice.

The research design in each of these studies will give us a high degree of confidence in the policy recommendations we are able to make.

Government funding for RCTs is very welcome, but with all this support why is there a…

View original post 679 more words


Does ‘the gap’ matter to children eligible for free school meals?

March 26, 2013

IOE LONDON BLOG

Rebecca Allen

David Laws, the Liberal Democrat Minister for Schools, has been making a series of speeches over the past month about “closing the gap” in the attainment between pupils from deprived and more affluent backgrounds. Yesterday, he warned that schools should not be judged as outstanding by Ofsted if they failed to close the gap, a goal that sounds fair and even laudable in principle, but I believe is rather unfair in practice.

The “gap” is the difference in GCSE achievement between the average for pupils who are eligible for free school meals and the average for those who are not. Pupils eligible for free school meals have similar characteristics across schools since they all come from families claiming some sort of benefit. The problem is that the background of pupils who are not eligible for free school meals (FSM) will vary considerably across schools, since the group includes…

View original post 750 more words


Novices and Veterans: What new data tells us about teacher turnover and school deprivation

September 20, 2012

CMPO Viewpoint

Rebecca Allen and Simon Burgess

A new school year has just started, new students have just arrived – what about new teachers? Are there a lot of new faces in the staffrooms? One of the stories frequently told about schools serving poor communities is that they suffer from very high and damaging staff turnover. Few teachers stay a long time, and, relative to schools in the affluent suburbs, there is a constant ‘churn’ of staff. This lack of experienced teachers reduces the chances of new teachers learning the trade on the job, and means that both students and school leaders are forever coping with new names, personalities and teaching styles.

Is this true or urban myth? For the first time, we can start to answer this question systematically, moving beyond a collection of local anecdotes. New data collected from all schools about their workforce has the potential to hugely improve…

View original post 943 more words


How can London schools be so good, given the high cost of living for teachers?

September 20, 2012

IOE LONDON BLOG

Rebecca Allen

Chris Cook, the Financial Times education correspondent, has been writing about the Department for Education’s suggestion that the School Teachers’ Review Body (STRB) should consider whether greater variation in teachers’ regional pay is needed. He notes that greater variation in teacher pay would create a bizarre situation where schools in our most successful region (London) become even more generously funded, with a deterioration in funding in places where schools appear to struggle.

This observation raises the interesting question as to why London schools do so well, given that the high cost of living should make it difficult for them to recruit and retain the highest quality teachers. Why don’t the capital’s best teachers simply migrate to Stoke or Blackpool where their salary would provide them with a nice family home and a higher standard of living?

I would suggest that there are four possible explanations for this phenomenon, and it is…

View original post 537 more words