
The Teach-For-America Boost — Redux


The Urban Institute made national headlines last spring when it released an influential study suggesting that Teach For America recruits were more effective than other teachers in North Carolina's high schools. One criticism of the study at the time, though, was that the researchers were comparing the TFA teachers against a group of teachers whose training backgrounds were a hodgepodge.

In answer to the critics, researchers Zeyu Xu, Jane Hannaway, and Colin Taylor decided to update their study with a larger sample of teachers and students. They added data for 32 teachers and more than 2,000 students, and re-ran the numbers so that they could make more "apples to apples" comparisons. The results were the same: Across the eight subjects tested, the students of TFA teachers racked up bigger learning gains than their non-TFA counterparts.

The TFA teachers were also found to be more effective than teachers who had graduated from a fully accredited North Carolina teacher-training program and those who were licensed in the subjects they taught. The overall TFA boost, in fact, was bigger than the size of the learning improvement that students normally get from having a teacher who's been on the job for three years or more.

That last point is important, the researchers write, because one of the slams against TFA is that its teachers often leave the classroom after two years, cheating students of the benefits of more experienced teaching. You can read the full text of the revised study here.

Something to wonder about: Would the results be the same for elementary schools, where pedagogical know-how may be just as important for teachers as subject-matter expertise, if not more so?


TFA consistently lies about their data--that is common knowledge--if this data is coming from TFA . . . don't trust it! If you want to make your data show "value-added" gains, look to TFA to learn how to fudge.

The study looks to be well done, but I'm still not satisfied. They added more teachers, yes, but that only brings them up to 150 observations of 98 teachers over the course of 7 years. In other words, fewer than one teacher per district per year.

The comparison of TFA teachers to teachers with different levels of certification is interesting, but I want to see them compared to teachers who are put in the same situation they are -- teachers in struggling schools teaching low-performing students.

The authors sort of touch on this near the end, comparing TFA teachers with bottom-quartile scoring students to non-TFA teachers with bottom-quartile scoring students. This is where I'd really expect TFA to excel. Surprisingly, the TFA advantage shrank noticeably among this group -- to .061 standard deviations, about the same as having a teacher with 3-5 years of experience (.054 SD).

The sample size does seem a bit on the small side, but the results are still worth paying attention to (as long as we don't draw any overly simplistic conclusions from them, like "experience and training are bad for teachers, so we should staff all schools with recent college graduates and then toss them out after two years"). If we reject the interpretation that experience and training inevitably result in worse teaching, and speculate on other possible reasons for the results, we might come up with something useful.

For instance, maybe TFA teachers feel less constrained by all the pressures that come down on teachers because they aren't in it for the long haul, and that frees them up to pay more attention to their own students and do what's good for them. Or maybe they feel freer to risk trying new things, and some of those novel approaches or lessons are working well. If this is the case, it would make an argument for loosening some of the constraints on all teachers, putting more attention on being responsive to what's happening with one's own students and less on conforming to externally imposed standards and expectations, it seems to me.

Or perhaps there's some kind of self-selection effect--maybe people who go into TFA have a particularly good mix of subject-matter competency, social awareness, and idealism that results in their being more effective self-taught teachers. I'm not sure if there would be any policy implications to be drawn from this, but it might be interesting to explore the question. Actually, I can think of one policy implication: if this is the case, there's probably a strong scale effect. That is, if people decided to try to recruit a majority of new teachers as TFA teachers, perhaps by upping the financial rewards somehow, whatever advantage is conferred by the current mix of self-selection factors would probably disappear as the factors entering into people's decisions to join TFA changed.

In any case, I'd be interested in seeing more research. In particular: I assume some of the TFA teachers do opt for teaching as a career, go on to become experienced teachers, and possibly even get some kind of formal training at some point. I'd like to see comparisons of their later teaching both with their own early teaching and with randomly selected current TFA teachers.

I wonder if what we also see is young, recent college graduates with no children who are only planning on being at the school for 2 years, so they work like crazy during that time and then leave . . . leaving students with a sub for the next two years.

We have been looking at this kind of data for our school district and the results have not been as favorable for TFA.
