
Los Angeles TFA Teachers Outperform Peers


A study financed by the Eli & Edythe Broad Foundation shows that students taught by Teach For America teachers in Los Angeles outperformed peers who were taught by other teachers—including veterans with many more years of experience.

Initially, the study was performed for internal purposes. Having provided quite a bundle of financial backing for TFA, Broad wanted to get a sense of how its investment was paying off in terms of stronger student learning. But officials for the group said they ultimately decided to make the study public given the growing national conversation about teacher effectiveness.

The study compared the California state test scores of students taught by 119 second-year TFA teachers in grades 2-12 with those of students taught by 1,190 non-TFA teachers in the same grade levels, subjects, and schools, during 2005 and 2006.

The results are interesting for a few reasons. First, TFA teachers were linked to test scores that were 3 points higher overall than those of non-TFA teachers, even veterans who had been in the classroom much longer. And the gap was slightly larger, 4 points, when TFA teachers were compared only with non-TFA teachers with similar years of teaching experience.

It's important to know, though, that since students weren't randomly assigned to TFA teachers or non-TFA teachers, it isn't scientifically possible to say that TFA is the reason why the teachers were more effective. These data are certainly suggestive, but they aren't evidence of a causal link.

As with any study, there are a couple of caveats. For instance, the findings here combine reading and math, so it's not entirely clear how to interpret them by subject. Content area is an important distinction because previous studies of TFA have shown that the group's high school instructors had a particularly strong correlation with improved math achievement.

The folks at Broad think this type of analysis is indicative of what will be possible as data systems continue to grow and more students can be linked to their teachers. One interesting feature of the study is that analysts used two different growth methodologies and found that one was much better at explaining variability in test scores. That's important because there isn't really good consensus on the "best" methodology for gauging a teacher's effect on student achievement.
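To picture the "two growth methodologies" point, here is a toy value-added comparison on simulated data. Everything in it (the data, the built-in 3-point effect, and the two models: simple gain scores versus covariate adjustment for prior scores) is an assumption for illustration, not the study's actual method.

```python
# Hypothetical sketch: comparing two growth methodologies on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
tfa = rng.integers(0, 2, n)             # 1 = TFA teacher, 0 = comparison
pretest = rng.normal(300.0, 25.0, n)    # prior-year scale score (made up)
# assume a small true TFA effect of about 3 scale-score points
posttest = 0.8 * pretest + 60.0 + 3.0 * tfa + rng.normal(0.0, 15.0, n)

# Methodology A: simple gain scores (post minus pre), regressed on TFA status
gain = posttest - pretest
Xa = np.column_stack([np.ones(n), tfa])
beta_a, *_ = np.linalg.lstsq(Xa, gain, rcond=None)

# Methodology B: covariate adjustment (post regressed on pre plus TFA status)
Xb = np.column_stack([np.ones(n), pretest, tfa])
beta_b, *_ = np.linalg.lstsq(Xb, posttest, rcond=None)

def r_squared(y, X, beta):
    """Share of score variability the model explains."""
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

print("gain-score TFA estimate:", round(beta_a[1], 2),
      "R^2:", round(r_squared(gain, Xa, beta_a), 3))
print("adjusted TFA estimate:  ", round(beta_b[2], 2),
      "R^2:", round(r_squared(posttest, Xb, beta_b), 3))
```

Both methodologies recover a TFA estimate near the built-in 3 points here, but the covariate-adjusted model explains far more of the score variability, which is the sense in which one methodology can be "much better" than another.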

Second, the paper is an example of the kind of analysis that might be useful for higher ed institutions and programs that prepare teachers as they consider ways of improving the effectiveness of their own programs.

TFA has already begun those efforts, as I reported earlier this year.

12 Comments

Kool-aid drinker Sawchuck--Isn't a Broad-bankrolled study of TFA akin to tobacco-company "studies" showing cigarettes are good for you? Is there any disinterested research to confirm the THREE-POINT (not statistically significant) difference in TFA teachers' supposed test-score boost? I mean, how would you be able to show that a TFA teacher in one school was "outperforming" a "non-TFA" teacher in another--with different kids, different SES, different faculty teammates? It boggles the mind. And how can you accept, without question, the lumping of "non-TFA" teachers together as one sample group? How, for example, would this sample of TFA teachers perform when compared to a sample of National Board Certified (non-TFA) teachers? Or next to a sampling of Harvard, Columbia, or Stanford ed school grads?

Three points? Is it statistically significant? But more importantly, is it educationally significant? What is the effect size of this small difference? At least if the raw scores were revealed, one could estimate how much better the TFA educators are.

There are methods of improving education which yield large gains, much larger than 3 points. For example, Shayer & Adey have detailed how to accelerate students in their book "Really Raising Standards." Physics educators have improved normalized score gains on conceptual evaluations from below 25% to between 30 and 80% using various programs such as Modeling from Arizona State (modeling.asu.edu).

Why not go for the large gains rather than the small ones?
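The commenter's effect-size question can be made concrete with a bit of arithmetic. The 3-point difference is from the article; the score standard deviation below is purely an assumption, since the study's raw score spread isn't reported here.

```python
# Hypothetical arithmetic for the effect-size question.
# The 3-point gap is from the article; the score SD is an assumption.
raw_difference = 3.0   # TFA vs. non-TFA scale-score gap (from the study)
assumed_sd = 40.0      # typical spread of student scale scores (assumed)

# Cohen's d: the mean difference expressed in standard-deviation units
cohens_d = raw_difference / assumed_sd
print(f"Cohen's d = {cohens_d:.3f}")   # 0.075, a small effect by convention
```

If the true score spread were anywhere near 40 points, a 3-point gap would be a small effect in standardized terms, which is exactly why the commenters ask about educational significance, not just statistical significance.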

I was in a training session yesterday where the presenter chastised us for teacher turnover rates. She suggested a best practice would be giving new teachers five fewer students for their first year and no hard-to-manage students. Add two challenging students the next year, and then let teachers sink or swim the third year. If we are being proactive in student assignments, it would definitely influence this data.

In response to some comments here, the regressions showed that the results are statistically significant at the 1 percent level--that is, if a TFA teacher's presence or absence actually made no difference, there would be less than a 1 percent chance of seeing a score gap this large from statistical noise alone.
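The idea behind that significance claim can be sketched with a toy permutation test. All the numbers below are invented; the point is only to show what "could this gap be noise?" means: shuffle the teacher labels many times and see how often a gap as large as the observed one appears by chance.

```python
# Toy permutation test illustrating statistical significance (all data assumed).
import random

random.seed(1)
n = 1000
# simulate score gains with a built-in 3-point group difference
tfa_gains = [random.gauss(5.0 + 3.0, 15.0) for _ in range(n)]
other_gains = [random.gauss(5.0, 15.0) for _ in range(n)]

def mean_diff(a, b):
    return sum(a) / len(a) - sum(b) / len(b)

observed = mean_diff(tfa_gains, other_gains)

# Under the null hypothesis the labels don't matter, so shuffle them
# repeatedly and count how often the shuffled gap matches the observed one.
pooled = tfa_gains + other_gains
extreme = 0
trials = 1000
for _ in range(trials):
    random.shuffle(pooled)
    if abs(mean_diff(pooled[:n], pooled[n:])) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed:.2f}, p-value: {p_value:.3f}")
```

A small p-value says a gap this size would rarely arise from label-shuffling noise alone; it does not, by itself, say the gap is educationally large or that TFA training caused it.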

In response to commenter Bernard, the TFA teachers were compared to non-TFA teachers in the same schools, as I noted in the item.

I think it is interesting that teacher effectiveness is continually linked to assessment using paper-and-pencil tests, multiple-choice tests, performance or skills-based tests, and standardized tests, all of which have had weak links to extended lifelong learning or the ability to function with 21st-century skills. Projects, presentations, and authentic constructions and representations are all much better forms of assessment that show what children know without trivializing what they don't know.

At most schools, gifted classes are assigned on the basis of seniority. Considering that TFA teachers usually lack that seniority, the results are all the more significant, since they are actually, on average, working with a tougher set of students.

Meidl, there are some skills (particularly math and English mechanics skills) that really cannot be tested as effectively by projects as by standardized tests. While these should not be the only tests used, they really are the only objective test we have. It's nice to talk about higher-level thought processes tested by projects, presentations, etc., but honestly, most of the kids TFA teachers teach first need to learn how to read, write, and do basic arithmetic.

Steve does a great job explaining the study. Perhaps, however, there should be some rule that prior to commenting on an article about a research study, the commenter needs to READ the study (not just the article). Otherwise we end up with these types of knee-jerk reactions and soap-box comments. For once, it would be nice to read a comment that was informed and thoughtful about the features of the research. Meidl, it is fine to ask about the measures used in the study, but you provided NO support for your view and certainly didn't provide any evidence that the current measures were tainted. Bernard, before you post another comment, you may want to take a basic course in research and stats. BTW, Bernard, not only do Harvard and Stanford produce very few classroom teachers, neither university tracks graduates to determine how well they do with real kids in real classrooms. In fact, the sad truth is that most ed schools do not track their graduates' effects on K-12 classroom performance. Before you sling arrows, perhaps you should start by demanding "effects" reporting from ALL teacher prep programs.

So this study doesn't mean anything.

It is of as little value as the Hoxby, or as I like to call it, the Hoaxby, Report.

Dora Taylor
http://seattle-ed.blogspot.com/

Dora, did you read the study? Of course it means something. Perhaps just not what you want it to mean. The study shows that more student growth in L.A. can be linked to TFA teachers when compared to teachers of similar experience levels or veteran teachers. Also, by personalizing your critique of Hoxby's research, you are doing yourself a huge disservice. It makes you seem small-minded and injures your credibility. I am sure that you wouldn't teach your students to use name-calling as a strategy for thoughtful criticism. As educators, we need to rise to the challenge of informed discourse--and not take the easy road of discussion like the Sunday morning talk shows. Do not let your ideology overwhelm the evidence.

John Clement has an interesting idea re: using reforms that achieve a larger effect size. Russ Whitehurst made the same type of argument in a recent Brookings paper: Don't Forget the Curriculum at http://www.brookings.edu/brown.aspx

The fact that there has been success with TFA teachers should be considered good news all around. The methodology should be studied and taken apart to find out exactly what is making this program work. With the state of our education, petty competition and skepticism must be pushed aside in order to find real solutions to better the educational system. No matter what!

It is a great achievement for 'Teach For America' teachers. But the results also depend upon the section of the students picked by the researchers. I think this study is most useful for the other institutions providing training to teachers. Regular updating of teaching practice is a must, so teacher training should be a continuous process.
