A few days ago, I ran a guest post by science educator Jack Hassard, "Cobb County, Georgia, Rejects Teach For America." One cogent comment came from Stuart (EdOutsider), who wrote the following:
Listen, all this slapping our own backs might be fun and good, but of all the states where TFA places its teachers, three (Tennessee, North Carolina, and Louisiana) conducted a study to determine which certification path produced the greatest collective student gains. What did it find? TFA teachers (teaching in the poorest 20% of schools) outperformed all other certification routes, including residency master's programs. Yes, on 5 weeks of training. So, on the whole, would you say we are recruiting the right applicants to the profession?
Here are the studies:
The fact that the most recent evidence was not included in this discussion, I am sure, has nothing to do with the author coming from a school of education.
I asked my prior guest, Phil Kovacs, to respond, since he had posted the reviews of research cited by Jack Hassard. Here is his response:
From Phil Kovacs:
Stuart (EdOutsider) argues that the three studies show some sort of clear evidence that TFA is solving problems that traditional programs can't. Let's take a look...
Teach For America Teachers' Contribution to Student Achievement in Louisiana in Grades 4-9, 2004-2005 to 2006-2007
This study was published by an organization (the National Council on Teacher Quality) funded by neoliberal reform organizations out to replace colleges of education. This organization is so bad that when they offered to evaluate colleges in our state, for free, ALABAMA turned them down. You can read more about issues with their work here and here.
[Note: See a response from a representative of NCTQ below.]
This study is not peer-reviewed, and given that it is now three years old, it should have been by now. So it has not been scrutinized, but the abstract tells us what we already knew: that TFA recruits are better than new teachers on these particular indicators. They are not statistically significantly better than veteran teachers, mind you, and this is important when we look at the next study.
I critiqued this in one of my earlier posts for Mr. Cody. The authors note that the rate of TFA turnover ultimately has a NEGATIVE impact on student performance because of how much better teachers with more than four years of experience do than new teachers of any certification. Perhaps that is true of both the Louisiana and Tennessee studies, but we can't know because no one looked.
Finally, the Tennessee report card might be worth something, but it is based on a value-added assessment that has been questioned by the scientific community, as I pointed out in an earlier post for Mr. Cody. It is also important to note that the Tennessee data linked to on that "research" page show that only 8% of TFA recruits are still teaching after four years (compare that to UT Knoxville's 50%). Just to remind the reader, this is taken from the report's summary: "The analysis contained within this report is not based on a comprehensive set of measures upon which the quality of teacher training programs should be ranked."
That statement is insightful, is it not?
Just for the sake of argument, let's pretend that the findings in all three reports are valid, and significantly so. I would still be hesitant to accept them as the be-all and end-all of teacher evaluation because they rely on a very limited definition, to use Stuart's words here, of "student gains."
Anyone who has taught children of any age also knows that students make gains in areas outside the subjects being tested. Social, personal, and familial gains are just as important, if not more important, than academic gains when discussing the growth and development of human beings. They are not important, ostensibly, when discussing the proficiency of future workers. How fast a child can churn out math problems has taken priority over how well she wrestles with the human condition. In the TFA summer camp, the second question is ignored because it can't be measured, and the logical positivists supporting TFA and the education revolution we're currently enduring don't care about what can't be measured.

Because TFA members are trained to have a myopic focus on standardized testing, they may in fact produce small gains on standardized tests, but those gains are, at present, the only possible positive from the program. For example, are their students ending the school year more interested in science? More resilient? More likely to graduate? We can't know. All we can know is that a) TFA members have been taught to worship data and b) that data worship may be producing some small gains in some subjects in some cities, but we can't know for sure because of the admitted noise in the instruments being used to measure what they are doing.
Maybe, in the English classroom, the TFAers are not teaching writing because it is not on the test, as has been reported in a number of classrooms. Maybe they are simply working from the state-provided workbook to raise scores and ignoring everything else that might be worth teaching, e.g. inquiry-based science (not testable) or autobiographical history, which is not immediately testable but has been suggested by scholars such as Nieto and Bode to improve student development and growth, robustly defined.
Maybe, in these math, science, history, and reading classes, the teachers don't have students who are ELL or SPED, as has been reported across the country. Maybe, as has already been rumored in my city, the students with the worst behavior are being moved out of TFA classrooms because the TFA recruits have no classroom management.
In short, are principals putting the weakest students with their best teachers, which would make a lot of sense? My hunch is yes, though we would need a study with principals who aren't afraid of losing their jobs to determine whether that is true. If the answer is yes, gains on short-term tests seem logical.
Maybe the reason TFA members burn out so quickly (data showing the retention rate is declining will be forthcoming) is that they put in 16-hour days, work through weekends, skip meals, etc., and can do so because they don't have children of their own. And the burnout is a genuine problem because, as noted in the report Stuart is parading, children who have teachers with more than four years of experience do better on standardized tests than those who do not. (Again, I hate to use these tests as barometers of human performance...)
We need to be asking a) why the gains are there, b) whether they are meaningful, and if so, c) what is going on. We need a cadre of scholars doing a, b, and c, and then we need careful action rather than uncritical parroting.
Update: Response from NCTQ:
I would like to alert you to a correction needed in your reposting of Phil Kovacs's post. Kovacs mistakenly cited NCTQ as the publisher of Study #1. We merely reviewed this study in our monthly newsletter last year (and hosted the pdf to ensure reliable access to it). The study's actual authors are listed on the first page of the pdf.
Moreover, Kovacs's characterization of NCTQ as "out to replace colleges of education" couldn't be further from the truth. Our organization stands strongly in defense of traditional teacher training programs. I invite you to read through our responses to some of the most frequently expressed concerns and questions regarding our upcoming review of teacher preparation programs.
[The dialogue regarding the need for a correction continues in the comments below. I have extended an offer to NCTQ to share more about their work in this blog.]
What do you think of the research shared by Stuart, and the response from Phil Kovacs? Is there important evidence here that should be considered?
Philip Kovacs is a tenure-track assistant professor at the University of Alabama in Huntsville.