UPDATED: Intensive Teacher Mentoring Not Showing Effects, Report Finds
Some bad news for supporters of intensive, or "high quality," teacher induction: According to a new Institute of Education Sciences report released this afternoon, teachers who received two years of these services were no more likely to boost student achievement or to stay in the profession than teachers who received less-intensive forms of mentoring.
If that sounds vaguely familiar, it's because this report offers the second year of findings from a three-year study. I wrote about the year-one effects here. The findings are notable because of the study's "gold standard" research design, which involves a set of "treatment" and "comparison" schools.
Intensive-mentoring programs are typically more comprehensive and structured than the more informal "buddy systems" that are widespread in America's schools. Mentors in the program are also more carefully screened and assigned to novices. The two most widely known models are those run by the New Teacher Center, in Santa Cruz, Calif., and by the Princeton, N.J.-based Educational Testing Service, which were used in the schools studied here.
The year-two study compared a subset of schools in about seven school districts that received a second year of intensive mentoring with a pool of schools that received the regular district-sponsored mentoring programs. The findings were similar to those in year one: Teachers receiving the intensive mentoring were more likely than those in the control group to report that they were assigned a mentor, and they spent more time overall in mentoring activities.
But the additional mentoring just doesn't seem to be translating into better student reading and math scores or teacher-retention rates. It also doesn't seem to be affecting the type of teacher who chooses to stay or leave the profession.
I'm no research expert, but I'd say it's fairly common for one year of a treatment not to have an effect; big, complicated programs like these intensive-mentoring initiatives can take a while to put into place and to iron out all the kinks. After two years, though, one does wonder where the disconnect lies.
I'll update this blog item shortly as more reaction pours in from the field. Stay tuned.
UPDATE 1: The New Teacher Center has a release out on the study. It says that the design of the study didn't permit schools to replicate its induction model fully. For instance, the mentor-selection and supervision process was not conducted using the NTC's protocols. Also, the mentoring provided in the "comparison" groups was better than originally thought, so the schools receiving the intensive mentoring may not have had a strong enough "dosage" to differ from the comparison schools. These issues, NTC contends, may have led to the "no effects" results.
"In sum, we recognize that the Mathematica study was an experiment, not an induction program. We believe that it may not reflect the significant outcomes that can be achieved when districts have the time, capacity and willingness to focus on an in-depth, universal implementation of all elements of high-quality induction," New Teacher Center officials wrote.
You can read more about NTC's general issues with the study here.
UPDATE 2: Also, to clarify the first paragraph in this blog item, Mathematica Policy Research conducted the study for IES.