Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.


Making Sense of the Dee-Wyckoff IMPACT Study

By Rick Hess — October 18, 2013

Stanford’s Tom Dee and UVA’s Jim Wyckoff have just published an important study on Washington, D.C.'s controversial teacher evaluation system. They find that the IMPACT system (launched during Michelle Rhee’s tenure as chancellor) appears to boost teacher effectiveness and also makes it more likely that low-performing teachers will depart. Per usual, Dee, the king of regression discontinuity, has found a clever research stratagem that lets the researchers push past descriptive data and correlational findings. (Full disclosure: Dee reminded me the other day that I played a role in connecting the authors with DCPS.) The study, published as a National Bureau of Economic Research working paper, notes that IMPACT appears to aid students both by “avoiding the career-long retention of the lowest-performing teachers and through broad increases in teacher performance.”

The study is thoughtful, careful, and an important reminder that the whole point of rethinking teacher evaluation, pay, tenure, and such is to help schools and systems do a better job supporting and retaining terrific teachers, attracting talent, and getting lousy educators to improve or find new work. The results have yet to be peer reviewed and ought to be treated as preliminary, but they are promising.

Unfortunately, in the giddy chest-thumping to which would-be reformers have shown themselves all too susceptible, it can be easy to overlook some of the factors that help qualify the broader significance of the findings. Before pointing to Dee-Wyckoff as proof that aggressive teacher evaluation “works,” would-be imitators need to take a good look at what DCPS has actually done. For starters, I’d point to at least a half-dozen things that distinguish IMPACT from many other teacher evaluation efforts.

1] IMPACT is a program, not a statute: My biggest concern is that casual readers may regard the findings as speaking to state legislation regarding teacher evaluation. It’s crucial to keep in mind that IMPACT is a program and not a statute. This means that DCPS has been able to readily and repeatedly tweak the system, year over year (and even sometimes during the year). The evaluation model is not set in stone and needn’t apply to a slew of districts with various needs; it is designed for DCPS.

2] DCPS didn’t have to negotiate the whole thing: Michelle Rhee didn’t have to negotiate all the ins and outs of IMPACT. The DC chancellor has the authority to evaluate teachers, and this meant that it was possible to devise a system with sharp edges. It also meant that the system could be readily modified and improved with formal (and informal) teacher input, without requiring drawn-out new negotiations.

3] Talent: Over time, DCPS has recruited and assembled a remarkable, large, and sustained team to design and implement IMPACT. The district became a magnet for talent on this work precisely because it was a national leader. They’ve tapped top-shelf advisors, worked assiduously to address educator concerns, and taken pains to explain the system clearly and accessibly to educators. This has been a big expense for a district of just 45,000 students--and I’ve not seen a lot of other districts make a similarly large and sustained commitment on this score.

4] The technology and personnel systems to make IMPACT work: The data systems in place in DCPS were inadequate for IMPACT’s needs. The district essentially had to build a parallel personnel data system in order to handle IMPACT, a substantial investment that few districts are able or willing to make. Indeed, I’ve been in other districts that lack even the basic information technology needed to make an IMPACT-style system work. In more than a few cases, fixing that may require more than some TLC--it may require installing a whole new (expensive) management information system.

5] DCPS did IMPACT seriously: Everything about IMPACT is ambitious. The dollars for exceptional performance are huge, relative to what most districts have dabbled with. The consequences for persistent low performance are substantial. Past experience gives reason to doubt that much milder versions of the system will deliver the same jolt. Meanwhile, the program design is unusually sensible and coherent. As Dee explained, “D.C. is fielding incentives that are just very different from what we’ve seen before. Part of that is, it’s not just cash for test scores. It’s instead incentivizing things that teachers can control more directly.”

6] DCPS teachers have helped shape IMPACT: With all the attacks on Michelle Rhee in recent years, it’s easy to caricature how the work in DCPS has gone down. In truth, anything as big and stark as IMPACT was going to be contentious. But Rhee and Henderson have worked throughout to solicit input from teachers, involve teachers in the process, and make it clear that they’re open to making smart adjustments to the system. Because IMPACT isn’t framed by legislation or the contract, Rhee and Henderson have enjoyed the agility to do this effectively--and had the ability to hold the line when necessary. And, nowadays, lots of teachers in DCPS are fond of IMPACT--and the opportunities it provides them--even though it has real teeth. It’s unclear how other districts will negotiate these tensions.

I fear that any would-be reformer who reacts to the Dee-Wyckoff study with a “whoo-hoo!” and then beats the drum on teacher evaluation is missing the point. It’s a mistake to allow reassuring results to serve as a justification for half-baked efforts, one-size-fits-all teacher evaluation statutes, or inattention to the gritty work of implementation. Doing so runs the risk of encouraging inept efforts to scale a promising possibility. And it’s not like we need any more of that.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.