Value-Added Debate Heats Up in Los Angeles
Reporters for my hometown paper, the Los Angeles Times, just published a fascinating, potentially explosive story based on the use of teacher "effect" data that indicates which teachers seem to be producing the strongest gains for their students.
Using value-added student test-score-growth data obtained through a public-records request, the newspaper built a database showing how well teachers in tested grades and subjects fare at boosting student achievement. The paper will even release individual teachers' growth scores because, it writes, "they bear on the performance of public employees who provide an important public service, and in the belief that parents and the public have a right to the information."
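For readers unfamiliar with the term, here's a toy sketch of the idea behind a "value-added" estimate: compare each teacher's students' average test-score gain against the average gain overall. This is purely illustrative; the Times' actual analysis uses a far more sophisticated statistical model, and every name and number below is invented.

```python
# Toy "value-added" style estimate: a teacher's average student score gain
# relative to the overall average gain. Hypothetical data only.
from statistics import mean

# (teacher, prior-year score, current-year score) for hypothetical students
records = [
    ("Teacher A", 50, 58), ("Teacher A", 62, 71), ("Teacher A", 45, 50),
    ("Teacher B", 55, 57), ("Teacher B", 70, 69), ("Teacher B", 48, 52),
]

overall_gain = mean(cur - prior for _, prior, cur in records)

effects = {}
for teacher in {t for t, _, _ in records}:
    teacher_gains = [cur - prior for t, prior, cur in records if t == teacher]
    # positive = students gained more than the average; negative = less
    effects[teacher] = mean(teacher_gains) - overall_gain

for teacher, effect in sorted(effects.items()):
    print(f"{teacher}: {effect:+.1f} points vs. average gain")
```

Real value-added models also adjust for student demographics, classroom composition, and measurement error, which is exactly where much of the methodological dispute lies.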
The district had the ability to generate such information but never pursued it, partly fearing the reaction from the local teachers' union.
Some of the analysis's early findings cut against the conventional wisdom about teacher distribution. For example, the reporters find that the most- and least-effective teachers are not actually concentrated in the most- or least-affluent schools. That runs counter to most other measures of teacher quality; low-income students, for instance, tend to be assigned more out-of-field teachers.
Meanwhile, there's been precious little information on the distribution of effective, rather than qualified, teachers. (Mathematica Policy Research, as part of the federally funded Talent Transfer Initiative, is releasing an analysis on this topic sometime in the near future. See a recent story of mine on this issue of "equitable teacher distribution" for details.)
Other surprising findings: Qualifications had little effect overall on teachers' effectiveness, in contrast to other studies suggesting that "bundles" of certain qualifications might matter. Class size, too, was unrelated to effectiveness.
United Teachers Los Angeles is, let's just say, really, REALLY peeved about this. It's fighting back, asking members to send letters to the editor to protest the story, and I've even heard it wants members to boycott the paper and cancel subscriptions. The union argues that test scores aren't an appropriate measure of student learning and are even worse for judging teacher effectiveness, citing problems with the estimates.
Frankly, though, I wonder whether UTLA's protests will do much to change things now that the genie is out of the bottle. Parents, teachers, and students are likely to make full use of this database, and the consequences will be interesting to watch.
Way back in 2008, when I first wrote about value-added data for EdWeek—presciently, it seems— the president of the National Council on Teacher Quality told me she felt that the data, when unveiled, would create a demand for more such information. "If the public were given half a chance to learn about the power of value-added-data systems, I doubt they'd remain unengaged and tolerate depriving schools and teachers themselves of the data," she said. She may be about to be proved correct.