Reading and Math Scores: 'Handle With Care'
Just how much do reading and math gains on state tests tell us about school quality?
This week, Stanford University's CREDO released its authoritative new study of charter school management organizations. Using its highly regarded "matching" technique in which each charter student is paired with a district-based "virtual twin," CREDO consistently provides the best, most deliberate analysis we have of reading and math performance in charter schools. So, what should we take from CREDO's findings?
Many have seized on the results in order to make sweeping claims. CREDO found that non-profit schools made much larger test gains than for-profit ones, prompting AFT president Randi Weingarten to thunder "this CREDO study confirms that for-profit charter and virtual schools serve the interests of corporations" rather than kids. Alex Hernandez of the Charter School Growth Fund celebrated: "[CREDO] reports that the 107,000 students whose schools receive support from the Charter School Growth Fund gain, on average, the equivalent of four additional months of learning in math and three additional months of learning in reading each year when compared to peers in other public schools."
Such responses have become common in the aftermath of any major study that analyzes reading and math gains (as most influential research nowadays does). Consequently, it seems useful to review the reasons why reading and math scores may go up and what that means when making sense of them. There are at least six reasons that scores may be going up:
- For a variety of reasons, students may be learning more reading and math. The tests are simply picking that up. All good.
- Students may be learning more in general. And the reading and math scores are a proxy for that. Even better.
- Instructional effort is being shifted from untested subjects and activities to the tested ones (i.e., to reading and math). Not great if we value the full breadth of the curriculum, but potentially a reasonable decision to reemphasize reading and math.
- Teachers are learning what gets tested and students are becoming increasingly acclimated to the tests.
- Schools are focused on preparing kids for tests and engaged in test preparation so that the scores improve even if students aren't learning.
- Scores are being manipulated in various ways. This can mean things as perfidious as cheating or as mundane as starting the school year earlier.
Which of these apply turns out to matter quite a lot. Some thoughtful people will dispute this. They'll say, "Whoa, Rick, you're overcomplicating things. This is a bunch of hand-wringing. You don't see people getting so angsty about using runs to do 'moneyball' analysis of baseball teams or profitability to evaluate companies."
Such complaints miss a simple distinction: Baseball is about scoring runs. If a team scores more runs than its opponent, it wins. That's the whole goal. Period. For-profit companies generally seek to maximize profits. Though, in that case—and especially at our most respected firms—the pursuit can be tempered by concerns about long-term success and attention to things like social mission and employee morale.
In schooling, of course, no one—not even testing's biggest enthusiasts—thinks that testing is the goal of schooling. At best, reading and math tests are thought to be proxies for a subset of learning. And most of us recognize that there's a lot of slippage there, even if we presume that the tests are well-designed and reliable. (If you're interested, I discuss this point at greater length in Letters.)
This all means that why test scores are going up is quite important. For instance, Weingarten may be right that for-profit status is enough to cripple a school's instructional acuity . . . or it may be that for-profits focus more on what their families value than on reading and math scores (families tend to rank test score performance pretty low among the things they value). Similarly, Hernandez's pride in the reading and math gains of CSGF schools may be well-placed . . . or it may be that CSGF selects a particular population of schools that get big test gains, that those schools know big gains are expected, and thus that they employ strategies that may produce big reading and math gains (providing a distorted picture of learning). I don't know which may be the case. I'm betting that Weingarten and Hernandez don't either. And in light of the strong claims that get made, that's a problem.
Given the massive body of influential contemporary research that rests on reading and math scores, I am always struck by how uninterested most researchers (outside of Harvard's Dan Koretz) seem to be in understanding why test scores moved. I understand what's going on, of course: researchers are trying to get funded and published, and wading into the mucky depths of this stuff will only complicate their tale, slow their stride, and make it harder to get funded or published. And econometricians have to make some simplifying assumptions if they're going to work their magic—one of which is to assume that reading and math gains are valid and reliable measures of student learning. But still.
Test score gains tell us something useful. But, until we get more insight into what's causing them, they should be stamped "Handle with care."