
If Math and Reading NAEP Scores Fall, Who’s to Blame?

By Liana Loewus — October 21, 2015

The newest round of math and reading scores on the National Assessment of Educational Progress, known as the “nation’s report card,” is due out early next week. We’ll see how 4th and 8th grade students performed nationally, in each state, and in 21 districts.

But apparently there’s already some buzz that the scores have dropped.

Michael Petrilli, the president of the Thomas B. Fordham Institute, wrote in a blog post today: “Rumors abound that the news is going to be bad, with scores down nationally and in a bunch of states. That will be used as fodder to attack Common Core, teacher evaluations, charter schools, or whatever else you happen not to like that’s prominent in today’s education policy conversation.”

Petrilli, who happens to like the policy efforts he’s listed above, offers an alternative explanation for the possible decrease: the economy. The last time the nation saw declines in NAEP scores was after the 1990 recession, he argues. (However, it’s worth noting that he uses NAEP’s “long-term trend” data to show this, and that’s different from the data coming out on Wednesday, which is hard to compare over time.)

“While those of us in education reform are working hard to make sure that demography does not equal destiny, we must also acknowledge the strong link between students’ socioeconomic status and their academic achievement,” Petrilli writes. “[W]hen families are hurting financially, it’s harder for students to focus on learning.”

Ironically, this is the same argument that education historian Diane Ravitch and others who oppose reform efforts like high-stakes testing and charter school expansion have been making for years: that poverty is to blame for most of the U.S. education system’s problems.

What’s more, the Great Recession ended in 2009. And the results coming out next week are from tests that students took earlier this year.

There’s no doubt that if the scores, which had been inching up recently, have now tanked, everyone will be pointing fingers. In fact, many people will surely point them directly at outgoing U.S. Secretary of Education Arne Duncan, who pushed through sweeping policy changes in a short period. (And if the rumors are true, could this even be an explanation for his early exit, after he spent many years giving the impression he would remain until the end of President Obama’s term?)

Whatever the NAEP results say—and I emphasize here that neither I nor Petrilli has seen them—the caveat about “misNAEPery” applies: Remember that it’s extremely difficult to use NAEP data to prove whether a particular policy worked or didn’t work.

As Morgan Polikoff, a University of Southern California researcher, wrote recently, “any stories that come out in the weeks after NAEP scores are released should be, at best, tentative and hypothesis-generating (as opposed to definitive and causal effect-claiming).”

A version of this news article first appeared in the Curriculum Matters blog.