
The Art of Reporting Common-Core Test Results

By Catherine Gewertz — June 26, 2015

It’s that joyous time of year again, when states begin unveiling their test scores. Among the first is Tennessee, which reported yesterday that while its students are making nice gains in math and science, they’re losing ground in reading.

Of course, as Chalkbeat pointed out in a preview story, Tennessee’s test doesn’t exactly reflect its standards. It adopted the common core, but balked at using PARCC to test students’ mastery of it. It chose instead to stick with its old test, the TCAP, weeding out some questions that were really out of line with the new standards, but not otherwise revamping it.

That misalignment is something politicians typically point to when they’re trying to explain bad scores. How will Tennessee leaders explain both weak performance and strong performance on a misaligned test? That’s just one of the many challenges in the test-reporting season that lies ahead.

You hardly need us to tell you that this common-core test-results season will be an interesting one. This summer and fall mark the first time we’ll see state scores on the PARCC and Smarter Balanced tests. As we’ve shown you in our reporting, 28 states and the District of Columbia will report scores from one of those federally funded tests. All the other states are using a variety of other tests, homegrown or bought off the shelf.

Regardless of which test they’re using, however, all but seven states are reporting results of tests that claim to reflect the Common Core State Standards. How well any of those tests really do reflect the standards is a largely unanswered question.

This is just one of the challenges that state officials will be juggling this year as they report scores from common-core assessments. Even without the alignment issue, there’s the trendline issue: most states gave new tests this year, which makes year-to-year comparisons impossible. Unavoidably, though, people will want to make those comparisons anyway. And while they might tell you something about the relative rigor of each test, they won’t tell you much about students’ progress over time.

So what kinds of things can you expect to hear from state officials in the coming months? You can get a preview by watching a webinar sponsored by the Council of Chief State School Officers last month. In the one-hour session, public information officers from Kentucky and New York State shared their experiences reporting common-core test scores in 2014. [UPDATED: After publication of this blog post, the CCSSO deactivated the link to the webinar.]

As you might recall, those two states had polar opposite experiences: Kentucky’s scores predictably dropped, but its outreach campaign and statewide partnerships minimized any major blowback. When New York’s scores dropped, on the other hand, all hell broke loose.

The Kentucky education department’s director of communications, Rebecca Blessing, called the state’s tale “navigating the rough waters of common core without going under.”

She credited the calm waters to strong messaging in the years between 2010, when the state was the first to adopt the standards, and 2012, when the first test scores based on them were released. State officials worked with partners such as the Prichard Committee and the Chamber of Commerce to reach out to lawmakers, educators, families, and the business sector, arguing for the common core as a key vehicle for strengthening the workforce and reducing college remediation rates, Blessing said. They chose to focus more intently on student readiness for work or college than on student proficiency, she said. Another key decision: the state didn’t tie student test scores to teacher evaluations as soon as the new tests, developed by Pearson, were given.

How bad were the Kentucky score drops that all of this lead-up work managed to contain? Here’s the chart Blessing presented, showing the score drops the state projected, and the ones that actually occurred:

[Chart: Kentucky’s projected vs. actual score drops]

In the lead-up to the score release, Kentucky officials repeatedly warned that fewer students would score proficient on the new tests, and emphasized that the 2012 results couldn’t be compared to the old ones since the new tests were based on different standards, Blessing said.

Lisa Gross, who was the department’s chief of communications at the time, said during the webinar that the state’s partners backed the department in its bid to defend the new standards and tests, because they understood and agreed with the argument “that we weren’t doing [students] any favors to give them good grades in high school and [then] find out they’re not prepared for college.”

New York’s test score release, on the other hand, could be considered “a cautionary tale,” said Tom Dunn, the state department’s director of communications. Like Kentucky, New York officials made their common-core case for many months, arguing that higher standards and tougher tests were necessary to ensure that students were ready for college and good jobs. But the 2013 tests, designed by Pearson for the common core, were given before the state’s common-core-aligned curriculum was fully rolled out, and teacher evaluations were tied to those scores. That sequence prompted a firestorm of opposition from the state’s biggest teachers’ union, Dunn said.

The department has struggled to shift public opinion as elementary school proficiency rates plummeted from the mid-80-percent range to the low 30s, and as low as “single-digit proficiency” in some urban districts, Dunn said. One of the department’s strategies has been to couch the drops in terms of a score decrease, rather than a proficiency drop, and to frame the road ahead as a ladder that must be climbed to reflect student readiness honestly, he said.

New York leaders decided to prepare the public by warning that scores would drop, Dunn said. Officials stressed that it would be a drop in scores, not in performance, and “that 84 percent proficiency is not real,” he said.

But looking back, he advises against that strategy, one many states have been using for months. Instead of making the public more ready to accept the changes, it backfired: people accused the department of “jiggering” the results to back up its dire predictions, Dunn said. As a result, “I’d suggest you keep your predictions quiet,” he said.

He advised states, when discussing the coming score drop, to avoid giving numbers. “Don’t give people a target to look for,” he said. “That can complicate your lives.”

Parents contributed to the backlash, too, when they “realized their kids would do worse, not just kids in general,” said Dennis Tompkins, the department’s chief of external affairs. “The problem,” said Jonathan Burman, another department spokesman, “is that some people heard the message, but until they actually got their kids’ scores and saw the precipitous drop,” the real impact didn’t sink in. “I don’t know how you prepare” parents for their own children’s scores, he said. “When you get that envelope... it’s jarring.”

In Kentucky, the state and its partners managed to keep the waters calm with their messaging. “We said scores are going to be lower, and don’t panic, it’s the short-term sacrifice you make for the long-term gain,” Blessing said. “That was the message we used with parents across the board.”



A version of this news article first appeared in the Curriculum Matters blog.