Rhee vs. QC: How StudentsFirst's Report Card and Quality Counts Differ
With the release this week of two different K-12 state rankings, one from Education Week and one from StudentsFirst, it's been a bonanza for people who like (or at least are compelled to write about) "report cards" on education. Naturally enough, there have been some comparisons between the two. For example, Motoko Rich, an education reporter at The New York Times, noted that both Education Week's Quality Counts and StudentsFirst gave Florida a B- grade. The notable difference there is that StudentsFirst's B- was the highest grade it gave out to any state, while Quality Counts gave better grades to five other states.
But more broadly, what are the similarities and differences between the two reports?
First, since several people have made an issue of this, it's important to note that Quality Counts includes actual student achievement data. For example, the share of a state's students scoring proficient in reading earns a corresponding number of points, on a zero-to-100 scale, on one indicator in this year's Quality Counts methodology. StudentsFirst, meanwhile, as the American Federation of Teachers has pointedly noted, does not include test scores or other student achievement data in its state grades. Despite the letter "grades," the group makes no real pretense that such scores matter to StudentsFirst, at least not for this state report card.
Even where both Quality Counts and StudentsFirst give grades based on policy, the two reports are judging states based on different policies in some cases.
Let's take a look at a few of the 24 policies that Rhee's group tracks. StudentsFirst awards GPA credit to states that "allow for mayoral and state control of academically low-performing schools and districts" or require that "public charter schools should have the first right to buy or lease excess public space, at or below market value." It also awards additional GPA points to states with parent-trigger laws.
Quality Counts doesn't factor any of those categories into its state grades. It does, however, consider things like whether a state offers incentives for teachers to work in hard-to-staff areas, as well as whether it provides a reduced workload for first-year teachers (both positives on the Quality Counts rubric). The StudentsFirst methodology doesn't indicate that the group factors those areas into states' grades.
Now, some of the grading criteria do overlap. Both Quality Counts and StudentsFirst provide credit to states, for example, if they require "student achievement" to be included in teacher evaluations. States also get credit in both reports for having alternative certification pathways for teachers. But certainly the two reports part ways over controversial policies like the parent-trigger option.
One other major difference worth pointing out is one of quantity. Quality Counts draws on roughly 100 individual "indicators" of various kinds (although not all of them are updated every year), while StudentsFirst says it graded states on 24 different policies. So the main point is that the reports may be superficially similar, but the gears that make them work grind and clank in different ways.
UPDATED: Eric Eagon, a spokesman at Policy Innovators in Education, a Minneapolis-based advocacy organization, created an Excel spreadsheet showing the difference in state rankings between Quality Counts and StudentsFirst. For old times' sake, let's look at the District of Columbia! That's the jurisdiction with the biggest gap in rankings between the two reports (41 spots). D.C. ranks fourth in StudentsFirst's report card, and 45th on Quality Counts. But there's not a titanic difference in the actual grades between the two groups. StudentsFirst gives D.C. a C+, while Quality Counts gives it a C-. And look at Arizona, which is ranked 43rd by Quality Counts and 8th by StudentsFirst, but receives the same letter grade in both, a C-.
On average, the difference in states' ranks between the two reports was 15 spots, according to Eagon's spreadsheet. In contrast to D.C., West Virginia is an example of a state Quality Counts ranks much higher than StudentsFirst (9th compared to 48th, B- compared to F).
For the sake of disclosure, PIE's board of directors includes Michael Petrilli of the oft-quoted Thomas B. Fordham Institute and Patricia Levesque, the CEO of the Foundation for Excellence in Education, which is led by former Florida Gov. Jeb Bush. So the group has some leaders with sharp ideological points of view. [Eagon has subsequently pointed out that the board also includes more progressive members, like Cynthia Brown of the Center for American Progress, so I've updated the post to correct that oversight.]