California Gets D+. Does Anyone Care?
California's schools just got a D+ grade. Mostly, folks here have shrugged their shoulders at Education Week's "Quality Counts" mark, which places the state in the lowest tier. Perhaps they shrug because they are unsure how to go about changing the grade.
The almost annual EdWeek grade is fairly predictable—the state has seen D grades before—and much of the calculation is based on factors over which educators have little or no control. The education level of parents, their employment status, and even the fiscal support public schools receive aren't things that teachers and principals can influence. That said, there are some possible lessons in these data.
Overall, California got a score of 69.2, putting it in the same league as Louisiana, Idaho, Arizona, Oklahoma, New Mexico, Nevada and Mississippi. Massachusetts, New Jersey, and Maryland led with scores above 85.
The EdWeek composite score contained three elements: Chance for Success, School Finance, and Achievement. To be meaningful, all require disaggregation.
The "Chance for Success" index, which counts for a third of the total grade, includes parent income and education and whether parents are fluent in English, where California ranks 51st (last among the states and the District of Columbia). The children of these parents tend to enter school not speaking English well; indeed, California has 1.5 million English Learners, more than twice the number in Texas and five times the number in New York or Florida.
While I can understand that statistically a student's life or educational chances may be lessened by having poor, immigrant parents, I can't rationalize why these measures become part of an education quality index.
California did better on the parts of the "Chance for Success" index that schools can control. The state was 7th in Kindergarten enrollment, 15th in preschool enrollment, 22nd in high school enrollment, 19th in postsecondary participation, 22nd in high school graduation, and 28th in adult educational attainment.
Finances Don't Add Up
It's in the "School Finance" section that California deservedly looks horrible. The good news is that it is no longer last; the state clawed its way up to 46th in the rankings of per-pupil expenditures adjusted for regional cost differences. California was also 44th in state spending on education as a percentage of state taxable resources.
This difference is reflected in gaping differences in teacher-student ratios: 24:1 in California, while the other large states averaged about 15:1. (See the mega-states comparison from the National Center for Education Statistics.)
Nagging NAEP Questions
The "Achievement Index" also requires decoding. Overall, California ranks 33rd, but the state does horribly on the National Assessment of Educational Progress (NAEP), the test administered nationally that allows comparison among the states. This year it was 46th in 4th grade math, 47th in reading, and a little higher on the 8th grade tests.
There are some bright spots in the achievement data. Gains on NAEP (2003-2013) put California 2nd in the country in 8th Grade reading, 4th in 4th grade reading, 18th in 8th grade math, and 35th in 4th grade math. The state is also 8th in high test scores on Advanced Placement exams and 10th in changes in AP scores over the 2002-2012 decade.
But it's the state's continuing lag on NAEP that raises unanswered questions. It has always done poorly on this exam, which unlike the state's standardized tests does not necessarily track the curriculum. California students made substantial gains on the state's Academic Performance Index, also based largely on a standardized test. The state's goal was to have schools score over 800. When the test was first administered in 1999, only 31 percent of schools scored over 700 on the index; by 2012, 83 percent did.
Unlike the API, which was a high-stakes exam for students, teachers, schools, and districts, NAEP is a no-stakes exam: the results don't affect a student's progress toward graduation or potential college admission. It is not taken very seriously by schools or students; neither gets feedback about how well they did.
There have been various attempts to explain the disparity between results on other measures, where the state appears to be getting better, and NAEP, where it continuously lags. California classifies more students as English Learners than other states do, and includes more of those students in its tests than others do. But that alone does not explain the differences, because non-English Learners also do comparatively poorly on the test.
Meanwhile, the state's NAEP results have become highly politicized. Marshall Tuck ran for State Superintendent of Public Instruction using the phrase "46th is not good enough," and the Broad Foundation features NAEP scores prominently on its website to illustrate that "public education is in deep distress." Number-crunching economists continue to use NAEP as the standard by which to judge public education. Thus, while NAEP is a no-stakes exam for students and schools, it's a high-stakes reputational exam for the state as a whole.
As I scour the background material, I can find little useful diagnostic data that tells California educators what they could or should do to improve their performance.
After poring over the numbers for a couple of hours and doing some background reading, "Quality Counts" left me with a pair of conclusions.
First, the devil is in the details rather than in the summary score. The D+ grade doesn't mean much.
Second, California is right to be moving toward multiple-indicator dashboards and leaving its single-indicator system behind. If the useful information is in the details, it's important for journalists as well as policy makers to focus on them.