ED Autopsy: How School Data Goes Wrong

By Sarah D. Sparks — July 16, 2012

Washington, D.C.

In May, the popular annual U.S. News & World Report “Best High Schools” ranking suffered several high-profile inaccuracies, caused in part by faulty school information in the federal Common Core of Data. At the federal STATS-DC conference here last week, sponsored by the National Center for Education Statistics, federal, state, district, and media data experts diagnosed a “perfect storm” of tight data deadlines, tighter budgets, and failsafe systems that all failed to catch the errors.

As my colleague Christina Samuels reported, the annual high school report includes descriptive information from the Common Core of Data, such as school size, poverty level based on free- and reduced-price-lunch participation, student-teacher ratios, and other characteristics. U.S. News reporter Robert J. Morse said these data are analyzed along with information from the College Board and International Baccalaureate programs about the number of students taking and passing advanced coursework.

U.S. News started to remove schools from the list as errors came to light; in total, 17 schools have been incorrectly awarded gold, silver or bronze rankings, about a third of 1 percent of the 21,776 public high schools reviewed in 49 states and Washington, D.C., Mr. Morse said. (Nebraska was not included.)

Marilyn Seastrom, acting deputy commissioner and chief statistician for NCES, said the errors were not part of states’ original data reported to the Education Department, but were introduced when Nevada had to resubmit data from all of its districts in order to correct a separate error. These supplemental data reports came in very close to the reporting deadline, according to Julian Montoya, Nevada’s state EdFacts coordinator, and were not edited in the same way as the original submissions.

“I call this a perfect storm,” Seastrom said. “We have a system of multiple checks and balances, and all managed to fail simultaneously.”

Ironically, the check that finally caught the problem came from a class of students at Green Valley High School in Henderson, Nev., which was inaccurately ranked No. 13 on the list. The students noted that the school’s student-teacher ratio was listed as 4:1, leagues below the school’s actual ratio of 24:1, and that its Advanced Placement test passage rate was listed as 100 percent rather than the actual 64 percent. The students and their teacher alerted the Las Vegas Sun, which called Seastrom.

Marie Stetser, the program director for NCES’ Common Core of Data, said NCES has made several changes to the way it edits data reported by the states in order to catch more potential errors:

• It will average the variation in school figures over four years and compare both the year-to-year variation and the variation between the current year and the three prior years.
• It will look for unusual swings in district data, such as a student-teacher ratio three times higher than the average, or student-population growth 10 times larger than in the comparison years (a rough sketch of such checks follows this list).
• Potential errors will be sent back to states on a reporting form that makes it easier for states to view potential errors in individual districts and correct them, rather than resubmitting the entire state’s data and risking the introduction of new errors. This might have prevented the errors that entered the Common Core of Data used by U.S. News.
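
The first two checks amount to simple outlier screens on year-over-year reporting. The Python sketch below shows one way such screens might work; the function name, thresholds, and data layout are illustrative assumptions for this article, not NCES’ actual edit system.

```python
# A minimal sketch of the kinds of year-over-year edit checks described
# above. Field names, thresholds, and data layout are hypothetical.
from statistics import mean

def flag_district(history, current, ratio_limit=3.0, growth_limit=10.0):
    """Flag suspicious swings in a district's reported figures.

    history -- dicts for the three prior years, oldest first,
               e.g. {"ratio": 24.1, "enrollment": 2900}
    current -- the current year's figures in the same shape
    Returns human-readable flags for manual review, rather than
    silently rejecting the submission.
    """
    flags = []

    # Student-teacher ratio far above or below the multi-year average,
    # mirroring the "three times higher than the average" check.
    avg_ratio = mean(year["ratio"] for year in history)
    if not avg_ratio / ratio_limit <= current["ratio"] <= avg_ratio * ratio_limit:
        flags.append(f"ratio {current['ratio']:.1f} vs. {avg_ratio:.1f} average")

    # Enrollment change far beyond the typical year-to-year change,
    # mirroring the "10 times larger" growth check.
    changes = [abs(b["enrollment"] - a["enrollment"])
               for a, b in zip(history, history[1:])]
    typical = mean(changes) if changes else 0
    latest = abs(current["enrollment"] - history[-1]["enrollment"])
    if typical and latest > growth_limit * typical:
        flags.append(f"enrollment change {latest} vs. {typical:.0f} typical")

    return flags

# Example: an erroneous 4:1 ratio against a history near 24:1, like
# Green Valley's, would trip the first check.
print(flag_district(
    history=[{"ratio": 24.3, "enrollment": 2850},
             {"ratio": 23.8, "enrollment": 2900},
             {"ratio": 24.1, "enrollment": 2920}],
    current={"ratio": 4.0, "enrollment": 2930},
))
```

Note that a screen like this only surfaces candidates for human review; a real edit system would still need staff, or a district-level correction form like the one described above, to decide whether a flagged value is an error or a genuine change.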

The new data reviews are also highlighting areas with unusual school data, such as districts with virtual schools, Stetser said. “This concept of virtual schools is really bringing something new in, and we’re going to have to figure out what’s normal,” she said.

For the 2013 ranking, based on 2010-11 data being analyzed now, Morse said the magazine will implement additional data analyses to identify outliers, like Green Valley’s suspiciously low student-teacher ratio.

Tolani Adeboye, the deputy director of state and federal evaluation for the New York City Public Schools, said that, although her district did not have any incorrectly ranked schools, all districts can take a lesson from the situation about the need to take responsibility for their own data.

“This was a great learning moment for New York City. I think there’s a sense out there in [local education agency]-land that these data aren’t very meaningful once they leave their state departments,” she said, “and in fact people are making very important data requests to the federal government. Data is valuable to the extent it’s accurate and well understood and well interpreted. So we’re very vested in this conversation.”

A version of this news article first appeared in the Inside School Research blog.