New SIG Data Serve Up Same Old Conclusion: Mixed Results
The federal School Improvement Grant program—one of two legacy-defining initiatives that departing U.S. Secretary of Education Arne Duncan is planning to speak about in Boston later Thursday—has been highly controversial and yielded mixed results. And new data from the U.S. Department of Education released Thursday morning continues to paint an uneven picture of the program's impact, just as Congress is about to decide its fate.
Only a little more than half of the schools that received a third round of the newly revamped SIG grants—awarded during the 2012-13 school year—improved, while the other half saw stagnant student achievement, or actually slid backward.
That's not as strong a showing as the first two years of the Obama administration's revamped SIG program, which witnessed gains on state math and reading tests among about two-thirds of the schools that got three-year turnaround grants beginning in the 2010-11 school year, as well as those that started the turnaround process in the 2011-12 school year.
Still, the latest results from SIG schools compare favorably with those from other public schools nationally over the same time period. About 54 percent of SIG schools that got grants in the 2012-13 school year saw gains in their first year of turnaround, compared with about 45 percent of all schools across the country. And about 46 percent of SIG schools stayed in the same place or slid backward, compared with about 56 percent nationally.
Duncan has called SIG's past progress "incremental." And in Thursday's speech, he planned to say, "Let me be totally honest, we haven't gotten as far as I, or anyone, had hoped. But there's been vitally important progress." (Duncan also addressed the administration's signature Race to the Top grant program. More on that program from Andrew here.)
Unpacking the data
Experts are divided on whether SIG, which funded turnarounds at about 1,500 schools across the country, has helped or not. But nearly everyone who has studied the program points to big limitations in the department's data. States, districts, and schools need far more specific information from the feds about what worked for turnaround schools and what didn't in order to deliver a final verdict on SIG's effectiveness—and truly learn from the program's successes and missteps, they say.
Still, for the first time, the department details how SIG schools compared to all public schools nationally. And generally speaking, SIG schools were more likely to see double-digit gains in reading and math than other schools.
That's to be expected, said Robin Lake, the director of the Center for Reinventing Public Education at the University of Washington—low-performing schools are more likely to see big jumps than other schools, because, essentially, they have nowhere to go but up. (And in the most recent year in the report, SIG schools were also more likely to see big losses than other schools.)
Notably, the analysis excluded a large percentage of schools from the math and reading comparisons, including roughly half of the first two cohorts of SIG schools, as well as roughly half of schools nationally. (The exclusions were in the 20 percent to 30 percent range for the third cohort of schools, those that got grants in 2012-13.) That's because so many states switched tests or standards between 2009-10 and 2012-13 that a true apples-to-apples comparison would be nearly impossible.
What's SIG? Some quick background: The School Improvement Grant program, aimed at helping states fix long-foundering schools, got a huge infusion of cash, about $3 billion, in the American Recovery and Reinvestment Act of 2009, aka the economic stimulus. But with that money came strings. Schools had to try out dramatic turnaround strategies, such as closing down, turning into charters, getting rid of half the staff and the principal, or replacing leadership and trying out merit pay.
Even after the stimulus funds dissipated, the administration continued to fund SIG, to the tune of more than $500 million a year. But, in a 2013 spending bill, Congress made big changes to the program, allowing states to ditch the federally mandated models and try an evidence-based approach, or come up with their own turnaround prescriptions.
And SIG's future is uncertain—a bill in the U.S. House of Representatives to rewrite the Elementary and Secondary Education Act would get rid of the program altogether. So would a House budget bill. The Senate's ESEA legislation includes resources for the lowest-performing schools, but it looks different from SIG. (More here).
The new data doesn't seem likely to help SIG's case with GOP lawmakers.
"This is further evidence that the idea of Washington trying to fix 100,000 public schools by telling states and local school districts exactly how to fix their struggling schools is deeply flawed," a Senate GOP aide said. "The national school board approach needs to be mothballed and the president should work with Congress to pass legislation to restore responsibility to state and local school districts for determining what to do about struggling schools."
Even without the exclusions, there's a slew of questions that aren't answered in the report. Although the data explains how SIG schools did compared to other, non-low-performing schools, that's not as helpful as knowing whether SIG schools made a lot more progress than similar schools in the same district that didn't get the grants, Lake said.
What's more, the data doesn't tell much about whether the gains have been sustained after the money—typically somewhere between $1 million and $2 million per school—dissipated. Were the schools able to keep improving without big federal cash? And what did the results for the past two years of SIG schools look like—in other words, how is the program maturing?
Andy Smarick, a partner at Bellwether Education who served in the U.S. Department of Education under President George W. Bush and later served in New Jersey's state education agency, thinks that even with those unknowns, the results don't look great for the program.
"The best thing we can say is that $7 billion in SIG spending seems to have coincided with a 2 percentage point annual increase in reading proficiency in SIG schools," Smarick said in an email.
But Diane Stark Rentner, the deputy director of the Center on Education Policy, a research and advocacy organization that has looked deeply into the SIG program, had a different take. These schools, she said, often serve high-poverty populations and face the intractable social problems that come with poverty.
"It's sort of simplistic to think that if you just fix the schools that you're going to fix everything else," she said. "You can have great teachers and a great principal and you're going to add value, but you're not going to solve every problem."
Still, she agreed that more study of what actually went on at the schools—not just these topline numbers—would be valuable: "There's a whole lot more that we can learn from [SIG], both the good and the bad," she said.
And Keith Look, who served as the principal of the Academy @ Shawnee in Louisville, Ky., which got one of the first SIG grants, noted that you can't forget about school districts and the role they play.
Turning around a school "is hard work," said Look, who is now the superintendent of the Danville Area School District, also in Kentucky. "Most of the schools that are struggling the most have been struggling the most for a long period of time. Maybe we have to start asking a different question, which is: What role do these schools play in the overall functioning of a district?"
Districts, he said, sometimes "rely upon [having some low-performers] for [their] own sense of equilibrium." (Read about the ups and downs of Shawnee's SIG experience in this series.)
Some other tidbits from the analysis: SIG schools were more likely to see increased graduation rates than other schools. That was particularly true for the first cohort of schools, those that started in the 2010-11 school year. And most SIG schools were able to increase learning time, including by extending the school day or year. (The report doesn't delve into whether that additional time was sustained after the money went away.)
This isn't the final word on SIG. The Education Department's Institute for Education Sciences is working on a more comprehensive report, due out next year.