
How Did SIG Schools Do on Graduation, Attendance, and Other Factors?

By Alyson Klein — June 25, 2013

Back in 2009, the Obama administration and Congress gambled $3 billion on a big nationwide effort to turn around the lowest-performing schools. The grants—as much as $500,000 per school per year—were supposed to be used over three years. That means the school year that just concluded is the last for the supercharged-by-the-stimulus School Improvement Grant program. SIG continues to get about $530 million through the regular appropriations process. But it has big enemies in both political parties who see the program as too prescriptive. SIG schools must put in place one of four improvement models, which include dramatic steps such as closing down a school or getting rid of half its staff.

So did these schools—considered the very worst in the country—actually get any better? It’s been three years—and we still don’t really know for sure.

The U.S. Department of Education has already released some analysis of first-year test scores (read all about it here) that essentially showed SIG appeared to help in some schools, while others actually got worse. Then, in January, the department put out a big data dump that included the first-year data from SIG schools—plus data from nearly every school in the country. The information is very useful, but you have to be the ultimate Excel geek to get at it.

And test-score data is only part of the picture anyway. Secretary of Education Arne Duncan made it clear during the first year of the program that “leading indicators” (like graduation rates, discipline data, and teacher and student attendance) might say just as much as test scores about whether a school is on a path to improvement.

So now the department has put out data for the very first year of the SIG program. Keep in mind: This was the 2010-11 school year. SIG has been operating in schools for two years since then. And there’s no comparative data for 2009-10 (the department didn’t ask schools to collect it, since there was no SIG program). There’s no national comparative data on most of these indicators, either—it’s just not the type of data that schools usually have to report to the department.

Still, the data gives some picture of what SIG schools are up against. (That’s the department’s take, anyway.) For instance, they had a collective national graduation rate of 62 percent back in 2010-11. (That’s compared, roughly, to 78 percent nationally in 2009-10.) So these are, obviously, challenging schools.

Just 18 percent of students in SIG schools were enrolled in advanced or college-preparatory courses that year. The average student attendance in SIG schools was 90 percent, while the average teacher attendance was 94 percent. Most of these aren’t figures that the department tracks nationally—and it does not have national data for the same school year—so it’s really tough to put the SIG data in context.

The most popular SIG model by far—the transformation model—asked schools to add extra time to their day. SIG schools averaged 74,680 minutes per school year in 2010-11 (though we don’t yet know what to compare that to).
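For a rough sense of scale: assuming a typical 180-day school year (the department’s figures don’t specify, so that’s my assumption), 74,680 minutes works out to roughly 415 minutes, or just under seven hours, of school per day.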

The bottom line: This data will be a whole lot more interesting when we get another school year to compare it to. The department says that’s coming soon.

Something really important to keep in mind: SIG schools (and other struggling schools) are in a unique position as states transition to the Common Core State Standards and assessments. In some states, the changeover has meant brand-new standards and tests and a whole new baseline for achievement and growth. Check out the experience of one school in Louisville, Ky., that my colleague Lesli Maxwell and I have been following here.
