
Department of Education Responds with More Information about Turnarounds

By Anthony Cody — March 31, 2012

Follow me on Twitter at @AnthonyCody

On Tuesday, I posted a blog titled “Spinning the Numbers on Turnarounds: School Improvement Grant Controversy Brews.” In it I questioned the very limited information Secretary Duncan had released the previous week, when he claimed positive results for the program. I even went so far as to question what the numbers could mean. I had written to the Department of Education asking for clarification on Monday, but had not received any reply. Thursday, an entry was posted at the Department of Education’s Homeroom blog, which contained further information that clarified the issue. I immediately updated Tuesday’s post to include this. Yesterday, I received a formal response, with the request that I share it with my readers. What follows is that response.

From the Department of Education’s Jason Snyder:
We had started to gather the data to answer your follow-up questions about the preliminary, year-one achievement data in the School Improvements Grants (SIG) program when you posted your blog item “Spinning the Numbers on Turnarounds.”

As it turns out, your speculation about whether SIG schools “are really improving at all--or is this just a game of spinning numbers to try to make appear better than they are” is unfounded. Our preliminary achievement data from the first year of the SIG program show both that SIG schools to date are far more likely to have made double-digit gains in proficiency in math or reading than to have had double-digit declines, and they are far more likely to have had an increase in math or reading proficiency than to have had a decline. Your suggestion that “SIG schools are improving at a pace less than random chance would provide” is incorrect.

I have provided more details on the preliminary data cited by Secretary Duncan in a blog post on the Department’s website. But to briefly recap, about 830 schools nationwide were included in the first cohort of SIG grants. Forty-three states have now reported preliminary achievement data in the first year of the SIG program for about 700 of those schools.

The preliminary achievement data for year one, while still not complete, break down as follows:

• In math, more than 25 percent of SIG schools reported double-digit gains in proficiency, compared to 7 percent of schools that reported double-digit losses. In other words, SIG schools were more than three times as likely to report a double-digit gain in math as to report a double-digit decline in proficiency.
• The picture is similar in reading. Close to 20 percent of schools made double-digit gains in year one, more than double the 8.5 percent of schools that had double-digit declines.

Given the difficulty of school turnaround efforts, few anticipated that a substantial number of schools would in fact make advances in achievement in the first year. But across the vast majority of SIG schools, gains in math or reading proficiency in the first year of the program were far more common than declines. The preliminary data indicate:

• In 63 percent of SIG schools, math proficiency increased, compared to 33 percent of schools where math proficiency declined--meaning that increases in math proficiency were almost twice as common as declines.
• In 58 percent of SIG schools, reading proficiency increased, compared to 35 percent of schools where reading proficiency declined.

As these apples-to-apples comparisons demonstrate, SIG schools are not improving at a pace less than random chance would provide.

In addition, I want to quickly address your questions about Secretary Duncan’s statement that “in roughly 60 percent of SIG schools, the percent of students who were proficient in math or reading went up in the first year of the program.”

Secretary Duncan’s summary of the preliminary data may have been misinterpreted, inadvertently understating the extent of preliminary improvements in student achievement in year one of SIG. In using the 60 percent figure, he referred to the rough average of the percent of schools that had an increase in math proficiency (63 percent) and the percent of schools that had an increase in reading proficiency (58 percent). In fact, the percent of all SIG schools that had increases in either math OR reading proficiency in year one is much higher than you posited--it is 77 percent.

Our preliminary data also indicate that half of all SIG schools saw gains in both reading and math proficiency in year one--and that less than 20 percent of schools experienced declines in both subjects.

I also want to correct one erroneous assumption that you make in your blog about the requirements of the SIG program. You noted that in my presentation to the Education Writers Association, I said it was important to have humility about the very real challenges to successful school turnarounds. But you then added that “humility is hard to detect in a program that demands that teachers be fired or schools closed as a condition of receiving scarce funding.”

That description of the SIG program is incorrect. The SIG program does not require that teachers be fired or that schools be closed as a condition for funding.

It provides four options for school turnarounds. School systems and their communities select the option and tailor the interventions to their local needs.

Their common objective is to transform a struggling school into a place where teachers and students want to go to teach and learn. Some decided that required teacher replacement, but most did not.

As I reported at the Education Writers Association meeting, the most common SIG model, transformation, does not require the replacement of teaching staff--and the transformation model is in fact being used by three out of four SIG schools nationwide. The least-used SIG model is closing a school. Nationwide, just two percent of SIG schools are using the school closure model.

I want to reiterate that both my blog post and the Secretary’s remarks make clear that we are talking about very preliminary achievement data--and that we have a long way to go to determine the ultimate success of the SIG program.

We don’t have all the answers, and we don’t pretend to. But tens of thousands of educators and principals are now taking on the challenge of improving teaching and learning in our nation’s lowest-performing schools.

I was a social studies teacher in a high-poverty school. Now, I am frequently on the road, visiting schools taking on the tough work of school turnaround. I see firsthand that the challenges are great. But I also see the tremendous hope and sense of possibility that the SIG program is bringing to many schools where hope and possibility have long been in short supply.

I don’t question the sincerity or commitment to education of SIG’s skeptics. We know that this demanding work cannot be done alone--and that it is incumbent upon all of us to learn from one another about what’s working and what’s not.

Given your expressed commitment to honest reporting and open dialogue, and your concern that numbers not be “spun in ways that make the truth hard to see,” I hope you will seek to correct the errors in your post.

--Jason Snyder, a Deputy Assistant Secretary at the U.S. Department of Education, oversees the implementation of the SIG program.

Readers, what do you think?

The opinions expressed in Living in Dialogue are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.