
New SIG Analysis Yields Same Old Conclusion: Mixed Results

A newly revamped analysis of the Obama administration's controversial and costly School Improvement Grant program continues to show that billions in federal money—plus major federal strings—equals a mixed track record when it comes to one of the toughest challenges in education policy: turning around perennially foundering schools.

Just as in the analysis released by the U.S. Department of Education in November, roughly two-thirds of schools that participated in the program showed gains in the first year, while another third slid backwards.

If that analysis sounds familiar, that's because it closely mirrors SIG data that was previously put out by the department—and then promptly pulled back after department officials realized its contractor, the American Institutes for Research, or AIR, had erroneously excluded too many schools from the mix. AIR crunched the student outcome data for the SIG program a second time, but the results didn't change substantially.

And, like the original analysis, the revamped analysis showed that schools in the program's first cohort—those that started in the 2010-11 school year—made greater progress overall than schools in the second cohort—those that started the program in the 2011-12 school year. The revamped data, like the original data, showed that schools in small towns and rural areas are generally outpacing their urban and suburban counterparts, especially in math.

Perhaps the biggest change came in the overall averages. In the revamped data, schools in the program's first cohort showed about a 7 percentage point improvement in math, compared with 8 percentage points in the original analysis. And schools in the second cohort improved by an average of 1 percentage point in the revamped data, as opposed to 2 percentage points in the original data.

Meanwhile, in reading, Cohort 1 schools went up about 3 percentage points, on average, while Cohort 2 schools went up about 2 percentage points. Under the original analysis, Cohort 1 schools looked better (improving by 5 percentage points) while Cohort 2 schools looked worse (edging up just 1 percentage point).

Check out the full data here:




Follow us on Twitter at @PoliticsK12.

