Opinion

Reading First Interim Report Doesn’t Pass the “So What? Test”

By Marc Dean Millot — May 07, 2008

My April 5, 2007 “Letter From” was titled “Department of Education Technology Study Says What Every Major Study of Broad Reform Initiatives Says: It Depends and We Don’t Know.”

I could replace “Education Technology” with “Reading First” and write the same piece about the Reading First Impact Study: Interim Report, conducted by Abt Associates, MDRC, and Westat and released by the Department of Education’s Institute of Education Sciences on May 1.

Here goes:

It is unlikely that the media will treat this report with much subtlety, but the study may tell us much more about the state of the evaluation art than the efficacy of (literacy programs purchased under Reading First).

The first point concerns the very idea of randomly assigning programs to teachers, even if they were volunteers. If we know anything from research on other large-scale program interventions, we know that teacher “buy-in” and district support are essential to implementation, and we don’t need a study to tell us that implementation is important to program results. In the real world, providers seek schools with teachers who have selected the program after reviewing their options. Random assignment is bound to result in random buy-in, and so random implementation, and random results. The meaningful test of program efficacy is as the program is intended to be offered on the market - comparing teachers who want to use the program with teachers who do not use it….

Second, “the study was designed to report results for groups of products rather than for individual products.” (Note: The Reading First Study contains no such wording, nor does it distinguish among the various educational offerings purchased with Reading First funds.) To get a sense of how useful this is to consumers or policymakers, consider a study of automobile emissions, safety, or gas mileage based on the categories “compact,” “SUV,” and “luxury.” In each category, every make and model - or rather, selected makes and models chosen by the reviewers - is treated as a homogeneous group and assigned to drivers interested in participating in the study. What exactly is the utility of the findings about the impact of these classes of car on any of these measures? Does it help policymakers make decisions about the automobile industry? Does it help consumers decide on the purchase of their next cars? No and no.

(Reading First) programs may or may not add value to student performance, but this study doesn’t tell us anything about that. Indeed, rather than shedding light, it is likely to obscure the issue. The study has value as a step in the development of appropriate methodologies for the evaluation of educational interventions promulgated on a mass scale, and the research community is better off for it. But it is no guide to policy or purchasing.


On the whole, I’m satisfied with the substitution.

In a very real sense, most of the educational programs purchased with Reading First funds and implemented by teachers in the classroom were randomly assigned. “The meaningful test of program efficacy is as the program is intended to be offered on the market - comparing teachers who want to use the program with teachers who do not use it.” Most teachers did not have a say; they implemented what the district decided - and I must add that the Administration constrained district choice to disadvantage programs like Success for All and others that do have a favorable record of evaluation.

Similarly, this study examines the gross results of a funding stream, rather than the specific effects of different educational programs on the market. It treats all offerings, at all price points, with all materials and support, etc., as a monolithic “Reading First Program.” The study helps no one trying to decide on the purchase of a literacy program eligible for Reading First funding. Indeed, it misleads consumers - and policymakers considering the potential value of Reading First funding - by suggesting that nothing works, although it indicates absolutely nothing of the sort.

Edbizbuzz readers know that I support the version of Reading First written into No Child Left Behind, but object to how Margaret Spellings and other Administration officials oversaw implementation at the White House and the Department - especially those provisions related to the meaning of Scientifically-Based Reading Research. It would be very easy for me to write that this provides evidence of the Administration’s folly. I can safely say that the study provides no support for Reading First as a federal funding program, but it doesn’t tell us anything about the efficacy of any one of the privately-developed educational programs purchased by schools with Reading First funds.

If we expect research to guide policy, we can’t simply seize on findings and decide whether they back our hopes or preferences. We can’t receive favorable findings as the Ten Commandments brought down from the mountain by Moses, or hire someone to do a detailed critique or counter-study when the findings are politically unhelpful. I am modestly disappointed that this Reading First study suffers from the same basic flaw as the Educational Technology study released last year, but for all I know the two study designs were locked in at the same time.

The Reading First Interim Report doesn’t pass the “So What? Test.” If it tells us anything, it’s that federal evaluation policy for K-12 curriculum and instructional offerings needs an overhaul.

Marc Dean Millot is the editor of School Improvement Industry Week and K-12 Leads and Youth Service Markets Report. His firm provides independent information and advisory services to business, government and research organizations in public education.

The opinions expressed in edbizbuzz are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
