In What Works Clearinghouse Research, Does High Quality Equal Highly Useful?
How do principals and superintendents use the federal What Works Clearinghouse? Are they looking for the state of the research in an area, or direct recommendations for and against programs, or something in between?
A new report by the American Enterprise Institute raises questions about how the clearinghouse should approach presenting studies on curricula and how it should work to help school districts sift through and use research.
Writing for AEI, Alan Ginsburg and Marshall S. "Mike" Smith argue that the clearinghouse's 27 randomized controlled trials evaluating math curricula are "not particularly useful," as Smith put it in an interview. Among the critiques: Curricula that spanned several grades were not evaluated for long-term effects across those grades, and the majority of studies covered only the first year of implementation, which can be less stable than later years.
"A teacher that has experience with didactic teaching might do better with one kind of curriculum, while a teacher that is better with a project-based or child-centered approach might do better with another curriculum," Smith told me. The What Works Clearinghouse's reviews don't provide that level of detail on programs studied, so "they don't give the information that people need to make decisions that are local decisions, mostly."
"You add up all of these and you come out with the realization that this may not be the correct way to go about evaluating curricula," Smith said. "Curricula are not coming out every seven years; now they are changing or should be changed every year as publishers get new data and information."
The report argued that the clearinghouse should remove studies covering curricula that are no longer actively supported by publishers, as well as others that "do not provide useful information for WWC users." It called for the Institute of Education Sciences to partner with the Office of Management and Budget and the National Academy of Education to create a panel of evaluators, curriculum experts, and users such as teachers and principals to review the clearinghouse's standards.
The critiques show "a fundamental misunderstanding of how the What Works Clearinghouse works," said Ruth Neild, IES' acting director.
The clearinghouse is not meant to provide pros and cons for specific curricula, she said, but "to provide educators the best evidence that we have at a given time."
Neild argued that the reviews include as many details about the characteristics of the programs as possible, as well as warnings when a study was run by someone connected to the program developer—another critique in the report—but said the clearinghouse does not selectively retain or discard individual studies.
"It seems they want studies to be included only if they are implemented under ideal conditions and executed flawlessly," she told me. "We are reviewing studies that consumers might find if they did a Google search, or that a sales rep might present at a pitch meeting. We are serving a consumer protection role. If we followed their arguments to a logical end we wouldn't have any studies in the clearinghouse."
Joy Lesnick, an acting associate IES commissioner who administers the clearinghouse, added that she would not favor removing studies that include outdated curricula, because many districts continue to use them for years after a publisher stops supporting the materials. "If you are deciding between two curricula, and one you already have, you as a decisionmaker may not decide to change if you are paying for nothing more than updated pictures," she said.
However, both Neild and Lesnick agreed with Smith and Ginsburg that the clearinghouse would benefit from more detailed data on the performance of different subgroups of students and on the conditions under which different programs are implemented in different settings. IES is launching an associated site next fall to make it easier for practitioners to compare results across settings or student groups, such as English-language learners or rural students.
"I don't think any one study on its own is what's useful to a decisionmaker, but the summary and analysis of the research in the field," Lesnick said. "The What Works Clearinghouse is a tool for people to use, ... but alone, it doesn't tell you what to do; it can't possibly do that because everyone knows their individual contexts and priorities."