
Experts Recommend Ways to Cut Costs on Randomized Trials

By Sarah D. Sparks — July 01, 2011

Randomized controlled trials are considered the “gold standard” in education research, and it can sometimes be difficult to tell whether that moniker refers to the quality of the experimental design or the sheer cost of putting one together. Yet with continuing pressure for rigorous studies, particularly evaluating new interventions, researchers are looking for creative ways to make RCTs more affordable.

At the National Board for Education Sciences meeting in Washington on Wednesday, members heard from two researchers with experience conducting experiments on a budget.

“The larger you get, the more complicated, and the more compromises you have to make along the way,” said Bridget T. Long, NBES vice chairman. “Often you think it’s going to be cheap and it’s not; you think it’s going to be simple and it’s not.”

RCTs are “a great thing, but expensive,” said Robert E. Slavin, chairman of the Baltimore-based Success for All Foundation. “To detect an effect size of .2”—generally considered the minimum significant effect size—“you need 40 schools, and it’s tough to recruit these schools.”
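School counts like the one Mr. Slavin cites come out of a standard power calculation. The sketch below is a minimal illustration, assuming a two-arm, school-randomized design with 60 students per school, an intra-class correlation of 0.15, a 5 percent significance level, and 80 percent power; all of these parameter values are assumptions chosen for illustration, not figures from the meeting.

```python
# Minimal sketch of the sample-size arithmetic behind cluster-randomized
# trials: the usual two-arm normal-approximation formula, inflated by a
# design effect for randomizing whole schools. All parameter values are
# illustrative assumptions.
from math import ceil
from scipy.stats import norm

def schools_per_arm(effect_size=0.2, alpha=0.05, power=0.8,
                    students_per_school=60, icc=0.15):
    """Approximate number of schools per arm needed to detect effect_size."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_students = 2 * (z / effect_size) ** 2       # per arm, ignoring clustering
    deff = 1 + (students_per_school - 1) * icc    # design effect for clustering
    return ceil(n_students * deff / students_per_school)

print(schools_per_arm())  # -> 65 under these assumptions
```

Under these assumed values the design needs roughly 65 schools per arm; in practice, evaluators adjust for baseline covariates such as pretest scores, which shrinks the effective intra-class correlation and helps explain how totals closer to Mr. Slavin’s 40 schools become feasible.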

Mr. Slavin proposed that the Education Department create a stable pool of potential RCT participants by requiring schools or districts requesting a competitive grant like the Investing in Innovation program—or even a formula grant such as the School Improvement Fund—to apply with a demographically similar partner school or district. Though both partners would be given money to implement whatever intervention was requested, one would be chosen to start implementing a year before its partner, creating a natural control group for testing the effectiveness of the intervention or program. The control school would still get money in that first year, Mr. Slavin said, but it would be used for “business as usual” during that one year.
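The mechanics of the pairing are simple to state in code. The sketch below uses made-up school names and shows the core of the idea under one natural reading of the proposal: within each matched pair, chance decides which partner implements first and which serves as the year-one control.

```python
# Sketch of the proposed pairing design: each applicant recruits a
# demographically similar partner, and a random draw decides which member
# of the pair implements in year one while the other runs "business as
# usual" as that year's control. Pair names are hypothetical.
import random

pairs = [("School A", "School B"), ("School C", "School D")]

random.seed(0)  # fixed seed so the example is reproducible
for pair in pairs:
    early, control = random.sample(pair, 2)
    print(f"{early}: implements in year 1; {control}: year-1 control")
```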

“I think this is feasible, but is likely to work more for more narrowly targeted interventions, like Read 180,” said NBES member Adam Gamoran, referring to the Scholastic Inc. reading program, which the Education Department recently evaluated. “This offers a way of getting more out of those evaluations. One could imagine a whole group of i3 [Investing in Innovation] interventions linked in this way.”

Eric Bettinger, an associate professor at Stanford University, called for better planning and collection of administrative data to stretch the dollars of an initial experiment across many years of follow-ups.

Mr. Bettinger recalled a recent series of studies of the effects of Colombia’s national secondary school voucher program. In the original study, Mr. Bettinger spent a year and a half tracking students using the vouchers, at a cost of $350 per student observation, and even then he was able to get only a 55 percent response rate. However, using administrative data painstakingly gathered on the children during the initial study planning, his team was able to follow up with more than 13,000 students, all of them for at least 12 months and many through high school graduation, at a cost of only $6 per student and about two months of data collection.
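The gap between the two approaches is easy to see with the figures quoted above. The sketch below works through the arithmetic under one assumption the article leaves open: that $350 is the cost per student targeted, so the effective cost per completed observation rises once the 55 percent response rate is factored in.

```python
# Back-of-the-envelope comparison of the two follow-up strategies, using the
# per-student figures quoted in the article. Treating $350 as the cost per
# student targeted (an assumption; the article does not specify) makes the
# effective cost per completed observation higher still.
survey_cost_per_student = 350    # field tracking, per student observation
survey_response_rate = 0.55      # share of targeted students actually reached
admin_cost_per_student = 6       # administrative-data follow-up, per student
admin_students = 13_000          # students followed via administrative data

cost_per_completed = survey_cost_per_student / survey_response_rate
admin_total = admin_cost_per_student * admin_students

print(f"Field survey: ~${cost_per_completed:,.0f} per completed observation")
print(f"Administrative data: ${admin_total:,} total for {admin_students:,} students")
```

Even setting the response rate aside, the quoted per-student costs differ by a factor of nearly 60.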

“It was one of the formative experiences in my life,” Mr. Bettinger recalled. “I think administrative data really holds the key to reducing costs in RCT evaluations.”

He suggested that education research grants include planning for the use of administrative data and information from state data systems at the start of designing new randomized controlled trials. Most studies now treat follow-up data collection as an afterthought, undertaken at the end of an experiment or months or years later, he said, which is more expensive and can be harder to do. Mr. Bettinger also argued that the Institute of Education Sciences should continue to train educators and researchers on how to protect student privacy while collecting and using such data.

IES has started to ask for such administrative data and partnering projects for some Education Department evaluations, such as those for the Striving Readers and Teacher Incentive Fund programs, according to Audrey Pendleton, the acting associate commissioner for evaluation at IES.

Yet Ms. Long cautioned that gathering data broadly and conducting large numbers of “scattershot” experiments based on administrative data or banks of RCT participants may be too expensive at a time when education is under tight budget constraints. “We have to ask, Is it okay to throw the coin even if we’re going to get zeros?” she said. “That’s why prescription drugs cost so much. They pass those costs on to the consumer, and in education we can’t really do that.”

A version of this news article first appeared in the Inside School Research blog.