Personalized Learning

Harvard Researchers Probe Student Time Spent Using DreamBox Math Software

By Leo Doran — June 08, 2016 2 min read

The more time students spent using the popular DreamBox blended-learning math software, the greater progress they made, according to a study of standardized test scores and student-usage logs by Harvard researchers.

But students generally failed to meet recommended targets for weekly usage, the university's Center for Education Policy Research found in a review of data from roughly 3,000 students in grades 3-5 enrolled in Maryland's Howard County district and in California's Rocketship charter system over the 2013-2015 school years.

While DreamBox, based in Bellevue, Washington, recommends that students spend 60 to 90 minutes using the software each week, researchers found that students averaged 35 and 44 minutes per week in the Maryland and California systems, respectively.

“Districts and CMOs [charter management organizations] are getting a lot of use out of this software,” said Rodney Hughes, the research manager at the Center for Education Policy Research who oversaw the study, in a telephone interview. “It’s important to know if these software platforms are working.”

According to the study, a student who began at the 50th percentile on California’s state standards test, and who spent the average 44 minutes with the software per week—still below the company’s recommendations—tended to score in the 54th percentile by the end of the year.

DreamBox CEO Jessie Woolley-Wilson attributed the promising results to an instructional design that she says “always had close partnerships with teachers at the core.” DreamBox offers professional development services that accompany its software.

The Harvard researchers also found that school- and teacher-level decisions—and not student preference—had the greatest impact on how much time students actually spent with the software. Hughes suggested that many of the teachers they surveyed had competing curricular priorities. While using DreamBox does appear to benefit students, the study didn’t measure whether using the software was more effective than other, more traditional, academic activities.

The study was also not designed to determine causality; students who spent more time on the software could be naturally inclined to study more in general, or teachers who push the software might also be more effective in developing skills through other activities.

Nevertheless, Woolley-Wilson called the study a “clarion call to all blended learning providers.” The study is the latest piece of encouraging research on technology in the classroom, after years of inconclusive data and skepticism.

Hughes, of the Center for Education Policy Research, described the overall results as “positive” and “encouraging” for the company and for adaptive-learning proponents. He cautioned, however, that more evidence needs to be gathered, and that other ed-tech platforms might be more or less effective than DreamBox’s.

The Center for Education Policy Research is launching a Proving Ground initiative, in which the research center plans to expand its surveys of products in the marketplace.

There is more to learn, Hughes said, not only about “what features are driving student achievement” within platforms like DreamBox, but also about how the tools are being woven into the curriculum—or, what kinds of implementation decisions teachers and school officials are making.

A version of this news article first appeared in the Digital Education blog.