Student Achievement

Study Examines the Conflicting Findings on Effects of More School Time

By Samantha Stainburn | June 04, 2014

Does more school time improve student academic performance? It’s a simple question, but researchers have not been able to agree on an answer. Some studies have found that more instructional time does not increase academic achievement in developed countries; others, which have examined schools’ experiments with time, have found that it does.

In a study published in the May 2014 issue of the journal Educational Policy, Daniel A. Long, an assistant professor of sociology at Wesleyan University, assesses the conflicting findings on the effect of instructional time on achievement and expands on previous research by examining data from the 2006 Program for International Student Assessment (PISA).

Evidence of the effects of more time often comes from studies that look at “convenience samples” of a handful of countries or schools (like KIPP schools) and broad measures of instructional time (say, the length of the school year), Long writes in “Cross-National Educational Inequalities and Opportunities to Learn: Conflicting Views of Instructional Time.” He argues that this kind of evidence is not useful, in part because of selection bias, and should never be used as support for policy changes.

Other studies, like the 2000 PISA and the 1999 TIMSS (Third International Mathematics and Science Study), look at nationally and globally representative samples of countries, schools, and students, and at broad measures of instructional time. Data from these two studies imply that instructional time has no effect on achievement in developed countries and a limited positive effect in developing countries, Long notes.

Long conducts his own analysis of data from the 2000 and 2006 PISA. Compared with the 2000 PISA, the 2006 PISA surveyed more countries, featured a wider range of developing countries, and contained improved curriculum-specific questions about instructional time, he says.

His analysis of the 2000 PISA data agrees with other researchers’ findings: controlling for socioeconomic status, instructional time has a statistically insignificant or very small effect on student learning. In contrast, the 2006 PISA data show a strong, statistically significant effect of subject-specific instruction.

“An increased hour of math or reading instruction leads to an increase of about half a standard deviation in math and reading [achievement] at the country level,” Long writes.
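For context, PISA scores are scaled so that the average across OECD countries is about 500 points with a standard deviation of about 100, so an effect of half a standard deviation would correspond to roughly 50 points on the PISA scale.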

The Wesleyan scholar suggests that the difference in results could be due to a different—and, he says, better—measure of instructional time in the 2006 PISA survey, which asked students about the number of hours they spent in reading and math lessons. The 2000 PISA survey asked school administrators to report the average length of a class period and asked students to report the number of periods they spent on subjects per week.

“PISA 2006 results imply that increases in learning time could dramatically improve student performance,” Long writes. “It is possible that increased instructional time could be a tool to increase achievement and narrow educational inequalities both within and between countries.”

However, he adds, “to date, we do not have sufficient evidence to make definitive claims about the effect of instructional time.”

More studies are needed, he argues, particularly ones that also examine factors such as the time students spend actively engaged in learning, tracking, and perseverance.

“To determine the effectiveness of increasing instructional time and to avoid wasting resources on a potentially inefficient intervention, we need a much closer examination of the use of time during the school day in the United States and internationally,” Long concludes.

A version of this news article first appeared in the Time and Learning blog.