Students in some school districts spend 20 more hours annually on district- and state-mandated math and English/language arts tests than do their peers in other districts, a new study has found.
"The Student & the Stopwatch," released Wednesday by Teach Plus, examines the wide variations in the time spent on testing. Nationwide, it found that some districts spend five times more time on tests than others. Urban schoolchildren tend to spend more time on testing than those in the suburbs. And teachers say that testing costs them twice as much instructional time as their students actually spend taking the tests.
NOTE: Teach Plus conceded major errors in its report. See our blog post about their revisions.
Teach Plus, a nonprofit that works to strengthen the teaching corps in big cities, used the six districts where it works (Boston, Chicago, Indianapolis, Los Angeles, Memphis, and the District of Columbia) as focal points for the study, examining district and state testing calendars and interviewing teachers about their experiences with testing. The organization compared those districts' experiences to testing practices in several suburban districts around each of the focal districts. And it conducted similar analyses in six other big urban districts (Anchorage, Houston, Denver, Cleveland, Atlanta, and Baltimore).
The study examined testing in kindergarten, 3rd, and 7th grades. Of those, the 3rd and 7th grade tests are mandated by the No Child Left Behind Act, which requires annual testing in grades 3 through 8.
But the study didn't just focus on the state tests required by that federal law; it also examined the benchmark and formative tests required by districts. Those kinds of tests, given a couple of times a year or as often as every two weeks, actually take up more of students' and teachers' testing time (something the American Federation of Teachers reported as well, when it did a similar study last summer).
The Teach Plus study found a good deal of variation, though, in how much of schools' testing burden was state-imposed and how much was district-imposed. In Shelby County, Tenn., which includes Memphis, 3rd grade students spend 3.5 hours each year on state-required math and literacy tests, but six hours on district-required ones. That's the opposite of Baltimore, where 3rd graders spend nine hours on state tests and two hours on district tests.
In Chicago, for instance, students will spend 38.8 hours on state and district tests by 8th grade, while in Denver that total will top 159 hours. At the 3rd grade level, differences in testing time could amount to as much as three instructional days: each year in Chicago, children spend five hours on tests, while in Cleveland they spend 25 hours.
Interviewing more than 300 teachers in its six focal districts, Teach Plus found that "the technical administration time does not equal the amount of instructional time a teacher loses." At the 7th grade level, teachers spent 16.4 hours each year on tests, compared with 14.4 hours for their students. In 3rd grade, teachers reported spending 27.7 hours on testing, compared with 13.7 hours for their students.
The teachers' views were far from uniformly negative, however. While they did complain about how much time testing took from their instruction, and about an occasional lack of alignment between the tests and their curricula, they also found value in testing, particularly when it was part of a tightly coordinated approach that incorporated analysis of test results into their instructional planning and student monitoring, the study said.
So what does all this suggest for the policy debate about assessment? The way Teach Plus sees it, the conversation should shift from state-mandated testing to district-mandated testing, since that is what is taking up most of students' and teachers' assessment time.
Teachers' comments "make clear that when tests are properly used in conjunction with the curriculum, test-appropriate standards, and are part of a teacher's regular instructional practice, the amount of time allocated for testing becomes a less important factor," the study said. "Debating time-on-testing, then, without a discussion of the test type and content misses the point."
Some of the test features that teachers value most, such as essays and constructed-response items, take more time to administer, the study noted.