Assessment

Pitfalls and Progress in Moving from Paper- to Computer-Based Testing

By Sarah D. Sparks — April 28, 2018

Asking students and schools to switch to a new mode of assessment does affect students’ performance, but the problems may be more short-lived than some educators fear, finds a new study by the American Institutes for Research.

In the first year of online testing for the Partnership for Assessment of Readiness for College and Careers, or PARCC, researchers found that students in grades 3-8 who took the test online showed the equivalent of five months less academic progress in math, and as much as 11 months less academic progress in English/language arts, than students who took the test on paper.

“Switching to the online assessment led to much lower measured achievement,” said Benjamin Backes, senior researcher at AIR’s National Center for Analysis of Longitudinal Data in Education Research. Students of all demographic groups showed similar dips in achievement when they took computer-based tests, though students who were already struggling the most in reading showed bigger declines when the mode of the test changed.

Once schools and students had a year of online testing under their belts, though, the test-mode effects shrank by half in English/language arts and by a third in math, co-authors Backes and James Cowan of AIR found.

The study focused on Massachusetts, which allowed districts to volunteer to move to online tests in 2015 or 2016; about half of them—mostly high-performing districts—did so. In both years, this created pools of students taking paper-based and computer-based versions of the same tests, which the researchers used to compare how similar students at the same schools performed on different versions.
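To make the logic of that within-school comparison concrete, here is a minimal sketch in Python on synthetic data. Everything in it (the variable names, the simple specification, the size of the simulated mode effect) is an illustrative assumption, not the authors’ actual model, which controls for much more.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 40, 2000

# Synthetic data: each student attends one school and takes either the
# online (online=1) or paper (online=0) version of the same test.
df = pd.DataFrame({
    "school": rng.integers(0, n_schools, n_students),
    "online": rng.integers(0, 2, n_students),
    "prior_score": rng.normal(0, 1, n_students),
})
school_effect = rng.normal(0, 0.5, n_schools)

# Scores are built with an assumed online penalty of -0.15 standard
# deviations; the true PARCC effect sizes are not stated in the article.
df["score"] = (
    0.6 * df["prior_score"]
    + school_effect[df["school"].to_numpy()]
    - 0.15 * df["online"]
    + rng.normal(0, 0.7, n_students)
)

# School fixed effects (C(school)) absorb between-school differences,
# so the `online` coefficient compares similar students within schools.
model = smf.ols("score ~ online + prior_score + C(school)", data=df).fit()
print(model.params["online"])  # recovers roughly -0.15
```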

Backes noted that some questions were presented differently in the paper and online versions of the test.

“It’s pure speculation, but ... in paper format, it does look somewhat different,” Backes said. “For a reading passage, it’s just all in front of you on paper, but you need to use a scroll bar to go through it [online], so maybe it’s harder to refer to certain parts of the passage?”

Massachusetts’ students are perennial top performers on large-scale tests, from the consortium-based PARCC to the National Assessment of Educational Progress to international tests like the Program for International Student Assessment and the Trends in International Mathematics and Science Study. And the state’s Deputy Education Commissioner Jeff Wulfson said the state viewed the move from paper-and-pencil tests to computer-based assessment—a shift tests at all levels have made in the last five years—as a “natural progression.”

“We were aware of some of the downsides. I don’t think any of it really surprised us,” said Wulfson. The state chose to move to online testing as “a companion to the fact that we are spending a lot of time and effort putting computers into the classroom for instruction,” Wulfson said. “Kids these days are spending more time writing on computers than pen and paper.”

Massachusetts has since moved away from the PARCC, and it has corrected statistically for the mode effect when releasing results for its own online state test, dubbed MCAS 2.0. The researchers are continuing to track how students perform when moving from one online test to a new one, to see if the handicap for online testing continues to shrink.
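The article does not detail the state’s adjustment procedure. One common way to correct for a mode effect is linear equating, which rescales online scores onto the paper scale; the sketch below illustrates that general technique with made-up score distributions, not Massachusetts’ actual method.

```python
import numpy as np

def equate_online_to_paper(online_scores, paper_scores):
    """Linear equating: rescale online scores so their mean and
    standard deviation match the paper-based distribution.

    This is one standard mode-effect adjustment; the state's real
    procedure is not described in the article."""
    online = np.asarray(online_scores, dtype=float)
    paper = np.asarray(paper_scores, dtype=float)
    slope = paper.std() / online.std()
    return paper.mean() + slope * (online - online.mean())

# Hypothetical example: online scores run lower on average; after
# equating, they sit on the paper scale.
rng = np.random.default_rng(1)
paper = rng.normal(500, 50, 1000)
online = rng.normal(485, 55, 1000)
adjusted = equate_online_to_paper(online, paper)
print(round(adjusted.mean()), round(adjusted.std()))  # ~500, ~50
```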

“It’s just a reminder that we as a state should not be making any high-stakes decisions on the basis of one test where the mode effect could be within the margin of error,” Wulfson said. He noted that 98 percent of Bay State students will be learning and being tested on a computer by next year. “I think five years from now it will almost be a non-issue.”

In Backes’ view, there are three possibilities for the state’s next steps. If the decline is just caused by students adjusting to a new test, the mode effect will disappear over time. If a few items are inherently more difficult to do online, the test developer can adjust them.

But the researchers plan to conduct a predictive study as part of the next round of online assessment, to look at a third possibility: that the online tests are assessing slightly different and more complex skills than paper and pencil—and these differences actually make the tests a better predictor of students’ long-term academic achievement in the subject.

“Just because there are mode effects, that’s not necessarily a bad thing if they are telling us something about student performance or preparation that was not being captured before,” Backes said.

Bob Lee, the chief analyst for MCAS, noted that the state’s 2010 college- and career-readiness frameworks require students to be able to read and write online.

“We are seeing in the data our students getting used to writing online and our teachers are getting used to giving tests online,” Lee said, adding, “we’re learning more about the close relationship between keyboarding and literacy in early grades.”


A version of this news article first appeared in the Inside School Research blog.