Teaching

Screens vs. Print: Does Digital Reading Change How Students Get the Big Picture?

By Sarah D. Sparks — May 16, 2016

There’s new reason to believe so-called “digital natives” really do think differently in response to technology: It may be “priming” them to think more concretely and remember details rather than the big picture when they work on a screen.

[UPDATE: For more on the digital-vs.-print reading debate, check out Larry Ferlazzo’s blogs over at Education Week Teacher.]

Among young adults who regularly use smartphones and tablets, just reading a story or performing a task on a screen instead of on paper led to greater focus on concrete details, but less ability to infer meaning or quickly get the gist of a problem, found a series of experiments detailed in the Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems.

Using a digital format can develop a “mental ‘habit’ of triggering a more detail-focused mindset, one that prioritizes processing local, immediate information rather than considering more abstract, decontextualized interpretations of information,” wrote researchers Mary Flanagan of Dartmouth College and Geoff Kaufman of Carnegie Mellon University.

The use of tablets and laptops has been central to many districts’ efforts to build so-called 21st-century skills in students, particularly analysis and critical thinking skills. Kaufman and Flanagan certainly hoped to build those skills when they piloted a game to teach public health, in which players work together to stop a fictional epidemic.

But something odd happened when the researchers compared students playing the board game version to those playing an app-based game.

“We noticed even playing in the same situation, they played differently. It was strange,” said Flanagan, a professor of digital humanities, film and media at Dartmouth. “The iPad players played faster; they didn’t talk to each other as much, even when they were on the same team.”

Moreover, the app-game players focused on dealing with immediate, local outbreaks of disease. Those playing the board game were more likely to keep a “big picture view” of which people outside an immediate outbreak might be vulnerable, and they worked together more as a team. Overall, the board game players were better at stopping the outbreak and winning the game.

Understanding the Details or the Gist

Puzzled, Flanagan and Kaufman decided to look at how people performed three other tasks on paper or screen:

  • Answering a standard survey on concrete versus abstract thinking.
  • Reading a David Sedaris short story and answering a pop quiz on its complicated family dynamics.
  • Reading a data table of car specs in a variety of areas to identify the best model to buy.

All of the participants were in their early 20s and on average reported using tablets daily or weekly. But those randomly assigned to the “digital” group were not performing tasks on an iPad; they were reading on a screen with no hyperlinks or other interactive elements, using identical material to that used by those randomly assigned to use print.

So did it really make a difference whether someone read a PDF on a screen or had it printed out? Yes.

In all three tasks, the paper users were significantly more “abstract” in thinking. Digital participants reported preferring concrete rather than abstract descriptions of a behavior—for example, “making a list” would be associated with “writing things down,” rather than the “getting organized” description preferred by those taking the survey on paper.

Digital participants scored higher than the paper participants, on average, in recalling details of both the story and the car data table. But they scored lower on questions about inferred relationships and meaning in the Sedaris story, and they were less likely to choose the car that was objectively the best buy.

How Do We Think During Data Overload?

That last finding in particular may sound strange, considering one of the most common uses of the Internet is for comparison-shopping and other rough data analysis. The researchers played on that, deliberately providing an overwhelming variety of data in the table and giving participants only a few minutes to choose a car. While 66 percent of those reading the data in print chose the best car, only 43 percent of those who read the data on a screen picked correctly.

“If you use a higher level frame of mind and focus on abstract processing ... you are better able to discern when aspects of the car are more important,” said Kaufman, an assistant professor at Carnegie Mellon’s Human-Computer Interaction Institute.

The findings align with other emerging research on how students process information differently in print and digital forms. A 2014 series of experiments found that while taking more notes overall was better than taking fewer, students who typed notes on their laptops rather than writing them on paper tended to take down information verbatim rather than summarizing concepts, and the more students wrote verbatim, the less they remembered a week later.

“Even if you aren’t multitasking, your brain has become so accustomed to operating in that frame of mind in that context that it automatically activates this mental script, we believe, that’s leading to these differences,” Kaufman said.

Can Teachers Use the Best of Both Worlds?

The researchers tried to get digital readers out of thinking so concretely in a separate version of the car-buying task. In this case, they again assigned participants to view car models’ specifications on a screen. But first, they asked a third of the young adults to think about why they would solve a problem—a way to trigger an abstract frame of thinking—and asked another third to think about how they would solve a problem—designed to prime them for concrete thinking. The third group had no priming.

Of the digital readers who had been primed to think abstractly, 48 percent chose the correct car—significantly more than the 25 percent of digital readers primed to think concretely, and more than the 43 percent of digital readers who chose correctly in the previous car data experiment. But that was still far fewer than the 66 percent of print readers in the prior experiment who identified the best car.

“We don’t really know why certain effects are happening, but they are happening, and it is going to be hard to get people to talk about it because it sounds like such a Luddite response to technology,” Flanagan told me. “We just have to see that as partially a design problem. In 20 years we might actually design systems to help us with this.”

For example, she said, teachers should consider the format of information when designing different types of activities, to help students focus on details or overall themes.

“There are many times when you are trying to get students to compare facts and figures. If you are making a timeline for World War II, it might be really great to have digital technology to optimize comprehension of details like that,” Flanagan said. “But when you want students to think more analytically about something, especially when you are asking something new—‘How would you compare what happened in World War II to the Gulf War,’ say—that may require a shift in thinking. You may want to get off the screen for that for a little while.”

Hybrid activities also may help. A separate team at Carnegie Mellon last year found that children learned physics concepts better by making predictions and physically shaking an “earthquake table” than by using an earthquake simulator app that also allowed them to shake the tablet. Nesra Yannier was the study’s lead author.



A version of this news article first appeared in the Inside School Research blog.