Curriculum

Personalized Learning Based on Students’ Emotions: Emerging Research to Know

By Benjamin Herold — January 06, 2016

CORRECTION

The push to better personalize learning is moving full steam ahead, as seen in new efforts to dramatically expand the types of data collected in the classroom and to focus more attention on responding to individual students’ “mindsets,” non-cognitive skills, and emotional states.

Such trends are explored in depth in “Extending the Digital Reach: Schools Push Personalized Learning to New Heights,” a new Education Week special report to be published next week.

Among the many fascinating pieces of this work are the growing efforts to move “personalization” beyond just responding to students’ academic skill levels and interests.

One strand of such efforts has to do with developing students’ “non-cognitive competencies,” such as forming relationships, solving everyday problems, developing self-awareness, controlling impulses, and working cooperatively. This is an area of focus in the U.S. Department of Education’s recently released National Education Technology Plan, which highlights digital tools such as Ripple Effects, The Social Express, and SchoolKit. All can be used by schools to help assess students’ non-cognitive skill levels, then provide them with opportunities to practice.

Another strand has to do with the influence of emotions on learning. In our upcoming special report, I also take a look at the work of academic researchers like Sidney D’Mello, a professor of psychology and computer science at the University of Notre Dame, in Indiana. D’Mello is at the fore of efforts to build “intelligent” computer-based tools that are capable of automatically detecting and responding to an individual student’s “affective state.”

When we talked, D’Mello described the strong influence of students’ emotions and affect—such as interest, frustration, or boredom—on their cognition. Often, though, these factors are ignored, he said.

The systems he and others are developing attempt to do three things: detect physical markers that indicate affect, such as a smile, a furrowed brow, or dilated pupils; put those signals into context; and respond accordingly. The detection happens primarily through facial-recognition and eye-tracking technologies, which have increasingly been found to be reasonable proxies for other measures (such as skin conductance and posture analysis) done in laboratory environments.
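
For readers who think in code, here is a rough sketch of that three-stage loop. To be clear, this is my own illustration, not D’Mello’s software: the labels, thresholds, and helper functions are all invented stand-ins.

```python
# Illustrative sketch of the detect -> contextualize -> respond loop.
# All thresholds and rules are invented; real systems use trained models.

from dataclasses import dataclass

@dataclass
class Observation:
    smile: float           # 0-1 intensity from a facial-expression model
    brow_furrow: float     # 0-1 intensity
    pupil_dilation: float  # normalized eye-tracker reading

def detect_affect(obs: Observation) -> str:
    """Step 1: map raw physical markers to a coarse affect label."""
    if obs.brow_furrow > 0.6 and obs.pupil_dilation > 0.5:
        return "confusion"
    if obs.smile > 0.5:
        return "delight"
    if obs.brow_furrow < 0.2 and obs.pupil_dilation < 0.3:
        return "boredom"
    return "engagement"

def contextualize(label: str, seconds_on_task: float) -> str:
    """Step 2: the same signal means different things in context.
    A furrowed brow after minutes of stalling looks more like
    frustration than productive confusion."""
    if label == "confusion" and seconds_on_task > 120:
        return "frustration"
    return label

def respond(state: str) -> str:
    """Step 3: pick an instructional response for the inferred state."""
    responses = {
        "frustration": "offer a hint and an encouraging message",
        "confusion": "let the student keep working, then probe",
        "boredom": "switch to a more challenging problem",
        "engagement": "stay out of the way",
        "delight": "stay out of the way",
    }
    return responses.get(state, "do nothing")

obs = Observation(smile=0.1, brow_furrow=0.8, pupil_dilation=0.7)
state = contextualize(detect_affect(obs), seconds_on_task=150)
print(state, "->", respond(state))  # frustration -> offer a hint ...
```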

“Basically, the idea is to identify different affective states and use that information to get students on to a more productive strategy,” D’Mello said.

One way to help prompt such a change might be through “empathic mirroring”—essentially, programming an avatar to respond the way a good human tutor might when a student gets frustrated or bored, with some acknowledgement or sympathy. Other responsive strategies that researchers are exploring have more to do with recognizing confusion and using that information to provide students with further instruction, helpful hints, or encouragement. And some research has focused on the potential benefits of identifying and responding to the right kind of confusion—that mental place where we aren’t quite sure what’s happening, or we’re getting conflicting information, but we’re highly motivated to figure it out and make sense of things. Helping push students to and through that place can result in deep learning, D’Mello contends.
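
As a toy illustration of what empathic mirroring might look like under the hood, here is a sketch of an avatar that first acknowledges the student’s state, then chooses a pedagogical move. The phrases and rules are invented for this post, not drawn from any of the systems described here.

```python
# Hypothetical illustration of "empathic mirroring": acknowledge the
# student's affective state the way a human tutor might, then act.

import random

MIRRORING = {
    "frustration": ["This part trips up a lot of people.",
                    "That was a tough one."],
    "boredom": ["Let's try something with more bite."],
    "confusion": ["Something here doesn't quite add up, does it?"],
}

PEDAGOGICAL_MOVE = {
    "frustration": "give a concrete hint",
    "boredom": "raise the difficulty",
    # Productive confusion is left alone briefly so the student can
    # resolve the impasse; unproductive confusion gets a scaffold.
    "confusion": "wait, then scaffold if the impasse persists",
}

def avatar_turn(state: str) -> str:
    acknowledgement = random.choice(MIRRORING.get(state, ["Okay."]))
    move = PEDAGOGICAL_MOVE.get(state, "continue")
    return f"{acknowledgement} Next: {move}."

print(avatar_turn("confusion"))
```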

Very little of this kind of “affect-sensitive” computing is publicly available at the moment. Until recently, reliably detecting the kinds of physical cues D’Mello described was possible only in the lab, with lots of expensive equipment. With the rise of high-powered webcams and consumer eye-tracking technology, that is starting to change, but consumer products could be years away.

Privacy is also a major concern. In recent years, states have passed dozens of new laws aimed at better protecting students’ sensitive information, including in some cases sharp limits on the type of biometric data that affect-sensitive computing is dependent upon. For his part, D’Mello said that researchers never store “raw signals” and biometric markers, such as video of students’ faces, outside of a research environment. He also emphasized that work done in an academic context involves consent by subjects and extensive review by independent institutional review boards.

Even if such hurdles are overcome, widespread use of such tools could be a decade off. Still, the work of D’Mello and others is worth paying attention to, especially given the intense interest in finding new, non-academic ways to personalize student learning.

For those who want to learn more, here are quick summaries of four research papers on the subject in which D’Mello was involved. And keep an eye out for our special report next week.

1. I Feel Your Pain: A Selective Review of Affect-Sensitive Instructional Strategies

This paper by D’Mello and Nathan Blanchard (Notre Dame), Ryan Baker and Jaclyn Ocumpaugh (Teachers College, Columbia University), and Keith Brawner (U.S. Army Research Laboratory) was funded by the National Science Foundation and the Bill & Melinda Gates Foundation. It provides a great overview of the state of the work, reviewing six case studies of affect-sensitive systems that have actually been tested.

“It should be noted that the research on affective instructional strategies, especially those that have been systematically tested, is in its infancy,” the paper notes.

2. A Time For Emoting: When Affect-Sensitivity Is and Isn’t Effective at Promoting Deep Learning

Funded by the National Science Foundation and written by D’Mello and a team from the Institute for Intelligent Systems at the University of Memphis, this paper was presented at the 10th International Conference on Intelligent Tutoring Systems. It takes a deep look at a program called AutoTutor, which the researchers describe as “a dialogue-based [intelligent tutoring system] that simulates human tutors.” The system purports to automatically detect “learners’ boredom, confusion, and frustration by monitoring conversational cues, gross body language, and facial features.” A computer avatar then responds accordingly. According to the study, AutoTutor was more effective than a regular computer-based tutor for students without much pre-existing content knowledge, but the timing of when the avatar responded with support was crucial.
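
The paper describes combining several channels of evidence at once. One common way such systems do this is “late fusion”: each channel votes with its own probabilities for each affective state, and the votes are averaged. The sketch below illustrates the idea with made-up numbers and weights; it is not AutoTutor’s actual algorithm.

```python
# Late-fusion sketch: each channel (dialogue, posture, face) produces
# its own probability per affective state; a weighted average decides.
# All probabilities and weights below are invented for illustration.

STATES = ["boredom", "confusion", "frustration"]

def fuse(channel_probs: dict[str, dict[str, float]],
         weights: dict[str, float]) -> str:
    """Weighted average of per-channel probabilities; return top state."""
    scores = {s: 0.0 for s in STATES}
    for channel, probs in channel_probs.items():
        for state in STATES:
            scores[state] += weights[channel] * probs[state]
    return max(scores, key=scores.get)

channel_probs = {
    "dialogue": {"boredom": 0.2, "confusion": 0.6, "frustration": 0.2},
    "posture":  {"boredom": 0.5, "confusion": 0.3, "frustration": 0.2},
    "face":     {"boredom": 0.1, "confusion": 0.7, "frustration": 0.2},
}
weights = {"dialogue": 0.5, "posture": 0.2, "face": 0.3}

print(fuse(channel_probs, weights))  # -> confusion
```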

3. Automatic Detection of Learning-Centered Affective States in the Wild

OK, true confession: My favorite part of this paper is that the “in the wild” referred to in the title is a school computer lab filled with 8th and 9th grade students. Wild indeed!

This paper is by D’Mello and Nigel Bosch (Notre Dame), Baker and Ocumpaugh (Teachers College, Columbia University), and Valerie Shute, Matthew Ventura, Lubin Wang, and Weinan Zhao (Florida State University). It was presented at the 2015 ACM International Conference on Intelligent User Interfaces.

The focus is on determining whether “intelligent educational interfaces” are capable of detecting and responding to the affective/emotional needs of students in a real-world environment, where noise, movement, and the like present challenges not found in more controlled laboratory environments. The researchers found that an intelligent system attached to a popular learning game about Newtonian physics was successful “at levels above chance” at accurately detecting students’ off-task behavior, boredom, confusion, delight, engagement, and frustration.
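
A note on “at levels above chance”: researchers in this field typically quantify that claim with an agreement statistic such as Cohen’s kappa, which discounts the accuracy a detector would achieve by guessing. Here is a small, self-contained example with invented labels:

```python
# Cohen's kappa compares observed agreement between a detector and
# human coders to the agreement expected by chance; kappa > 0 means
# "above chance." The labels below are invented for illustration.

from collections import Counter

def cohens_kappa(human: list[str], detector: list[str]) -> float:
    n = len(human)
    observed = sum(h == d for h, d in zip(human, detector)) / n
    human_counts = Counter(human)
    det_counts = Counter(detector)
    # Chance agreement: probability both pick the same label at random.
    expected = sum(
        (human_counts[label] / n) * (det_counts[label] / n)
        for label in set(human) | set(detector)
    )
    return (observed - expected) / (1 - expected)

human    = ["bored", "confused", "engaged", "engaged", "confused", "bored"]
detector = ["bored", "confused", "engaged", "confused", "confused", "engaged"]

print(round(cohens_kappa(human, detector), 2))  # 0.5: above chance
```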

4. Frontiers of Affect-Aware Learning Technologies

Written by D’Mello and Rafael A. Calvo of the University of Sydney in Australia, this paper offers another overview of the field, which they say has been around for 15 years but only now has “tangible functional applications” coming online.

“There are two basic tenets of affective computing,” the authors write. “The first is that it is possible for a computer to sense human emotions and systems that detect and respond to users’ emotions can produce more engaging and fulfilling interactions. ... A second tenet is that intelligent systems that model emotions can make more effective human-like decisions compared to their purely rational counterparts.”

There are five areas of focus described in the paper: detecting and responding to boredom, confusion, and frustration; strategically induced confusion in learners; research on boredom and engagement; efforts to support the development of “pro-social” behaviors, such as resilience; and the move to conduct research in classrooms.

An earlier version of this story incorrectly identified the affiliation of Jaclyn Ocumpaugh. She is with Teachers College, Columbia University. This story has also been edited to clarify that researchers may store “raw” biometric data for research purposes.



A version of this news article first appeared in the Digital Education blog.