Future of Work

Can Artificial Intelligence Predict Student Engagement? Researchers Investigate

By Alyson Klein — May 06, 2019 6 min read

Imagine if artificial intelligence—the same machine learning Netflix uses to suggest new movies to customers based on past favorites—could show teachers that students are more engaged first thing in the morning, and then suggest relevant classroom management adjustments for different times of the day.

That’s what researchers at the University of Montreal, led by Thierry Karsenti, a professor and the Canada Research Chair on information and communication technologies in education, will be investigating. The university is partnering with Classcraft, a five-year-old education technology company based in Canada and the U.S. that offers what it describes as an “engagement management system” to help schools improve academic outcomes, school climate, and social-emotional learning. It is used in 75,000 classrooms in the United States and in 160 countries, according to the company.

The study is part of a broader trend to use software and data to help schools improve school climate and student well-being.

But that type of work brings with it a host of questions. Experts wonder if relying on the same software to track—and potentially course-correct on—social-emotional learning can work effectively for the wide range of students in U.S. schools. They’re concerned about the impact on a teacher’s practice, and on kids, if the software makes a wrong prediction. And of course, there are always qualms about student privacy.

The Classcraft program, created by Shawn Young, a former high school physics teacher, allows teachers to give groups of students “points” for positive behavior, helping them track traits like critical thinking, collaboration, and even empathy. The program makes that data available to educators. (Classcraft isn’t the only software program that works to improve school climate. ClassDojo, Class Charts, ClassMax, and Hero all aim to help teachers with classroom management.)

But right now, Classcraft’s software can’t predict what might happen in classrooms over time, or give teachers ideas on how to tweak their practice to boost engagement. And that information could prove very valuable to educators, Young said.

“What we want to be able to do is to help interpret that data and to be able to understand what’s happening in the classroom and from there to be able to make recommendations to teachers and principals and students about how they can course-correct their practices to have better outcomes,” Young said.

For example, he explained, the program may be able to tell educators: “We notice when you create this type of curriculum, or have this type of intervention in the classroom, engagement goes down for the next three hours. Maybe you should try this specific approach.”

The study will also examine the impact of making the prediction: Does student behavior improve over time when the teachers’ moves are informed by Classcraft’s suggestions?

Classcraft isn’t worried that teachers in schools that participate in the study are going to feel spied upon. Young said that in many schools, principals are already keeping a close eye on school-climate factors such as referrals and suspensions. He thinks Classcraft’s approach will give school leaders a window into the positive changes happening in teachers’ classrooms, instead of just “the worst stuff on the worst kids.”

But Robert Murphy, a senior policy researcher for the RAND Corporation, a nonpartisan research organization, said that the algorithms that AI-powered systems use to make suggestions and predictions aren’t always on the money. In fact, it’s considered pretty impressive if an algorithm like the one used by Netflix is correct 75 percent of the time. He worries about the impact of wrong predictions on classroom management.

“My feeling on these predictive tools is that it’s just one data point that teachers can use, and that’s fine,” said Murphy, who hasn’t examined the specifics of the research partnership. “These tools shouldn’t be used in making the ultimate decision” when it comes to classroom management, he said.

Young said he was aware of the potential for inaccurate predictions. That’s why Classcraft is participating in the study.

“This is why we need to do research and get it right,” he said. “Ultimately, the amount of data we have, as well as the robustness of the research partnership, will help us succeed.”

Another potential trouble spot is bias, according to Murphy’s RAND colleague, Osonde Osoba. For example, classrooms with mostly English-language learners may not respond to suggestions that grew out of data sets made up primarily of native speakers.

“No matter how large the data set, it’s going to be a challenge to fully represent all the types of classrooms where the model will be deployed,” said Osoba, an information scientist for RAND who has not examined the partnership. “As long as there is a difference in the types of classrooms it’s trained on, say, it’s trained on urban classrooms, if it’s deployed in classrooms that don’t look like that, say, rural classrooms, there is going to be a jump in errors in the validity of the model. That can lead to problems.”

He doesn’t see that potential problem as “insurmountable”—just something that the researchers should keep in mind.

Young said Classcraft doesn’t automatically collect information about particular students, in part to avoid profiling or identifying them without permission from school officials or parents. But, he added, if a district wanted to look at particular segments of students, it could do so to help identify biases in the data.

The research partnership, which is expected to last two or three years, won’t be able to predict trends on specific kids, just whole classes or groups of students. And all of it will remain anonymous and encrypted. Classcraft adheres to privacy laws in the states and countries in which it operates, Young said.

Still, similar efforts to monitor students’ social-emotional development have raised red flags among privacy advocates on both the political left and the right.

Karen Effrem, the president of Education Liberty Watch, a right-leaning organization, hasn’t taken a look at Classcraft and the University of Montreal’s partnership. But she’s been watching the trend of introducing social-emotional learning into technology more generally.

And the idea of collecting data about school climate doesn’t sit well with her. She worries it could be connected back to individual students, despite Classcraft’s assurances.

She isn’t comfortable with a machine making suggestions about how to improve student behavior. “This is all very concerning from both a data privacy perspective as well as just the accuracy of social-emotional assessment and data gathering,” she said.

Even so, the use of artificial intelligence in education is likely to grow. According to Global Market Insights, the AI market in K-12 is expected to expand from less than $200 million in 2017 to about $1.2 billion in 2024.

With that expansion in mind, research partnerships like the one between the University of Montreal and Classcraft could be a good first step toward determining how AI can work effectively and appropriately in K-12 classrooms, because many important questions still remain.



A version of this news article first appeared in the Digital Education blog.