
What if Your Word Problems Knew What You Liked?

By Justin Reich — October 07, 2012 4 min read

I’ve been writing quite a bit recently about the power of the word “personalize” in education reform conversations and its many different meanings. To some, it means allowing students to proceed through content at their own pace. To others, it means allowing students to pursue their own interests. Those, I would say, are the two major ways “personalize” is used in our current discourse, but I stumbled across another definition worth discussing: personalization as modifying content to align with a student’s interests.

Last week, Education Week ran an article about a recent study from Southern Methodist University showing that students performed better on algebra word problems when the problems tapped into their interests. (The link to the full paper is here. Many thanks to lead author Candace Walkington for putting her paper in a publicly accessible form online.) The researchers surveyed a group of students, identified some general categories of student interest (sports, music, art, video games, etc.), and then modified the word problems to align with those categories. So a problem about the cost of new home construction ($46.50/square foot) could be modified to be about a football game ($46.50/ticket) or the arts ($46.50/new yearbook). The researchers then randomly divided students into two groups, giving one group the regular problems and the other group problems aligned to their interests.
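As a rough illustration of the mechanism, you can think of each item as a template whose mathematical structure stays fixed while the surface context varies by interest category. Here is a minimal sketch of that idea in Python; the template wording, the category names, and the `personalize` function are my own hypothetical constructions, not the actual instrument from Walkington’s study:

```python
# A minimal sketch of interest-based word-problem personalization.
# The categories and template wording below are hypothetical
# illustrations, not the actual items from the SMU study.

PROBLEM_TEMPLATE = (
    "{context_lead} Each {unit} costs $46.50. "
    "Write an expression for the total cost of n {unit_plural}."
)

# Same underlying math; only the surface context changes per category.
CONTEXTS = {
    "sports": {
        "context_lead": "Your team is selling football tickets.",
        "unit": "ticket",
        "unit_plural": "tickets",
    },
    "art": {
        "context_lead": "The art club is printing new yearbooks.",
        "unit": "yearbook",
        "unit_plural": "yearbooks",
    },
    "construction": {
        "context_lead": "A builder is pricing a new home.",
        "unit": "square foot",
        "unit_plural": "square feet",
    },
}

def personalize(student_interest: str) -> str:
    """Return a word problem matched to the student's interest
    category, falling back to the generic construction version."""
    context = CONTEXTS.get(student_interest, CONTEXTS["construction"])
    return PROBLEM_TEMPLATE.format(**context)

print(personalize("sports"))
# Your team is selling football tickets. Each ticket costs $46.50.
# Write an expression for the total cost of n tickets.
```

In each variant the student must produce the same expression (46.50n); only the story wrapped around it differs.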

The math was exactly the same, but the results weren’t. Students with personalized problems solved them faster and more accurately [see update below for more nuance], with the biggest gains going to the students who struggled most with the mathematics. The gains in the treatment group (those who got the personalized problems) persisted even after the personalization treatment ended, suggesting that students didn’t just do better on the personalized problems; they actually learned the math better.

Why did students do personalized problems faster and apparently learn more? It’s not totally clear. The study used a randomized controlled trial, a design that is powerful for testing comparative efficacy (did personalization work?) but often not very informative about process (why did it work?). Perhaps students were more interested in problems about things they cared about; perhaps the familiar vocabulary imposed less cognitive load, so they could focus more on the math; perhaps some combination; perhaps something else.

It’s interesting to imagine applications of this kind of content personalization outside of mathematics. Could students learn grammar better by decoding and writing sentences about their interests? Could students learn reading comprehension better if their reading passages were similarly aligned? Could students learn historical thinking skills, like those assessed in the Stanford History Education Group’s new Beyond the Bubble problems, better if the primary sources being analyzed resonated with student interests?

The general thrust of these findings isn’t totally new, of course: teachers have long known that tying academic content to students’ prior knowledge, passions, and hobbies helps keep kids engaged. Technology, though, provides a potential mechanism for making the match between curriculum materials and student interests much more granular: every student could get her own word problems to solve, or her own sentences showing vocabulary words in context.

After reading Walkington’s study, I’m left with two thoughts. First, I find the approach generally promising: use technology to help students learn new concepts within familiar contexts.

Second, as the word “personalization” continues to gain heft in educational reform discourse, it’s incredibly important in these conversations to identify exactly what people mean when they use the word. Allowing students to move through worksheets at their own pace is different from individualizing the content of worksheets, which is different from letting students decide that all of these worksheets are dumb and they’d rather do something else.

For regular updates, follow me on Twitter at @bjfr and for my papers, presentations and so forth, visit EdTechResearcher.

Update 10/8/12:

Ed Realist (if that is your real name...) raised questions about my characterization of the treatment group’s problem solving as “more accurate.”

The researchers tested two kinds of responses to the word problems (OK, really three kinds, but in two big categories): formulating an algebraic expression and answering the problem correctly. Students in the treatment group did better than the control group at formulating an algebraic expression and as well as the control group at solving the problem. In writing a short post, I summarized the gains across those two categories as solving the problems “more accurately.”
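To make the distinction concrete (my own illustration, reusing the $46.50 figure from above, not an item from the paper): given “Football tickets cost $46.50 each; how much do n tickets cost?”, formulating means writing the expression 46.50n, while solving means working with that expression, e.g., computing that $465 buys 465 ÷ 46.50 = 10 tickets. The treatment group’s advantage showed up at the first step, not the second.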

I agree with Ed, though, that it’s important to note that the gains were in translating the word problem into an algebraic expression rather than solving the algebraic expression. Ed and I probably hold different views of how important formulating the problem is, and his blog post expands on his perspective.

The nuance doesn’t change my conclusions:

1) Beware the multiplying definitions of personalization.
2) Putting new concepts in familiar contexts appears to help students learn some kinds of math tasks, and that’s kind of interesting (in the way that making modest gains on worksheet problems is interesting).

The opinions expressed in EdTech Researcher are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.