
Math Consultant: Smarter Balanced Math Tests Have ‘Egregious Flaws’

By Liana Loewus — March 10, 2015

In a recent critique of Smarter Balanced’s online practice tests for mathematics, educational consultant Steven Rasmussen argues that the assessments are “a quagmire of poor technological design, poor interaction design, and poor mathematical design” that will make it impossible to determine what students actually know.

The piece echoes, and in fact cites, some of the concerns math experts expressed to me last fall for an article on the testing platforms of both PARCC and Smarter Balanced. Those experts pointed out that the new computer-based tests often force students to write out their solutions rather than model answers with diagrams, graphs, flow charts, and other tools, as required by the Common Core State Standards for mathematics.

Representatives for PARCC and Smarter Balanced, the two federally funded consortia that created tests aligned to the common core, told me for that article that the tests are a huge step forward and have advantages over paper-and-pencil tests of the past.

Rasmussen is the co-founder of Key Curriculum Press, which publishes K-12 mathematics materials, and he has worked in math software development. In his 30-page critique, he examines 10 online practice items from the Smarter Balanced Assessment Consortium.

For instance, he points to a 10th grade problem from the practice test that asks students to show circles on a coordinate plane.

Rasmussen responds:

In general, asking students to show their work is a good way to understand their thinking. In this case, would anyone begin the problem by not sketching a picture of the circles? I doubt it. I certainly started by drawing a picture. A simple sketch is the most appropriate way to show one’s work. However, there’s just one major issue: There is no way to draw or submit a drawing using the problem’s “technology-enhanced” interface! So a student working on this problem is left with a problem more vexing than the mathematical task at hand—“How do I show my picture by typing words on a keyboard?” ...

Give me a grid and a circle-drawing tool if I am to show circles on a coordinate plane! Let me use a dynamic-dragging representation—like the one they misused in Question 1—to drag a dynamic circle to its correct position and size on a given coordinate system.

In truth, students are allowed to use—and should be given—scratch paper when taking the tests. But those papers are not turned in or scored. And students will only receive scratch paper if their teachers remember to hand it out.

Typing Equations

Rasmussen also rails against the test’s equation editor, a tool for writing numbers and mathematical symbols on screen. (It does not do calculations.)

“I wondered why there were so many buttons on it and what they each did,” Rasmussen writes. “The five arrow buttons above the numbers—three leftward and two rightward pointing arrows—look so similar that their actions can only be deciphered through trial and error.”

That seems, well, indisputable.

He also makes the point that the equation editor anchors number entry at the left side of the screen, the opposite of a calculator, which builds numbers from the right.

When I first tried the tool, I knew there was something disconcerting about it, but I couldn’t quite place what it was. Yes, you have to click on the numbers and can’t use the keyboard to input them. But I’m now thinking the orientation is what really feels weird. “Every school calculator builds numbers right to left for a reason,” writes Rasmussen. “With an input device that is harder to use than correctly adding the two numbers, what will we learn about a student who gets this question wrong?”
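Rasmussen’s right-to-left observation is easy to see in miniature. Below is a small, purely illustrative Python sketch; the function names and display width are my own inventions, not anything drawn from either consortium’s software. It prints what a display would show after each keypress under the two conventions:

DISPLAY_WIDTH = 8  # arbitrary width chosen for the illustration

def calculator_entry(keys):
    """Calculator convention: the entry is right-justified, so each new
    digit enters at the right edge and pushes earlier digits left."""
    shown = ""
    for key in keys:
        shown += key
        print(repr(shown.rjust(DISPLAY_WIDTH)))

def left_anchored_entry(keys):
    """Text-field convention, as described for the equation editor: the
    entry is anchored at the left and grows toward the right."""
    shown = ""
    for key in keys:
        shown += key
        print(repr(shown.ljust(DISPLAY_WIDTH)))

calculator_entry("407")     # '       4'  '      40'  '     407'
left_anchored_entry("407")  # '4       '  '40      '  '407     '

The resulting number is the same either way; the difference is entirely in what the student watches happen on screen, which is the unfamiliarity Rasmussen is flagging.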

Even so, I’ve written about the tests’ equation editors before, and generally math experts have told me that they’re clunky but not an insurmountable challenge. Students will be able to figure them out. That likely still stands. But for students who are already struggling with the math, will the clunky tools deplete their patience?

There’s much more to Rasmussen’s critique of the digital interface, including claims that there are inconsistencies in the tools, that the models force students to round answers inappropriately, and that the tests don’t “attend to precision” as the common core’s Standards for Mathematical Practice demand.

The fact of the matter is, starting today, Smarter Balanced tests are being administered to students in 18 states. And students in another 10 states plus the District of Columbia are beginning to take PARCC, which has similar technology features.

Among Rasmussen’s suggestions for how to handle this: “We can work to uncouple Common Core from the testing consortia and try to save the potential of CCSSM [i.e., the Common Core State Standards for mathematics], even while we let the tests, testing consortia, and their corporate partners crash and burn.”


A version of this news article first appeared in the Curriculum Matters blog.