
Smarter Balanced Field Tests Didn’t Mirror Classroom Learning, Study Says

By Catherine Gewertz — November 04, 2014

High school students who took the Smarter Balanced Assessment Consortium field test last spring found it far more difficult than younger students did, especially in mathematics, according to a new report.

The report released by the consortium last week includes some interesting “lessons learned” from the field test of 4 million students. It is based on responses from 19,000 students and 5,000 teachers and administrators. Because it includes responses from only 13 of SBAC’s 22 member states, and because each state created its own survey, the collected responses can’t be seen as representative of the consortium population. They do, however, provide valuable snapshots of the field-test experience from many corners of the country.

Echoing what EdWeek found in its own national reporting on the field test, 70 percent of the test coordinators in the seven states that responded to the SBAC survey said that the field test had gone as well as, or better than, they had anticipated. Many respondents also reported what we heard when we talked to educators giving the test: It required a daunting amount of keyboarding from young children who weren’t yet strong typists.

The section of the report that deals with the rigor of the test items offers new insights, though, into students’ experiences with the test. Here are a few key takeaways:

Students at all grade levels found the test more difficult than their state’s previous test. They found the Smarter Balanced items “challenging” and “really hard” and said they “took more thought to answer questions.” One 10th grader said the test was “hard” because “if you didn’t know it [the answer] you couldn’t guess” like you could on multiple-choice tests. One 6th grader said: “It’s the first test I’ve ever taken where I actually learned something while taking it.”

Students at higher grade levels found the test tougher than those at lower grade levels. In one state with a large number of survey responses, only 14 percent of the students in grades 3 to 5 said the tests were “very difficult,” compared with 46 percent at the high school level across three states. The high school students said the math portion of the test was the hardest. The extended reading and writing passages of the performance tasks came in for some complaints from students, with one saying, “It was sooooooo long.”

Most students reported that the test did not reflect what they learned in class. Three states asked students how well the test lined up with what they had learned in class. According to the consortium report, the share of students describing the test as “very well aligned” with classroom instruction ranged from 10 percent to 35 percent.

If the survey results even come close to being representative of the consortium’s membership, a whopping two-thirds of students could experience the test as not very reflective of what they learned in class. In explaining the gap, the report noted that many teachers are still learning to teach the standards.

“Assuming the assessments themselves are well aligned to the common core, this limited snapshot seems to indicate that there is still a tremendous amount of work to be done to deeply align classroom instruction with the standards,” it said.

Whether the gap stems from instruction that isn’t sufficiently aligned with the common core or from tests that aren’t fully aligned themselves, the findings offer a sobering picture of what students could experience when they take the operational tests this school year.

The report also covers other aspects of the field-test experience, such as schools’ technological readiness, administrators’ comfort level giving the test, and how the test interface worked for students. Take a look at the full report (linked earlier in this blog post) for those findings.

A version of this news article first appeared in the Curriculum Matters blog.