Opinion

The Assessment Consortia: Field-Test Blues?

By Contributing Blogger — May 21, 2014

This post is by Joan Herman, co-director emeritus of the Center for Research on Evaluation, Standards, and Student Testing at the University of California, Los Angeles.

The media are full of stories about the PARCC and Smarter Balanced field tests. Depending on the writers’ perspectives and on whom they’ve most recently interviewed, the field tests are either going swimmingly or technological glitches are rampant and schools can’t cope. “Wow, this is hard!” seems the common student refrain. For some students, the exclamation marks despair; for others, it marks pleasure in being challenged to think and solve problems in new ways. The teacher perspective, too, is portrayed diversely: general support for the new college- and career-ready standards, but concern about teachers’ and students’ preparation for the new expectations; infuriation that results may be used prematurely to evaluate teachers and schools, set against enthusiasm for PARCC’s and Smarter Balanced’s demands for deeper levels of learning. What’s the truth of the matter? For now, it is “all of the above.”

The PARCC and Smarter Balanced field tests are accomplishments in their own right. To date, Smarter Balanced has engaged upwards of 4 million students across 22 states in its field testing, while PARCC expects participation of a million students in 15 states by the first week in June. The field tests epitomize enormous change on a number of fronts, and there is no doubt that change is difficult for nearly all of us, even absent political ramifications. For adults and children alike, the field tests demonstrate new, rigorous college- and career-ready standards; new technology-based testing platforms; and new types of tasks and technology-enhanced items designed to address the rigor of the new standards. Some questions have more than one correct answer; others ask students to construct an answer by “dragging and dropping” items, by underlining relevant evidence, and/or by creating a graph or a geometric figure. Video and audio clips are in evidence. Still other tasks ask students to construct a written response, for example to explain their reasoning, and extended performance tasks ask students to synthesize multiple sources and use evidence-based reasoning to solve complex problems.

Yes, the consortia field tests mark a lot of change at once, and change is hard. But recent national and international results remind us of the need for such change. The recently released NAEP report on 12th grade performance found that fewer than 40 percent of high school seniors are prepared for college. Bob Rothman’s recent blog post underscores particular areas of student weakness, for example, reading closely to find evidence that supports conclusions and nuanced interpretations, and mathematical modeling. These are exactly the areas where the consortia exams are poised to push us forward.

As we witness some struggles with the consortia field tests and the surrounding politics, it’s worth remembering the purpose of a field test: to test the accessibility, validity, and reliability of the exams and their delivery system; to detect and learn from difficulties encountered; and to improve the system. Today’s difficulties can point the way to better testing in 2015 and beyond, testing that supports deeper learning goals for students.

The opinions expressed in Learning Deeply are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.