Dispute Over Validity of Common-Core Exam Ignites New Florida Testing Fight

By Andrew Ujifusa — September 21, 2015 4 min read

After high-profile controversies about the Common Core State Standards, PARCC, testing disruptions, and the number and length of tests, Florida has another assessment squabble on its hands. This time, it's a dispute over how well the Florida State Assessment matches Florida's standards, and how well it fits Florida's students.

On Sept. 1, the state education department released a study conducted by Alpine Testing Solutions and edCount of the state assessments given in the spring of 2015. The study was commissioned by the state legislature to determine whether the Florida State Assessment was, in short, “an accurate way to measure students’ knowledge of the Florida Standards,” as well as whether results could be used as a factor in teacher evaluations and school accountability. The state test Florida used this year was originally created by the American Institutes for Research for Utah.

While the study identified problems with the state's administration of the test, due to technological issues and the late delivery of some training materials, the Alpine and edCount study also gave general approval to the test items themselves, as well as to how the tests were constructed and scored.

Here’s a typical passage from the report’s conclusions, this one dealing with how tests were created:

“When looking at the process for the development of test blueprints, and the construction of FSA test forms, the methods and procedures that were followed are generally consistent with expected practices.”

You might remember that Florida tweaked the common core last year, but didn’t toss out the standards.

The report also stated that the test scores could be used for decisions about individual students (although not as the sole factor), as well as for evaluations and other accountability policies.

A ‘Slightly Different’ Criticism

If the Florida department thought the report’s release would make everyone happy, however, it was mistaken.

Some pointed out that the study said it had not reviewed all of the test items, as the Sunshine State News reported. Here’s a passage from the report that helped to trigger their criticism:

“Every item that appears on the FSA was reviewed by Florida content and psychometric experts to determine content alignment with the Florida standards; however, the items were originally written to measure the Utah standards rather than the Florida standards. While alignment to Florida standards was confirmed for the majority of items reviewed via the item review study, many were not confirmed, usually because these items focused on slightly different content within the same anchor standards.”

Democratic Sen. Bill Montford, who is also the CEO of the Florida school superintendents association and who isn't a fan of the state's approach to accountability, criticized the study, telling the Associated Press: "I don't believe that the students of Florida should be subject to a high-stakes exam that is slightly different than the standards. It can make the difference between a student passing and not passing."

There was enough negative reaction in Florida for the state department to release a statement on Sept. 18 designed to rebut claims like those Montford made. Saying that the full report must be read in order to understand what it says, the department issued responses to several claims it said were inaccurate. Here's an example:

CLAIM #1: Many test items on the FSA are not aligned with the Florida Standards.

FACT: The final report noted that “the majority of test items had exact matches with the intended Florida Standards” and that “for those that did not have an exact match, most represented a very close connection” with a slightly different standard (p. 37). For all but three out of the 386 total items, external reviewers identified connection to a standard that appeared on the Florida test blueprints, which define test content (p. 61 - 66). This affirms that the FSA accurately measures students’ knowledge of Florida’s content standards.

The department’s rebuttals also stress that Florida educators reviewed all assessment items prior to the test’s administration, to ensure that they were appropriate for Florida students, and that the items were also reviewed after the test.

During this year’s legislative session, lawmakers responded to backlash from schools and others by agreeing to put a cap on the amount of testing in a school year, among other changes. But the reaction to the Sept. 1 study casts doubt on the idea that assessment will stop being a political hot potato in the Sunshine State.

A version of this news article first appeared in the State EdWatch blog.