How Other Countries Would Fare on the Nation’s Report Card

By Sarah D. Sparks — January 17, 2018

As U.S. educators and policymakers turn more attention to international tests to analyze American students’ progress, a new report suggests that the National Assessment of Educational Progress, otherwise known as the Nation’s Report Card, should be overhauled to look more like those global tests.

A new report, “How High the Bar?,” from the National Superintendents Roundtable and the Horace Mann League argues that NAEP sets a higher bar than international assessments and, as a result, may give the public an inaccurate view of American students’ progress.

The report compares the performance of U.S. students on NAEP with that of students in 40 other countries and education systems that took the Progress in International Reading Literacy Study (PIRLS) and 35 education systems that took the Trends in International Mathematics and Science Study (TIMSS). The analysis used previous federal research that created statistical crosswalks between NAEP and the two international tests.

It found that no country, including the United States, would have had more than half of its students overall meet the 2015 NAEP’s proficiency level for 4th grade reading. However, four education systems—England, Finland, the Russian Federation, and Singapore—would have had higher percentages of their students meet that level than the United States. For example, the share of 4th graders who would have met NAEP’s proficient level was 1 percentage point higher in England and 6 percentage points higher in the Russian Federation than the 31 percent of U.S. 4th graders who did so.

Japan, the Republic of Korea, and Singapore would have had a majority of students meet NAEP proficiency in 8th grade math; only Singapore would also have done so in 8th grade science.

“Many criticize public schools because only about one-third of our students are deemed to be ‘proficient’ on NAEP assessments,” James Harvey, the executive director of the National Superintendents Roundtable, said in a statement on the report. “But even in Singapore—always highly successful on international assessments—just 39 percent of 4th graders clear NAEP’s proficiency benchmark.”

Peggy Carr, the acting commissioner of the National Center for Education Statistics, which administers the NAEP, agreed that the assessment is tougher than international tests—in fact, differences in difficulty between the NAEP and PIRLS have revealed worsening achievement gaps in reading—but she disagreed with the Roundtable’s assessment that NAEP is an inaccurate gauge.

For example, while less than 40 percent of U.S. 4th graders overall performed at the proficient level on NAEP reading in 2015, nearly half of those who were not from low-income families reached the proficient level; in states like Massachusetts, 70 percent of 4th graders who were not from low-income families scored at the proficient level in 2011, 2013, and 2015. In NAEP math, 58 percent of 4th graders who were not low-income reached the proficient level in 2015.

“One of the claims they should be called out on is on the appropriateness of content in 4th grade reading,” Carr said. “They imply it is too difficult for 4th graders, and that is simply not true. ... We have to have a range, so students at the bottom of the range can interact with the text as well as students at the top level.” While the international tests are norm-referenced, using the average of achievement across countries to set a baseline, by law the NAEP is criterion-referenced, with questions set by the National Assessment Governing Board.

Common-Core Assessment?

The report also compared New York, Florida, and common-core state tests to NAEP. In 4th grade, for example, it found the Smarter Balanced assessment’s proficient level matched NAEP’s basic level in math and reading, while PARCC set its proficiency level close to “approaching proficiency” on NAEP in both subjects, and Florida and New York set their proficient levels at NAEP’s. That contradicts several previous studies mapping NAEP against state standards, which have found states overwhelmingly set their “proficient” levels near or below NAEP’s “basic” level of performance, not the “proficient” level. A new mapping study is expected late next month.

“These linking studies demonstrate the [NAEP] standards are not as high as we originally thought,” Carr said. “These are challenging standards, but they are not ‘grade level’ and I think that’s where the confusion starts.” She agreed with the report’s recommendation that NAGB should consider ways to clarify NAEP’s achievement labels, though she said she did not agree with its suggested new labels: low, intermediate, high, and advanced.

The report is likely to continue ongoing debates over how to use large-scale assessments under new state accountability systems.

“Teachers know that standards and assessments are essential in the pursuit of both educational excellence and equity. They also know there is an art and a science to this; benchmarks can’t be unreasonably high or unacceptably low. But, too often in the United States, assessment data are used not to inform or improve public schooling but to hold schools to account or, worse, to penalize them,” AFT President Randi Weingarten said in a statement on the report.

A version of this news article first appeared in the Inside School Research blog.