
What Did the NAEP Scores Mean?


Hi Deb,

Welcome home from Russia! Hope you got your 36 hours of sleep. Given the sad history of Russia over the past century, it will be difficult for its people to shake off the burden of so many decades of authoritarianism and totalitarianism. To be sure, one can find dissidents under Czarism and under Soviet rule, but not much of a democratic tradition. Even today, the idea of a free press and the checks and balances that we associate with a healthy democracy seem to be waning, and the forces of authoritarianism appear to be gaining ground.

I have learned over these many weeks of communicating with you that it is hopeless to try to persuade you that a national curriculum is not synonymous with totalitarianism. Many democratic nations in Europe and in Asia have a national curriculum and have not forsworn their democratic institutions or their freedom. A few years ago, researchers at the American Institutes for Research compared achievement on international tests in a dozen nations that participated in both TIMSS and PISA, and only two of them—Australia and the United States—did not have a national curriculum or national testing.

It is interesting and not surprising that you met democratic educators who are turning to private schooling. I had the same experience when I was in Poland in 1991. Some of the most ambitious reformers were opposed altogether to government-controlled education, what we call public education. In their experience, it was nothing more than a conduit for propaganda and mind control by the state. Even now, in this country, I have heard conservatives and libertarians refer to public education as "government schooling," which they do not mean as a compliment. Indeed, they use the phrase as a way to validate their belief in vouchers and any other funding mechanisms that allow children to escape the clutches of the government education system.

Your anecdote serves to bring home something that we have often discussed. Is public education in the United States still a cornerstone of our democracy? Would it be more or less democratic if there were common academic standards for the nation? Is it more democratic to promote public education or to promote alternatives to public education? Who should control the schools that educate the public's children?

These are questions that we will continue to debate and discuss, I am sure. Meanwhile, let me catch you up on what happened in your absence.

On Sept. 25, the National Assessment Governing Board released NAEP scores for 2007 in reading and math. The report was mixed, with scores up a bit in both subjects in fourth and eighth grades. As you can well imagine, the U.S. Department of Education trumpeted the gains and attributed them to No Child Left Behind, but the gains were really very modest. The full report is available on the NAEP website.

I read the reports, which I highly recommend, and this is what I found:

* Fourth grade reading scores were up by a modest two points from 2005 to 2007, from 219 to 221. In fact, the score for this grade on NAEP had already been 219 in 2002. The biggest increase in reading scores occurred between 2000 and 2002, when scores rose by six points. In other words, the gains since NCLB was enacted do not equal the gains recorded on NAEP in the years prior to NCLB (the arithmetic is summarized in the sketch after this list).
* Eighth grade reading scores were up by only one point. The trend line for this grade in reading from 1998 to 2007 is flat: the score was 263 in 1998 and 263 in 2007.
* Fourth grade mathematics scores increased by two points, from 238 in 2005 to 240 in 2007. The trend line in this grade points steadily upward, but the biggest gains occurred in the pre-NCLB period, when scores rose from 226 in 2000 to 235 in 2003.
* Eighth grade mathematics scores were up by two points, from 279 in 2005 to 281 in 2007. Again, the pre-NCLB gains were larger: scores increased from 273 in 2000 to 278 in 2003.
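
Since the comparison above is just arithmetic on a handful of published scale scores, here is a minimal sketch in Python that restates it. The scores and period labels are taken directly from the list; nothing else is assumed.

```python
# A rough sketch (not from the original post): compare pre- and post-NCLB
# gains using only the NAEP scale scores quoted in the list above.
gains_in_points = {
    "4th grade reading": {"2000-2002 (pre-NCLB)": 6,           # "went up by six points"
                          "2005-2007 (post-NCLB)": 221 - 219},
    "8th grade reading": {"1998-2007 (overall)": 263 - 263,    # flat trend line
                          "2005-2007 (post-NCLB)": 1},
    "4th grade math":    {"2000-2003 (pre-NCLB)": 235 - 226,
                          "2005-2007 (post-NCLB)": 240 - 238},
    "8th grade math":    {"2000-2003 (pre-NCLB)": 278 - 273,
                          "2005-2007 (post-NCLB)": 281 - 279},
}

for subject, periods in gains_in_points.items():
    for period, gain in periods.items():
        print(f"{subject:18} {period:24} {gain:+d} points")
```

Running it simply makes the pattern explicit: in fourth grade reading and in both math grades, the earlier gains are larger than the 2005-to-2007 gains, and eighth grade reading is flat over the whole period.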

After I read the NAEP reports, I wrote two articles. I referred to NAEP to debunk New York state's claims of historic eighth grade gains on its state tests in May and June 2007. On NAEP, the scores for eighth graders in New York state in both subjects were flat. The state assessment director wrote a letter saying that tests that sample students, like NAEP, are less accurate than those that test every single student. I suppose if the scores on NAEP had been good for New York state, the state Education Department would have been satisfied with its methodology.

Then on Oct. 3, I published an article in The New York Times arguing that NCLB was "fundamentally flawed" and should be radically overhauled. I pointed out in the opening paragraph that the test score gains before NCLB, as I showed above, were larger than those that have followed its implementation. I also said that "the main goal of the law—that all children in the United States will be proficient in reading and mathematics by 2014—is simply unattainable," and that this had never been accomplished in any district, state, or nation. My article prompted a response from Secretary of Education Margaret Spellings. Her letter cleverly elided my first point, that the gains in achievement were larger before NCLB, by stating that "students have reached all-time highs" in achievement (untrue for eighth grade reading, where the 2007 score is exactly the same as the 1998 score). She also cleverly revised the goal of NCLB: Instead of universal "proficiency," which is what the law calls for, she writes that the goal is to have "every child at grade level by 2014." Of course, if grade level is defined by a normed test, then by definition only half the children can possibly meet that goal!

I have been reading the debates in the press and on the blogs about what NAEP really says and what the scores really mean. I have been struck most forcefully by the fact that so many people who argue these questions have not read the NAEP reports, which are written in plain English, and have not seen the graphs, which are easy to interpret. Instead, they repeat what they have read and heard elsewhere. I would feel much better about the state of our democracy if everyone who opines made a point of reviewing the facts before offering an opinion about them. Commentators should be ashamed to recycle what they have heard instead of checking the source for themselves.

Diane

4 Comments

Diane is correct. NCLB testing has created a "moral hazard" in which states have a strong incentive to (in Lake Wobegon fashion) claim that all of their "kids are above average."

Mississippi claims that 88% of its 4th graders are proficient readers, Maryland 82%, and Massachusetts only 48%. So MS has the superior school system, yes? But the NAEP scores for these states are MS=18%, MD=32%, and MA=44%.

This is exactly what you would expect when you allow the states to design their own tests and set their own passing scores. Not one parent in 1,000 will be sophisticated enough to figure this subterfuge out on their own.
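
A minimal sketch of that comparison, using only the three pairs of figures quoted in this comment (state-reported proficiency versus NAEP, 4th grade reading), with the gap between them:

```python
# A rough sketch (not part of the original comment): state-reported
# proficiency vs. NAEP proficiency, 4th grade reading, as quoted above.
state_vs_naep = {
    "Mississippi":   (88, 18),
    "Maryland":      (82, 32),
    "Massachusetts": (48, 44),
}

for state, (state_pct, naep_pct) in state_vs_naep.items():
    gap = state_pct - naep_pct
    print(f"{state:14} state test {state_pct}%  NAEP {naep_pct}%  gap {gap} points")
```

The state making the most modest claim, Massachusetts, turns out to have the strongest NAEP results of the three, which is exactly the inversion the "Lake Wobegon" point is getting at.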

I think that it is time for a national "no-stakes" test. My proposal is that the NAEP be given to all kids and the results delivered to parents in an easy-to-understand fashion.

I am convinced that, if this were to happen, hundreds of different conversations would start in each and every district, and we could start to get serious about fixing our schools.

Too many states are currently administering "feel good" tests, compared with the federal National Assessment of Educational Progress, commonly referred to as the "nation's report card."

In 2005, Tennessee tested its eighth-grade students in math and found that 87 percent performed at or above proficient, while the NAEP indicated that only 21 percent of Tennessee's eighth graders were proficient in math. In Mississippi, 89 percent of fourth graders performed at or above proficient on the state reading test, while only 18 percent demonstrated proficiency on the federal test. In Alabama, 83 percent of fourth-grade students scored at or above proficient on the state's reading test, while only 22 percent were proficient on the NAEP. In Georgia, 83 percent of eighth graders scored at or above proficient on the state reading test, compared with just 24 percent on the federal test.

Oklahoma, North Carolina, West Virginia, Nebraska, Colorado, Idaho, Virginia, and Texas were also found to have inflated their determinations of proficiency compared with what the NAEP indicated.

These figures are documented evidence for a national curriculum with national standards and determinations of proficiency. The existing NAEP standards could serve as a more than adequate jumping-off point. I would prefer to see these national standards developed by a committee of the 50 state DOEs rather than by the federal DOE. The feds can simply observe the results and report them for all to see.

The answer to why there are such discrepancies between state and NAEP scores is simple: all you have to do is read the NAEP report itself, which states:

"NAEP's current achievement level setting procedures remain fundamentally flawed. The judgment tasks are difficult and confusing; raters’ judgments of different item types are internally inconsistent; appropriate validity evidence for the cut scores is lacking; and the process has produced unreasonable results."

In short, the NAEP test is seriously flawed.

The reason for the discrepancies between student performance on state assessments and on NAEP is not that "NAEP is seriously flawed." The discrepancies reflect the fact that we do not have a national curriculum. NAEP was designed to assess student knowledge and performance across jurisdictions, and it is necessarily broad in order to cover the various curricula across states.

If states adopted the NAEP framework as a replacement for their current state standards, NAEP would cease to be a low-stakes assessment.

Comments are now closed for this post.

The opinions expressed in Bridging Differences are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
