
Make Your Voice Heard: ELLs and SDs on "The Nation's Report Card"


In the wake of the attention being paid to English-language learners these days (by this newspaper and others), as well as to students with disabilities, the public will have a chance over the next few weeks to influence an important policy affecting those students.

Two public hearings have been scheduled to discuss the options for testing ELLs and students with disabilities on the National Assessment of Educational Progress, or NAEP, commonly known as "the nation's report card."

The hearings, to be held Jan. 30 and Feb. 4, will focus on efforts to bring more uniformity to the rules governing when ELLs and students with disabilities can be excluded from the NAEP or receive special accommodations on it. Currently, states' policies on exclusions and accommodations are all over the map, and as a result, the numbers of students they choose not to test, or to offer special help, vary greatly. Those inconsistencies have led critics to question the legitimacy of NAEP scores in some jurisdictions.


The Jan. 30 hearing will be held in El Paso, Texas, at the University of Texas at El Paso, in the El Paso Natural Gas Conference Center on Wiggins Road, across from the campus library. The Feb. 4 event will be held in Washington, D.C., in the Great Hall of the Charles Sumner School, 1201 17th Street, NW. Both hearings start at 9:30 a.m. and run through mid-afternoon. A committee of the National Assessment Governing Board is hosting the hearings, and public input is welcome. More details are available at the NAGB web site. The board is also accepting written testimony, which you can e-mail to [email protected]

The governing board is mulling over a number of potential fixes to the exclusions/accommodations issue. These include setting uniform national policies for testing the students; altering the method for giving the NAEP through approaches such as "targeted testing"; adding "cautionary flags" if a jurisdiction's exclusion or accommodations numbers get too big; expanding or cutting the number of allowable accommodations; setting "reasonable" exclusion/accommodation rates, based on states' demographics and testing policies; and changing how exclusion and accommodation rates are reported to the public in NAEP reports.

It's a very tangled issue for the governing board, for several reasons: States and cities set their own policies on testing those populations, and many decisions are left to local education officials dealing with students' individualized education programs.

The surest sign of how much trouble this issue gives the governing board is the fact that they haven't found a solution yet. Maybe you can help them.

1 Comment

NAEP is supposed to track gains and losses in school achievement. That is a worthy goal. However, currently, it is difficult to do this fairly. The reason is that school achievement is strongly related to student background variables. One can use background variables to predict achievement on a test like NAEP. Background variables vary considerably from one school to another and from one district to another. Their distribution in the population can even change over time. This applies to the nation and to state NAEP achievement data. So how do we accurately estimate the effects of a national education law, such as No Child Left Behind, when the school population is in a state of change?

The answer is found in adjusting NAEP scores to reflect the change in the population. In 1990 there were relatively few ELLs and SWDs taking NAEP. In 2008, there were a large number taking the test. And education policy specialists would like to see more included in the annual NAEP assessment. So, in order to get an accurate measure of school achievement in 2008, one that we can compare with school achievement in 1990, we need to adjust both sets of scores (1990 and 2008) to account for the greater number of ELLs and SWDs in the 2008 sample. There are a number of statistical ways to do this. One is to report the scores for a standardized sample, along with the scores for the full national and state samples.
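To make the "standardized sample" idea concrete, here is a minimal sketch in Python of direct standardization: each subgroup's mean score is weighted by a fixed reference population mix rather than by that year's actual mix. All of the subgroup means, shares, and the helper function are hypothetical illustrations for this comment, not NAEP data or NAGB methodology.

# Illustrative sketch (not NAEP/NAGB methodology): standardize mean scores
# to a fixed reference population mix, so that year-to-year comparisons are
# not driven by shifts in who takes the test.

def mean_with_shares(subgroup_means, shares):
    """Weight each subgroup's mean score by the given population shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    return sum(subgroup_means[g] * shares[g] for g in shares)

# Hypothetical subgroup mean scale scores for two assessment years.
means_1990 = {"ELL": 215, "SWD": 210, "other": 260}
means_2008 = {"ELL": 222, "SWD": 218, "other": 266}

# Hypothetical actual shares of the tested population in each year.
shares_1990 = {"ELL": 0.02, "SWD": 0.04, "other": 0.94}
shares_2008 = {"ELL": 0.09, "SWD": 0.10, "other": 0.81}

# Fix one population mix (here, the 2008 mix) and apply it to both years.
reference = shares_2008
print(f"Unadjusted 1990 mean:   {mean_with_shares(means_1990, shares_1990):.1f}")
print(f"Unadjusted 2008 mean:   {mean_with_shares(means_2008, shares_2008):.1f}")
print(f"Standardized 1990 mean: {mean_with_shares(means_1990, reference):.1f}")
print(f"Standardized 2008 mean: {mean_with_shares(means_2008, reference):.1f}")

With these made-up numbers, the unadjusted means are nearly flat across the two years, while the standardized comparison shows a clear gain, which is exactly the kind of distortion the adjustment is meant to remove.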

I have never understood why this isn't done. We spend so much money on NAEP to accurately measure achievement, but we don't make appropriate statistical adjustments that allow one to validly compare achievement across years. This is something that NAGB should consider and require.

Can you imagine economists comparing the cost of living today with the cost of living in 1976? How would they do it? They would use constant dollars, i.e., dollars adjusted for inflation. That way they can compare the cost of a TV then versus the cost of a TV now and more accurately determine if the cost has increased or decreased.
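For that analogy, the constant-dollar conversion is a one-line calculation. The sketch below uses rough CPI-U annual averages and a made-up TV price, so the figures are illustrative only:

# Illustrative only: converting a 1976 price into 2008 dollars.
cpi_1976 = 56.9    # approximate U.S. CPI-U annual average for 1976
cpi_2008 = 215.3   # approximate U.S. CPI-U annual average for 2008

price_tv_1976 = 400.00                                  # hypothetical 1976 price
price_in_2008_dollars = price_tv_1976 * (cpi_2008 / cpi_1976)

print(f"$400 in 1976 is roughly ${price_in_2008_dollars:.0f} in 2008 dollars")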

