
What's an English-Proficiency Score Good For?


How students score in reading and writing on an English-language-proficiency test is a good indicator of how they will score on their state's reading, writing, and mathematics tests given to all students. That's the conclusion of a study of 5th and 8th graders who took the English-proficiency test developed by the World-Class Instructional Design and Assessment consortium, or WIDA. Researchers for the study, which was commissioned by the U.S. Department of Education's Institute of Education Sciences, found that students' scores in the reading and writing domains of the test were stronger predictors of how they did on regular academic tests than their scores in speaking and listening.

The 5th and 8th graders in the study were from New Hampshire, Rhode Island, and Vermont, which are all members of WIDA and thus use the ACCESS for ELLs English-proficiency test. Students in those states also take the same academic content tests, called the New England Common Assessment Program.

This is the first study I've seen commissioned by the Education Department that looks closely at the relationship between any of the new English-proficiency tests that were created to comply with the No Child Left Behind Act and states' regular academic tests.

Essentially, the findings of the study show that ACCESS for ELLs is working: it provides scores that are meaningful in determining whether English-language learners are ready to show what they know on tests designed for native speakers of English. I would think this would be good news for the 19 states that are members of WIDA and use that test. Of course, we have to keep in mind that the study, conducted by the Regional Educational Laboratory at Education Development Center Inc., looks only at students in three states.

It appears that the English-proficiency tests used in some other states may not be as good at gauging whether students are ready to do well on their states' regular tests. For example, in California in 2006, 66 percent of 10th graders passed the state's English-proficiency test at a level considered "proficient," but only 4 percent passed California's regular English-language-arts test. That information was just reported in a new book, "The Latino Education Crisis: The Consequences of Failed Social Policies," by Patricia Gandara and Frances Contreras.

I hope more researchers will start examining how scores that ELLs get on English-proficiency tests translate into how well they can do on regular content tests and in mainstream classrooms.


Our school gives the WIDA ACCESS in the spring, around the same time these same students take the SOL (Standards of Learning) tests. So, even if ACCESS is a good indicator, we would only see those results in June, shortly before we see the results of the actual SOL test. So many tests, so little time to over-analyze these poor kids.

Our state gives the WIDA ACCESS in January/February. We have to wait until the end of May before getting those results back, leaving a very short window of opportunity for teachers to hold LACs and make exiting decisions for the following year. Teachers barely have time to review it before the year is out.

I am not convinced that the value of the ACCESS is worth the time per student. When the results come back, it is usually information I already know from working with the students. Also, it is my belief that most of our native English-speaking students would struggle with the ACCESS. The pictures used are not conducive to stimulating language; most students would give only one-word answers. The "listening" portion of the test would be difficult at best for anyone, requiring them to pick out details after being given a long, sometimes confusing, paragraph. I would like to test the faculty at my school on the "listening" portion and see the results. As you can see, I do not have a lot of confidence in this assessment.

The ACCESS score is no help because the results are received after the CRCT has already been taken. The CRCT results would be a more valid way to anticipate the next year's CRCT, since that test is taken after the ACCESS. The length of time it takes to score the ACCESS makes it impractical to use as a predictor of success on the CRCT. Also, the CRCT tests specific goals that can be studied and prepared for; the ACCESS is much wider in scope and intentionally difficult.

In the last two years, all of my advanced 8th grade students have passed the CRCT language arts and most of them have also passed the writing test. Only two of them passed the ACCESS. I can conclude that the ACCESS requires a higher standard than the CRCT does.

While the ACCESS is not a fix-all test, it gives a hint about which areas need more emphasis. New test items are reviewed each year carefully, which hopefully weeds out some of the more confusing items. I was a part of the teacher review team last year. It is a good idea to email WIDA and let them know which items were considered confusing - as they do listen!

The use of language assessments as tests with quantitative scores that are used and misused for a variety of "accountability" purposes is yet another example of edumetrics run amok. Statisticians and administrators who know next to nothing about second-language acquisition and literacy learning are trying to interpret this test data and judge whether L2 proficiency scores "predict" achievement in language arts and content. However, they have no theory about what relationships might exist and whether or not these can be observed empirically. With native speakers of English, we assume that they all have an age-peer-equivalent knowledge of their L1 when they enter school. Do we therefore predict that, since they all speak English, they will all achieve at certain academic levels? So what are the assumptions when a student enters school not proficient in the language of instruction and of the tests, especially when the tests are designed to measure the academic achievement of native English speakers? On the one hand, the implications are so obvious as to be laughable; on the other, the misuse and misinterpretation of these assessments in measuring students' growth and schools' and programs' effectiveness are very serious and worrisome.

The fact is that there are different learning curves and time periods for listening and speaking skills versus reading and writing skills. A language assessment instrument cannot measure all four skills accurately, and these scores should not be used for making judgments about students' academic achievement in literacy and content. To use a popular analogy, does the size of the yield from an apple orchard predict the yield from an orange grove? Let's hold public policy makers and school officials accountable for the misuse of testing with English-language learners.

I have to agree that the test seems extremely difficult to "pass." Last year I had two sisters, seniors. When the ACCESS scores came back in May, only one had scored high enough to exit. Yet, they both passed the state's graduation tests in all four content areas as well as writing, and graduated cum laude, in the top 10% of their class. So are our state's requirements for graduating too easy, or is the ACCESS test too hard?

I am relieved to read some of these - especially d.sampler, who said "...most of our native English speaking students (and some faculty?) would struggle with the ACCESS". That is the joke at my school - most of the native English-speaking kids here would qualify for my ELL services, if I were to give them the WAPT (WIDA's placement test). It's nice to know my school is not the anomaly!

I say, the tests are too complicated and too numerous, and there's a vast shortage of common sense among the powers in charge.

I will not claim that the ACCESS is flawless, but here in Georgia we had been subjected to the terrible LAB assessment before.

Compared to the LAB, the ACCESS is much better. I don't think many native English speakers could have passed the LAB with flying colors.

In that context, I find the ACCESS to be much more user-friendly, and effective for assessing student thinking, comprehension, and writing skills.

I am not crazy about the listening section, because I think that it tends to be more about memory than ability to listen for key information.

I'm willing to wait for improvements in the test as the kinks are worked out.

Meanwhile, I would appreciate a faster turnaround time on the score results. The ESOL kids want to know where they stand before they leave for the summer!

I agree that the ACCESS test is user-friendly. From my experience working with the ACCESS, I can actually predict who will exit the program each year: those who score 5.5 or higher. The trend I have noticed is that the high-achieving students are the ones who exit, while the students with low reading and writing skills score low in those two domains.
I would like to see the results of the test come back earlier than May.

I am a K-2 ESL teacher. I spend more time testing students than teaching them. I would like to ask why letter sounds and reading are tested on the kindergarten test, while rhyming is not on the kindergarten test but shows up on the first-grade test. Every student I screen becomes an ESL student. I find this troublesome.

I am curious to know if ACCESS scores would be considered a factor in determining or justifying the allowance of accommodations on state-mandated testing. If students are scoring less than proficient in the areas of reading and writing on ACCESS, could this data be used to write in accommodations, or even waive the student's requirement to take other CRTs?

Are any schools using ACCESS scores for purposes other than ESOL eligibility? For instance, is anyone using ACCESS test scores to inform classroom practice, or to cluster-group ELLs in classrooms and facilitate push-in ESOL services?
