Assessment

Kindergarten-Readiness Tests: Navigating a Tricky Terrain

By Catherine Gewertz — October 15, 2014 3 min read

Pennsylvania is using a new kindergarten-entrance assessment this fall, and a review of the way it’s handling the project points to key challenges states face as they use tests to gauge what students know as they enter the K-12 system.

The question of how kindergarten-readiness tests are designed and used is only growing in importance as more states put such assessments into practice. As I reported in a recent story exploring Maryland’s new readiness test, the use of such tests is expanding, thanks in part to two U.S. Department of Education grant programs. But this trend has early-childhood educators worried—to say the least—and the report on Pennsylvania’s program illustrates some of the reasons why.

The “test” we’re talking about here is not a bubble sheet that 5-year-olds are expected to complete, or a list of questions they’re supposed to answer. The Kindergarten Entry Inventory, or KEI, is a set of 30 questions that teachers use to observe their students’ skills in five domains:

  • “approaches to learning”;
  • social and emotional development;
  • English/language arts;
  • mathematics; and
  • health, wellness, and physical development.

Pennsylvania teachers are using the test this year with all entering kindergarten students.

In its study of how Pennsylvania piloted the assessment in 2012 and 2013, Public Citizens for Children and Youth (PCCY) found a number of problems, and concluded that the state should be “much clearer” about what the test is for, and “take proactive steps” to make sure that it is used to enhance early learning. It must provide appropriate training and resources to enable teachers to use it properly, the advocacy organization said.

The report contends that Pennsylvania hasn’t done enough to convey that the kindergarten-entry test is meant to inform and shape instruction, and to offer relevant information to parents about their children’s learning. Instead, the state has spent more time communicating about what the test isn’t, the PCCY said.

Department of education training documents, for instance, say that the “Kindergarten Entry Inventory is not a screening tool; it is not a diagnostic tool and therefore, cannot be used to place a child into a program, class, or special education. It is not designed to replace existing assessments which have been designed for a specific purpose such as a diagnostic or screener, it is in no means designed as a high-stakes assessment of comparison among early-childhood programs.”

As I found in my reporting for the story about Maryland’s assessment, many educators and early-childhood activists start from a deeply skeptical place when they hear about kindergarten testing. When I accompanied teachers to a summer training about their new kindergarten-readiness assessment, the trainers had to overcome some deep doubts about whether the test was age-appropriate, and how the resulting information would be used. It was pretty clear that any new kindergarten assessment is going to have to come with a serious, sustained communication effort if it’s going to win the support of teachers.

The PCCY also dinged the state for failing to reach out to parents and early-learning programs to tell them about the new readiness assessment, even though these sectors have a powerful influence on children’s readiness for kindergarten. The PCCY found that little or no professional development was being provided to help teachers respond instructionally to their students’ varied needs in light of the test data. It called on the state to step up efforts to provide such training.

It also urged local administrators to make sure teachers have the time they need to use the assessment thoughtfully with each child.

A version of this news article first appeared in the Curriculum Matters blog.