School & District Management

18,000 Would-Be Teachers Took The edTPA Last Year. How’d They Do?

By Stephen Sawchuk — October 19, 2015 3 min read

Pretty well, it turns out: Scores on the edTPA teacher-licensing exam appear to be on the rise, according to new research.

The report comes from the Stanford Center for Assessment, Learning, and Equity, the owner and creator of the edTPA. The exam is modeled on the certification process used by the National Board for Professional Teaching Standards and requires teachers to videotape part of a lesson, then analyze it, and submit lesson plans and other artifacts for review. (Worth noting: The test has competition from ETS, which has its own performance-based teaching exam, and from a third exam still under development.)

Here’s what we know from 2014, according to the SCALE report: More than 18,400 candidates participated in edTPA, and the overall average score was a 44.3, up a bit from 42.8 during the 2013 administrations. (The total number of points differs by certification field, but the majority of fields have 15 rubrics and a total of 75 available points.)

Passing rates on the exam are higher than SCALE predicted they would be back in 2013. About 72 percent of candidates hit the nationally recommended cutoff score of 42 points. By contrast, SCALE estimated in 2013 that only 58 percent of candidates would pass at that cutoff score. Whether that’s because more states now use the test for licensing decisions, because faculty and students have grown more familiar with the exam, or some combination of reasons isn’t entirely clear.

Meanwhile, most states have set the passing bar a bit lower, at a 41. Some states plan to revisit the cutoff score in upcoming years or are phasing it in over time.

Notably, candidates in different licensing areas performed differently on the exam; secondary and middle school teachers tended to do better, for instance. Use some caution in comparing, however, since some fields (e.g., visual arts) had far fewer candidates than others.

What’s up with special education, you ask? Good question. The edTPA analysts tried to figure out why the special education scores were lower but didn’t reach any firm conclusions. Possibly it’s because special education is, in theory anyway, deeply individualized and dependent on students’ IEPs, which complicates planning and instruction.

Here are a few other interesting tidbits I’ve pulled from the report.


  • Unsurprisingly, scores were higher in states where the scores “counted” towards licensing, averaging a 45 in those states compared to about a 43 in states where the exam was used informally by programs for improvement or other reasons.
  • On the 15 rubrics that make up a total score (each judged on a 1-to-5 scale), candidates did best at showing how they were “planning for content understanding,” with a mean score of 3.2. They struggled most with demonstrating “student use of feedback,” with a mean score of 2.5.
  • There were no significant differences in the performance of white and Hispanic candidates; African-American candidates scored, on average, 2.4 points lower than white candidates. Those differences are smaller than on other traditional teacher-licensing exams. (Disparities on licensing exams have been a big topic in the news lately.)

Meanwhile, make sure you check out a story I wrote on the edTPA for the latest Education Week print issue. (It went to press Friday and should be online soon.) It’s about a couple of new sites and other online resources offering “assistance” to teacher candidates planning to take the edTPA.

It’s perhaps not surprising that, as the stakes have gotten higher, such services are starting to emerge. But they do raise questions about the fine line between getting assistance and cheating, and about whether these new performance-based licensing exams are especially vulnerable to the latter, according to some teacher education faculty.




A version of this news article first appeared in the Teacher Beat blog.