
Statistical Measures of Teacher Awesomeness: What Do Exam Scores Show, Exactly?

By Ilana Garon — June 26, 2013

Over the last couple of days, my innate awesomeness as a teacher--at least as far as my students are concerned--manifested itself in two ways:

1) In the last two days of school--mandatory attendance days despite the fact that the Regents are already scored, final grades were due last week, graduation takes place during school hours (which half the faculty can’t attend because they’re mandated to stay at school supervising students), and only about three kids show up from every class of 30--I provided the ultimate learning experience: multiple periods straight of “The Walking Dead” on our classroom Smartboard™, open to any kid who showed up. This came complete with class “discussion”: comments about foreshadowing whenever creepy music began playing, recognition of dramatic irony (we know Shane and Lori were a couple before Rick came back, but he doesn’t!), and lots of screaming whenever zombies showed up, which is basically every minute in that show.

2) Regents scores came out. A cool 97 percent of my students who sat the English Language Regents exam passed it. Now, to be fair, in 10th grade only the Honors class sits the Regents. (The rest of the kids wait until 11th grade.) This group also had the highest number of 10th graders with scores in the 90s, at least in my memory. The students in my Honors class thought this was very awesome news indeed.

Now, here’s the thing: No matter how many effusive messages I get from students saying, “OMG Miss Garon, you are the best ever--that’s why I passed Regents,” there is just no way I can take full credit for my students’ success rate as a class. I’m no statistician, so bear with me while I explain this: In spring 2012, my pass rate was only 79 percent. Conditions were the same: Only the Honors class took it, it was the same exam format, and we used the same review materials (copies of previous tests) both years. Did I randomly become 18 percentage points better at teaching over the course of one year? Unlikely, since I was doing most of the same things from year to year. Now let’s look at spring 2011. That year, my Honors class’ pass rate was similar to this year’s--97 percent. Interesting.
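To put a rough sketch behind that hunch: in a class of roughly 30 kids (the size mentioned above), an 18-percentage-point swing is only five or six students. The little simulation below is purely hypothetical--the class size, the “preparedness” numbers, and the “teacher effect” are all made-up assumptions, not my actual data beyond the quoted pass rates--but it shows how a teacher contribution that never changes can still produce pass rates that jump around by double digits from cohort to cohort.

```python
import random

random.seed(1)

CLASS_SIZE = 30          # assumed: roughly the class size mentioned above
TEACHER_EFFECT = 0.05    # assumed: a constant boost from instruction, identical every year

def simulated_pass_rate(cohort_preparedness):
    """Percentage of a simulated class of CLASS_SIZE students who pass."""
    p = min(1.0, cohort_preparedness + TEACHER_EFFECT)
    passes = sum(1 for _ in range(CLASS_SIZE) if random.random() < p)
    return 100.0 * passes / CLASS_SIZE

# Three hypothetical cohorts: same teacher effect, different incoming preparedness.
for year, prep in [(2011, 0.92), (2012, 0.75), (2013, 0.92)]:
    print(year, f"{simulated_pass_rate(prep):.0f}% passed")
```

In this toy version, the year-to-year swing comes entirely from the cohort parameter, since the teacher effect never changes--which is exactly the point.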

Rather than assuming that somehow I was a great teacher in 2011, then inexplicably deteriorated in 2012, and then picked it up again in 2013, it probably makes sense to look at outside factors--particularly for the class that didn’t do as well. For that group, Regents scores across the subjects have been lower. So it stands to reason that, due to whatever circumstances, that middle group either doesn’t test as well or is a bit slower than their peers. And that’s fine--every group of kids is different. Even though I write lesson plans out day-to-day (with the expectation of reusing them in the future), I always find myself amending classroom procedures based on the group of kids I’m working with. That’s the way it should be; good instruction evolves to meet the needs of the students. But by the same logic, the practice of measuring a teacher’s improvement by students’ test scores year-to-year must also be called into question.

My little Regents experiment of the past three years, however unscientific a model, shows me that students’ standardized test scores are a misleading indicator of a teacher’s performance. Quite simply, too much is out of a teacher’s hands. Just as I don’t blame myself too much for the 2012 Regents group scoring lower than their peers in 2011 or 2013, because that group is a bit slower, I also can’t pat myself on the back too much for this current group’s scores: They simply came in better prepared and sharper, as a group, than their predecessors. So while our time together during the school year was surely productive, there was also less I could have done to “mess them up.” This group would likely have done well no matter what teacher they’d had.

In the end, my showing of “The Walking Dead” and my Honors class’ Regents scores are similarly superficial measurements of my success as a teacher. Both are correlated with awesomeness, but neither is causally linked to it--something I wish our politicians would remember when evaluating teachers by test scores comes up for debate.

The opinions expressed in View From the Bronx: An Urban Teacher’s Perspective are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.