
A Florida Teacher Explains Why She Thinks VAM is a Scam

Guest post by kafkateach, originally posted at her blog here.

In my previous post about VAM, I chose to discredit value added models on a purely humanistic level and to leave the attacks on the validity and reliability of the algorithm to the mathematicians. In the past week, I have learned so much about how VAMs will now be used in annual teacher evaluations in Florida that I can completely leave the pseudoscience behind the algorithm out of the equation. I do not even need to prove that value added models are merely overpriced tarot card readings to demonstrate the truly malicious nature of how these models will be used to evaluate teachers. The master plan to use value added models to dismantle the teaching profession and anyone's chance of ever collecting a pension again is truly brilliant in design. Keep reading to see the scam for yourself.

This week I was called in to sign off on the 50% of my evaluation that is actually based on MY performance instead of my students' performance. Teachers have been working this entire year without a clear idea of how they would be evaluated; even the administrators do not understand the new evaluation system. My administrator tried to tell me that the 50% of my evaluation based on "Learner Progress" would use the school-wide average because I teach Social Studies. She firmly stated that only Reading and English teachers would have their own students' FCAT gains count for their evaluation. I countered that it was my understanding that my assigned students would count for my "Learner Progress" portion because I teach 9th and 10th graders who take the FCAT reading and, after all, "we are all teachers of reading." She tried to look it up in her various district evaluation handbooks and couldn't find a clear answer. I managed to find it on the district website in about 5 minutes. We read it over and over again, and after about 10 minutes of deliberation decided I was right: my assigned students' gains or losses would be used. If the administrators who are explaining our evaluation system are giving out false information about which data will be used, how are teachers supposed to accurately understand their VAM score? Of course, the district and the state want to keep how VAM works as secret as possible. Once you see how VAM works, as I am going to explain, you will see why it's a total scam.

The administrator explained that the "Learner Progress" portion of my evaluation would not be available until August. Hmm...I got my students' FCAT reading scores in mid-May. I can see how much their reading comprehension grew or shrank under my tutelage, and if I had a few extra hours and the inclination, I could probably come up with a rough estimate of how much value I added on average. So why does it take three months for the district to come up with our value added rating? First red flag.
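The "rough estimate" mentioned above really would take a teacher minutes, not months. As a back-of-envelope sketch (the function name and the score data here are entirely hypothetical, and this is NOT Florida's actual VAM, which is a regression model with covariates and confidence adjustments), averaging raw year-over-year gains looks like this:

```python
# A back-of-envelope average of raw score gains for one teacher's roster.
# NOT the state's VAM model; names and numbers below are hypothetical.

def average_gain(prior_scores, current_scores):
    """Mean raw gain from last year's scores to this year's, per student."""
    gains = [cur - prior for prior, cur in zip(prior_scores, current_scores)]
    return sum(gains) / len(gains)

# Hypothetical FCAT developmental-scale scores for five students:
prior = [215, 230, 198, 240, 210]
current = [228, 235, 205, 238, 224]
print(average_gain(prior, current))  # mean gain across the roster
```

The point is not that this crude average is a defensible measure; it is that even a crude version is computable in May, which makes the three-month wait for an official number hard to explain.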

Next the administrator went through the 50% of my evaluation based on observable standards 2-8. I scored highly effective on 3 out of 7; the other 4 were rated effective. The only rating I contested was being rated effective on "communication with stakeholders." I thought, if anything, I had the documentation to prove what an amazing communicator I am with stakeholders. One week before it was due, we were told that we needed to provide evidence of parent contact from August to April. This would have been excellent to have known back in August. To tell us one week before it is due is either ineptitude or a deliberate "gotcha." Having seen how the public school system works, I will chalk it up to ineptitude. No administrator had ever bothered to explain our new evaluation system over the course of the year at one of our many faculty meetings, but they did manage to waste 30 minutes of our time at the last meeting raffling off pieces of chocolate and pens. Yes, future teachers of America, look inside your crystal ball and you will see yourself hoping and praying your name will be picked for a pen or piece of chocolate. Get out while you still can!!!

Now I was not thrown off by this "parent contact documentation from August to April" last-minute curveball. I pride myself on being one of the few teachers who consistently checks their district emails, responds to parent and student emails in a timely manner, and maintains an updated webpage with upcoming tests and homework assignments. I purposely saved all of my emails to parents and students during the course of the year for a moment like this. Sha-bam! You want proof of parent contact? Here's my 3-inch-thick printout of emails to students and parents, checkmate! Apparently that wasn't good enough. They checked off "effective." What a waste of toner (we have to pay for that ourselves these days)!

Overall, I scored a 43.5 out of 50 possible points. OK, that didn't sound too bad. Then came the math. The administrator did not go over the math, but luckily I accidentally signed the first copy with a black pen and got to keep it because it needed to be signed in blue ink (sometimes bureaucracy can work in your favor). I wanted to see what score I needed to be rated "highly effective" and whether I was at risk of being rated "unsatisfactory." The good? I wasn't in jeopardy of an unsatisfactory rating. The bad? I would have to win the VAM lotto if I wanted to be rated "highly effective." The ugly? The "effective" rating was only a 14-point spread, but the "needs improvement" rating was a 40-point range! I would be in jeopardy of being rated "Needs Improvement" if my VAM rating was "Needs Improvement," which would only give me 25 points; my total score would then be 68.5 when I needed a 74 to be rated "effective." I didn't immediately freak out because I had no idea what being rated "Needs Improvement" meant. And guess what? My administrator had no idea what it meant either.
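The arithmetic above is worth spelling out. This sketch uses only the figures stated in the post (43.5 observation points, 25 VAM points for "Needs Improvement," and a 74-point cutoff for "effective"); the variable names and everything else about the point scale are my own assumptions:

```python
# Checking the point arithmetic described above, using only the figures
# given in the post. Variable names are hypothetical.

observation_points = 43.5          # my score on observable standards, out of 50
needs_improvement_vam_points = 25  # VAM portion if rated "Needs Improvement"
effective_cutoff = 74              # minimum total needed for "effective"

total = observation_points + needs_improvement_vam_points
print(total)                       # 68.5
print(total >= effective_cutoff)   # False
```

In other words, under this scale a near-perfect observation score cannot rescue a teacher from a low VAM rating.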

So I went home that night, waking up at 3 am wondering, what does it mean to be rated "Needs Improvement"? My district's website said nothing. I finally found an answer on the Florida Education Association's webpage under frequently asked questions. Teachers rated "Needs Improvement" for 3 years in a row would have their continuing contract status (the closest thing we have to tenure in Florida) revoked, and they would be placed on annual contract. Once you are on annual contract, it doesn't really matter how you are rated because your contract does not have to be renewed.

On to the next frantic 3 am Google search, "value added ratings Florida determined." The St. Lucie County teacher performance system PDF gave me the final piece of the VAM Scam puzzle.

Read the following to yourself and see if your head doesn't start spinning.

For teachers with individual VAM estimates, once the state math and reading by grade files are received from FDOE, cut scores are determined by using the district mean for each grade by subject and comparing this mean to each of four calculations made for each teacher; 1) Teacher's VAM, 2) Teacher's VAM adjusted by a confidence level of .5 x standard error (SE), 3) Teacher's VAM adjusted by a confidence level of 1 x SE and 4) Teacher's VAM adjusted by a confidence 1.5 x SE (see Table 3 below) Teachers with all four calculations below the district mean will receive a student growth factor rating of "1" or Unsatisfactory. Teachers with all four calculations above the district mean will receive a rating of "4" or Highly Effective. Teachers with one of the four calculations greater than, or less than the district mean will receive a rating of "2" or Needs Improvement/Developing. All other teachers will receive a rating of "3" or Effective.

Did I mention before that I'm a history teacher and not a statistician? I did take statistics and probability in college almost 20 years ago. I didn't realize at the time that it would prove to be the most useful course I could take in preparation for a teaching career. I vaguely remembered that "mean" is a fancy word for average, and I remembered what a standard deviation was, although I wasn't quite sure what a "standard error" was. With the help of my husband and a few Facebook friends, we came to the determination that the further you are from the district average, the more likely you are to be rated unsatisfactory. Fair enough. The following sentence is so bizarre, however, that I swore it must be a typo: "Teachers with one of the four calculations greater than, or less than the district mean will receive a rating of "2" or Needs Improvement/Developing." So... I think that means that even if I score above the district average I can still be rated "Needs Improvement."

So why does it matter if most teachers are moved to annual contract status? Surely administrators and districts only want the "best and brightest" teachers in their classrooms? Surely you jest if you are at all familiar with school site and district staffing decisions. You're a veteran teacher with a PhD pulling in $70,000? You're fired. The district can get two new teachers for the price of one. Teach social studies and you can't coach a sport? You're fired. The Principal's nephew would like to try his hand at this teaching thing but there are no openings? You're fired. You're a woman in her reproductive years who keeps getting pregnant? You're fired. Dare to speak up at a faculty meeting? You're fired. Have your name mentioned on the front page of a major newspaper because you have 54 students in one period and your students are learning in conditions reminiscent of a Bangladeshi squatter encampment (see earlier blog post)? You're fired.

As someone who is not comfortable living life on my knees with duct tape over my mouth (you may have figured this out by now if you have been reading this blog for any length of time), I am not comfortable working on an annual contract. Teachers must be able to voice their concerns about administrative decisions that harm students without fear of losing their jobs. Eliminate continuing contracts, and a culture of complacency, sycophants, and fear will rule the schools. Senate bills passed in one Race to the Top state after another have included VAMs as a major portion of teacher evaluations, all in the name of "Student Success" and "Educational Excellence," when in reality they have been immaculately designed to end the teaching profession as we know it and free states and districts from career teachers with pension aspirations. Some may brush me off as your typical history teacher conspiracy nut, but my daddy didn't raise no sucker. VAM is a scam.

[note: an earlier version of this post suggested that "standard error" is subject to manipulation. It is not.]


What do you think? Is VAM designed to erode job security for teachers in order to make them easier to fire, and more compliant?


The opinions expressed in Living in Dialogue are strictly those of the author and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
