
NYC's Trojan Horse

skoolboy has absolutely nothing of substance to say about Education Secretary nominee Arne Duncan, whom he has met exactly once. But he continues to mouth off about New York City's Teacher Data Reports, the NYC Department of Education's version of value-added assessment. Which are not to be used to evaluate teacher performance. But rather for instructional improvement. Excuse me, skoolboy has something in his eye.

It's hard not to view these Teacher Data Reports as a Trojan Horse. Just how is a tool that is designed for capacity-sorting supposed to function for capacity-building? After all, a teacher value-added measure might tell us something useful about which teachers are more or less successful in raising their students' test scores, but it tells us nothing about the specific instructional practices that account for their relative success.
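To make the distinction concrete, here is a minimal sketch of what a measure like this computes, under one common approach: regress students' current-year scores on their prior-year scores, then average each teacher's residuals. The teacher names and scores below are entirely hypothetical, and real value-added models are far more elaborate; the point is only that the output is a single ranking-ready number per teacher, with nothing in it about classroom practice.

```python
# Sketch of a residual-gain "value added" estimate. All names and
# numbers are made up for illustration; real models add many controls.
from statistics import mean

# (teacher, prior_score, current_score) -- hypothetical data
students = [
    ("Alvarez", 60, 72), ("Alvarez", 70, 80), ("Alvarez", 80, 86),
    ("Baker",   60, 64), ("Baker",   70, 73), ("Baker",   80, 82),
]

# Ordinary least squares with one predictor: slope and intercept.
xs = [s[1] for s in students]
ys = [s[2] for s in students]
x_bar, y_bar = mean(xs), mean(ys)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

# A teacher's "value added" is the mean residual of their students:
# how far above or below the predicted score they landed, on average.
value_added = {}
for teacher, prior, current in students:
    residual = current - (intercept + slope * prior)
    value_added.setdefault(teacher, []).append(residual)
value_added = {t: mean(rs) for t, rs in value_added.items()}

print(value_added)  # one number per teacher, nothing about practice
```

In this toy example, "Alvarez" comes out above the regression line and "Baker" below it, so the tool can sort the two. But it is equally silent about both classrooms: the number says nothing about what either teacher actually did.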

How are Teacher Data Reports supposed to improve instruction? In her videotaped comments to teachers, Amy McIntosh, the Chief Talent Officer at NYC's Department of Education, says, "These reports will provide information that will help teachers and school leaders gain insights about important aspects of a teacher's practice ... Whether individual teachers have a greater influence on the learning of some groups of students than on others ... Finally, we can see what teachers might benefit from development focused on, say, the needs of English language learners, and which teachers might be best positioned to lead that kind of professional development ... We also think they will ... help you think about how you can share the techniques you use with your colleagues in your school or across the city."

Hmm. So the specific strategies for improving teaching practice are what, exactly? Having more successful teachers lead the professional development of less successful teachers? Expert practitioners don't always make expert coaches. Hall-of-Fame pro basketball player Isiah Thomas--unquestioned as one of the best point guards of all time--was a mediocre coach for the Indiana Pacers and New York Knicks.

Here's why. Teaching is an extraordinarily complex activity, with teachers making thousands of decisions in the course of their work. Successful teachers make many good decisions and some bad decisions, whereas less successful teachers make many bad decisions and some good decisions. But the capacity to reflect on one's practice and figure out which of those decisions are good and which are bad is exceedingly rare, as is the capacity to share this knowledge with others. In the absence of this reflective capacity, we're all prone to attribute our successes and failures to our pet theories, which may or may not be correct. A Teacher Data Report that provides reassurance that a teacher is successful will only solidify and reinforce a personal folk theory about the reasons for that success.

Yet the Teacher Data Report provides no evidence whatsoever about why a teacher is successful--the many daily practices that promote student learning. And if a teacher's personal theory is inaccurate, then sharing it with others will improve neither instruction nor student achievement. It could even make things worse, by focusing attention on ineffective practices. A tool like the Teacher Data Report that claims to be useful for increasing teachers' capacity to teach students effectively, but is instead only useful for ranking teachers on their effectiveness, is a modern-day Trojan Horse.

For better or worse, the best PD has always been given by teachers. What gripes me is that the administration always launches some new PD initiative by blowing its own horn about how great the PD is, when it's the teachers who provide the materials and do all the presentation work. If you actually log on to ARIS or ACUITY, you'll find a structure in place but no content. There is no "there" there. When I logged on to see the test scores and transcripts of my students, I had a few minutes of "gee, that's interesting," but nothing concrete to use beyond what I'm already using.

"But the capacity to reflect on one's practice and figure out which of those decisions are good and which are bad is exceedingly rare, as is the capacity to share this knowledge with others."

I find this statement to be exceptionally inaccurate and uninformed. Schools that are premised on developing a strong and successful learning community engage in critical reflection on a regular basis. To think that teachers are so overwhelmed by the complexity of their day that it overshadows their ability to critically assess their practice (using a variety of measures, including their consistent use of formative assessment and data collection) is preposterous.

Effective teachers are well aware of their weaknesses and strengths, and work relentlessly to fortify their instructional practices. That's an inherent and integral responsibility of our job.

I believe you are taking the DOE's comments well beyond their intent. They implicitly suggest the role these reports may play; they are not mandates.

Yes, the Trojan Horse has arrived. But some of us do (even after a day of making thousands of decisions!) have the common sense and good fortune to understand the statistical limitations of such measures, as well as the beneficial information they may provide.

Should we designate an instructional leader on the basis of a single instrument? Of course not. But such a measurement (if consistent over time), combined with a variety of other evaluative tools, may help to identify those teachers who are positively affecting student achievement and, subsequently, the instructional practices they implement to generate such results. If such a teacher can effectively provide instructional support to their peers, all the better.

The effective writer, audience member and respondent today is expected to first commend the other party on a very interesting post and set of challenges.

There it is. Record my points.

Skoolboy is too hard on what may be a genuine quality-control program, or on valuable parts of one.
The Montgomery County (MD) Education Association operates a "Peer Assistance & Review" (PAR) program, staffed by trained peer evaluators on two-year full-time rotations, that offers a second, independent opinion alongside the administrators'. (www.mcea.nea.org)
Who knows whether it is the detailed, clinical, and articulate analysis (aided by videotape and other technology) that increases effectiveness, more than the support such programs provide? This skeptical numbers cruncher was impressed by a 20-minute presentation on PAR, one that earned a nod of respect and approval from Randi Weingarten. It exceeded my expectations for quality control and improvement in teaching.

Now, whether the MCEA (or the likes of the Gates Foundation) has an interest in coding the performances of teachers before and after peer assistance, and then finding improvement in (the crude and limited measures of) learning is something else. Maybe some teachers will keep their jobs after learning how to squelch a tic or habit that bothered an average of one child per year and stirred a parental complaint to the principal.

Hey Incredulous,

It's uncharacteristic for a skeptical numbers cruncher such as you to tout a program that appears to have no evaluative evidence about its effects. In any event, I wasn't talking about Montgomery County's Peer Assistance and Review program. Nor am I suggesting that a well-designed system of teacher peer coaching will be unsuccessful. What I am saying is that an initiative whose fundamental idea is "share what you think is working in your classroom with other teachers" is not a well-thought-out program of professional development.

The best PD I've ever had is from ASCD and is taught by current classroom teachers.

As exhausted as I may be by my "thousands of decisions" (could ya be any more condescending?), one would think self-preservation would dictate a bit of reflection on which of those decisions worked out OK.

"Feel-good, let's-share-what-works" PD is not effective. But when a group of teachers is deliberately investigating and developing a set of unified practices and sharing data on outcomes, the results are generally pretty good for the kids.
