The New Stupid

A little while back, I published a piece titled "The New Stupid" in Educational Leadership. It's a piece that's perhaps more relevant today than when I wrote it, and one that folks continue to ask about. Anyway, given that things have slowed down for the pre-holiday week, I thought I'd share it over the next few days. So, here we go:

A decade ago, it was disconcertingly easy to find education leaders who dismissed student achievement data and systematic research as having only limited utility when it came to improving schools or school systems. Today, we have come full circle. It is hard to attend an education conference or read an education magazine without encountering broad claims for data-based decision making and research-based practice. Yet these phrases can too readily morph into convenient buzzwords that obscure rather than clarify. Indeed, I fear that both "data-based decision making" and "research-based practice" can stand in for careful thought, serve as dressed-up rationales for the same old fads, or be used to justify incoherent proposals. Because few educators today are inclined to denounce data, there has been an unfortunate tendency to embrace glib new solutions rather than ask the simple question, What exactly does it mean to use data or research to inform decisions?

Today's enthusiastic embrace of data has waltzed us directly from a petulant resistance to performance measures to a reflexive and unsophisticated reliance on a few simple metrics--namely, graduation rates, expenditures, and the reading and math test scores of students in grades 3 through 8. The result has been a nifty pirouette from one troubling mind-set to another; with nary a misstep, we have pivoted from the "old stupid" to the "new stupid." The new stupid has three key elements.

The first element of the new stupid is Using Data in Half-Baked Ways. I first encountered the inclination to energetically misuse data a few years ago, while giving a presentation to a group of aspiring superintendents. They were passionate, eager to make data-driven decisions and employ research, and committed to leaving no child behind. We had clearly left the old stupid in the rearview mirror. New grounds for concern emerged, however, as we discussed value-added assessment and teacher assignments.

The group had recently read a research brief highlighting the effect of teachers on student achievement as well as the inequitable distribution of teachers within districts, with higher-income, higher-performing schools getting the pick of the litter. The aspirants were fired up and ready to put this knowledge to use. To a roomful of nods, one declared, "Day one, we're going to start identifying those high value-added teachers and moving them to the schools that aren't making AYP."

Now, although I was generally sympathetic to the premise, the certainty of the stance provoked me to ask a series of questions: Can we be confident that teachers who are effective in their current classrooms would be equally effective elsewhere? What effect would shifting teachers to different schools have on the likelihood that teachers would remain in the district? Are the measures in question good proxies for teacher quality? What steps might either encourage teachers to accept reassignment or improve recruiting for underserved schools?

My concern was not that the would-be superintendents lacked firm answers to these questions--that's natural even for veteran big-district superintendents who are able to lean on research and assessment departments. It was that they seemingly regarded such questions as distractions. One aspirant perfectly captured the mind-set when she said, "We need to act. We've got children who need help, and we know which teachers can help them."

At that moment, I glumly envisioned a new generation of superintendents shuffling teachers among schools--perhaps paying bonuses to do so--becoming frustrated at the disappointing results, puzzling over the departure of highly rated teachers, and wondering what had gone wrong. This is what it must have been like to listen to eager stock analysts explain in 1998 why some hot new Internet start-up was a sure thing while dismissing questions about strategy and execution as evidence that the stodgy questioners "just didn't get it."

Then as now, the key is not to retreat from data but to truly embrace the data by asking hard questions, considering organizational realities, and contemplating unintended consequences. Absent sensible restraint, it is not difficult to envision a raft of poor judgments governing staffing, operations, and instruction--all in the name of "data-driven decision making."

We'll pick up the other two elements of the new stupid tomorrow.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.