Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

The Perils of Narrow Training in Education Research

By Rick Hess — September 17, 2018

When I talk to doctoral students in education policy these days, I’m frequently struck by two things: 1) the caliber of their methodological training and 2) how little time they tell me they spend reading, thinking, or talking broadly about education. When we chat about what they’re reading, their favorite books, or which thinkers have influenced them, the answers tend toward the narrow, ahistorical, and, well, thin.

I recall a recent dinner with a table of impressive doctoral students from a half-dozen different elite universities. We got to talking about all this, and they observed that there’s not really much room for history, reflection, or the education canon in their training—because it’s so focused on mastering analytic tools, crunching data, and getting up to speed on current research findings. Their mentors, whom they uniformly described as smart and supportive, are intent on preparing them for professional success, which means getting them involved in research that will let them publish enough impressive stuff that they’ll be viable on the academic job market.

This is a song I’ve heard many, many times now.

I’ve been surprised, for instance, at how many aspiring scholars don’t know that the ideas they’re researching have rich, useful histories—including teacher-quality scholars who’ve never read Dan Lortie’s Schoolteacher, been challenged to distill the lessons from the (failed) 1980s push for career ladders, or even been asked to reflect on what it truly means to be an “effective teacher.”

Does this actually matter? Yep. Too much emphasis on data-crunching and too little on history, context, and school culture can cause researchers to miss vital questions when trying to understand how or why some intervention did or didn’t work as intended. A narrow focus on what is readily collected and analyzed can make it all too easy to forget that measures like graduation rates are partial, limited, and sometimes deceptive proxies for the things we actually care about. A lack of awareness can lead researchers to overinterpret findings or to miss obvious caveats (such as taking care to point out that higher test scores might reflect better instruction, or increased test preparation, or a reallocation of instructional time away from untested subjects . . . and that determining the answer really matters).

Let’s be clear. I’m not saying that graduate training in education policy has gotten “worse”; in important respects, it’s gotten better. But I am saying that training has gotten unduly narrow, that this matters, and that something can and should be done about it. Now, I don’t see any bad guys here. Good methodological training is an important and time-consuming thing, and the push to publish is an understandable response to job market dynamics.

In a world awash in numbers and urgent fads, though, where foundations, advocates, superintendents, and policymakers are hungry for exciting ideas and “data-driven” answers, it’s vital that scholars be a bulwark of wisdom, historical memory, and thoughtful caution. I fear that the shift in preparation is leaving them ill-equipped to grow into such a role and that we risk raising a generation of education policy experts with a thin grasp of education. And that’s not good for anyone.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.