Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

The 2012 RHSU Edu-Scholar Public Presence Rankings

By Rick Hess — January 04, 2012

Today, RHSU unveils the 2012 Edu-Scholar Public Presence rankings. The metrics, as explained yesterday, are designed to recognize those university-based academics who are contributing most substantially to public debates about schools and schooling. The rankings offer a useful, if imperfect, gauge of the public impact edu-scholars had in 2011, factoring in both long-term and shorter-term contributions. The rubric reflects both a scholar’s body of academic work--encompassing books, articles, and the degree to which these are cited--and their 2011 footprint on the public discourse. The following table reports the 2012 rankings.

2012 RHSU Rankings (chart)

Rankings were restricted to university-based researchers and excluded think tankers (e.g. Checker Finn or Russ Whitehurst) whose job is more focused on influencing the public discourse. After all, the intent is to nudge what is rewarded and recognized at universities. (The term “university-based” provides a bit of useful flexibility. For instance, Tom Kane currently hangs his hat at Gates, and Tony Bryk his at Carnegie. However, both are established academics who retain a university affiliation and campus digs. So they’re included.)

The scores reflect, in roughly equal parts, three things: articles and academic scholarship, book authorship and current book success, and presence in new and old media. (See yesterday’s post for the specifics.) The point of measuring quotes and blog presence is not to tally sound bites but to harness a “wisdom of crowds” sense of a scholar’s footprint on the public debate--whether that’s due to their current scholarship, commentary, larger body of work, media presence, or whatnot. We worked hard to be careful and consistent, but there were inevitable challenges in determining search parameters, dealing with common names or quirky diminutives, and so forth. Bottom line: this is a serious but inevitably imperfect attempt to nudge universities, foundations, and professional associations to consider the merits of doing more to cultivate, encourage, and recognize contributions to the public debate.

The top scorers? All are familiar edu-names, with long careers featuring influential scholarship, track records of comment on public developments, and outsized public and professional roles. In order, the top five were Linda Darling-Hammond, Diane Ravitch, Eric Hanushek, Larry Cuban, and Richard Arum. Darling-Hammond and Ravitch lapped the field, cracking 200 points on a scale where only a handful of scholars topped 100. Rounding out the top ten were Terry Moe, Paul Peterson, Pedro Noguera, Daniel Koretz, and David Cohen. Notable, if not too surprising, is that the top ten are all veteran, accomplished scholars. This reflects the nature of the scoring, which heavily weights the influence of a scholar’s body of work and not simply whether a scholar collected a bunch of press clippings or blog mentions in 2011.

Stanford University fared very well, claiming three of the top five scholars (and six of the top fifteen). Harvard University claimed four of the top fifteen, and NYU claimed another three.

By category: Darling-Hammond posted the top Google Scholar score, at 83; Cuban topped the books category at 37.5; Ravitch topped the Amazon rankings with a 19.7 and also posted the high score in the education press category, at 41.5; twelve scholars tied atop the blog-mention category by maxing out at the 50-point cap (without the cap, Hanushek would have taken that prize quite handily); and Arum topped the general press mentions with a 26.8.
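For readers who like to see the arithmetic, here is a minimal sketch of how a composite score along these lines might be tallied, in Python. The category names and the 50-point cap on blog mentions come from the figures above; the straight summation, the function, and the sample numbers are my own illustrative assumptions, not the actual scoring procedure laid out in yesterday's post.

BLOG_MENTION_CAP = 50  # blog mentions max out at 50 points, per the note above

def composite_score(scholar):
    # Sum one scholar's per-category points, capping blog mentions.
    # (Assumes a straight sum; the actual norming is described in yesterday's post.)
    return (
        scholar["google_scholar"]     # body of scholarly work
        + scholar["books"]            # book authorship
        + scholar["amazon"]           # current book success
        + scholar["education_press"]  # education press mentions
        + min(scholar["blog_mentions"], BLOG_MENTION_CAP)
        + scholar["general_press"]    # general press mentions
    )

# Hypothetical scholar: 72 raw blog mentions count as only 50 under the cap.
example = {
    "google_scholar": 83,
    "books": 20,
    "amazon": 10,
    "education_press": 30,
    "blog_mentions": 72,
    "general_press": 15,
}
print(composite_score(example))  # 208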

A number of top scorers, like Ravitch, have books of recent vintage. For instance, among the top ten, just in the past two years, Moe published Special Interest, his unflinching critique of teacher unions; Darling-Hammond published The Flat World and Education; Peterson published Saving Schools; Cohen published Teaching and Its Predicaments; and Noguera published Creating the Opportunity to Learn. And Arum doubtless benefited from the continuing outsized impact of his oft-cited Academically Adrift.

As with any such ranking, this exercise ought to be interpreted with appropriate caveats and caution. That said, it’s revealing that a number of sober, less-controversial scholars--like Arum, Cohen, Dan Koretz, and Bob Pianta--dotted the top twenty. Meanwhile, less senior scholars who punched above their weight included Roland Fryer, Sara Goldrick-Rab, and Patrick McGuinn.

Given that professional norms vary (note that few economists crack the top twenty), it’s interesting to eyeball the results discipline by discipline (admittedly, there’s a bit of fuzziness when it comes to pigeonholing some scholars). The top-ranked economists were Hanushek, Hoxby, Roland Fryer, Hank Levin, and Tom Kane. The top-ranked political scientists were Moe, Peterson, Richard Elmore, Mike Kirst, and Bruce Fuller. The top-scoring sociologists were Arum, Noguera, Gary Orfield, Adam Gamoran, and Tony Bryk. Top scorers in the area of teacher education and curriculum and instruction were Darling-Hammond, Gloria Ladson-Billings, David Berliner, Ken Zeichner, and Carol Tomlinson.

The emphasis accorded to an established body of work advantages senior scholars at the expense of junior academics. And, given that the ratings are a snapshot of 2011, the results obviously favor scholars who penned a successful book or big-impact study in the past year. But both of these dynamics accurately reflect how certain thinkers can disproportionately impact public discussion--so I’m disinclined to see such a “bias” as a problem.

There’s also the challenge posed by bloggers like Jay Greene, Goldrick-Rab, Bruce Baker, and Sherman Dorn, whose own blogs or think tank critiques mean they are publishing with great frequency. The key: the aim was not to measure how much a scholar writes, but how much resonance their work has. Counting only those blog entries and newspaper mentions in which a scholar is identified by university affiliation serves a dual purpose: it avoids confusion caused by common names and ensures that scores aren’t unduly padded by a scholar’s own blogging (since those posts generally don’t include an affiliation). If bloggers are provoking discussion, the figures will reflect that. If a scholar is mentioned sans affiliation, that mention is omitted here; but that’s true across the board. If anything, that probably tamps down the scores of well-known scholars for whom university affiliation may seem unnecessary. C’est la vie.
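To make the affiliation rule concrete, here is a toy illustration in Python: a mention is tallied only if it pairs the scholar’s name with a university affiliation. The function, its crude substring matching, and the sample sentences are stand-ins of my own, not the actual search procedure behind the rankings.

def counts_toward_score(mention_text, scholar_name, affiliation):
    # Tally a mention only if both the name and the affiliation appear.
    text = mention_text.lower()
    return scholar_name.lower() in text and affiliation.lower() in text

# Name plus affiliation: counted.
print(counts_toward_score(
    "Eric Hanushek of Stanford University argued that ...",
    "Eric Hanushek", "Stanford"))   # True

# A scholar's own blog post typically omits the affiliation: not counted.
print(counts_toward_score(
    "In my latest post I argued that ...",
    "Eric Hanushek", "Stanford"))   # False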

If readers want to argue the relevance, construction, reliability, or validity of the metrics, I’ll be happy as a clam. I’m not sure that I’ve got the measures right, that categories have been normed in the smartest ways, or even how much these results can or should tell us. That said, I think the same can be said about U.S. News college rankings, NFL quarterback ratings, or international scorecards of human rights. For all their imperfections, I think such efforts convey real information--and help to spark useful discussion. That’s what I’ve sought to do here.

I’d welcome suggestions regarding possible improvements--whether that entails adding or subtracting metrics, devising smarter approaches to norming, or what have you--along with critiques, concerns, and questions. Take a look, and have at it.
