
The 2015 RHSU Edu-Scholar Public Influence Rankings

Today, we unveil the 2015 RHSU Edu-Scholar Public Influence Rankings. Simply being included among the 200 ranked scholars is an honor, given the tens of thousands who might be included. The list of qualifying scholars includes a qualitative component, though the actual scores are composed entirely of quantitative metrics. The rankings include the top 150 finishers from last year's rankings, along with 50 "at-large" nominees chosen by a selection committee of 31 automatic qualifiers (see yesterday's post for all the requisite details).

The metrics, as explained yesterday, recognize university-based scholars in the U.S. who are contributing most substantially to public debates about education. The rankings offer a useful, if imperfect, gauge of the public influence edu-scholars had in 2014. The rubric reflects both a scholar's body of academic work—encompassing the breadth and influence of their scholarship—and their footprint on the public discourse last year.

Here are the 2015 rankings (click chart for larger view). Please note that all university affiliations reflect a scholar's institution as of December 2014.

[Chart: The 2015 RHSU Edu-Scholar Public Influence Rankings, top 200]

Only university-based researchers are eligible for the rankings. The rankings don't include full-time think tankers, advocates, and such. After all, the point is to encourage universities to pay more attention to the stuff of scholarly participation in the public square. (The term "university-based" provides some useful flexibility. For instance, Tony Bryk currently hangs his hat at Carnegie. However, he is an established academic (at Stanford) with a university affiliation. So he's included. The line is admittedly blurry, but it seems to work reasonably well.)

No exercise of this kind is without complexities and limitations. The bottom line: this is a serious but inevitably imperfect attempt to nudge universities, foundations, and professional associations to do more to cultivate, encourage, and recognize serious contributions to the public debate.

The top scorers? All are familiar edu-names, with long careers featuring influential scholarship, track records of comment on public developments, and outsized public and professional roles. In order, the top five were Diane Ravitch of NYU, Linda Darling-Hammond of Stanford, Howard Gardner of Harvard, UCLA's Gary Orfield, and Harvard's Paul E. Peterson. Rounding out the top ten were Andy Hargreaves of Boston College, Arizona State's David Berliner, Stanford's Larry Cuban, Yong Zhao of U. Oregon, and Arizona State's Gene V. Glass. Notable, if not too surprising, is that the top ten are all veteran, accomplished scholars who have each authored a number of (frequently influential) books, accumulated bodies of heavily cited scholarly work, and who are often seen in the public square working with state and district leaders. That reflects the intent of the scoring rubric, which weights the broad, lasting public influence of a scholar's work much more heavily than a short run of ephemeral visibility.

W. Steven Barnett of Rutgers, a leading authority on early childhood, was the highest-scoring new entrant. He debuted in the top twenty, claiming spot #17. Marc Lamont Hill of Morehouse, Jeannie Oakes (who returned to UCLA from the Ford Foundation), and UPenn's Angela Duckworth were the other new names to debut in the top fifty.

UPenn's Shaun Harper, who chairs the university's Center for the Study of Race & Equity in Education, made the biggest single leap from last year, climbing 85 spots to #42. Other returnees making especially big jumps from 2014 included Sara Goldrick-Rab of U. Wisconsin, Laura Perna of UPenn, Andy Porter of UPenn, Amy Stuart Wells of Columbia, Jim Ryan of Harvard, and Malachy Bishop of U. Kentucky.

Stanford University and Harvard University both fared exceptionally well, with Stanford placing four scholars in the top 20 and Harvard placing three. New York University was the only other institution to place multiple scholars in the top 20.

In terms of the most scholars ranked, Stanford topped all others with 22. Harvard was second, with 18, and Columbia was third, with 14. Overall, more than 50 universities placed at least one scholar in the rankings.

A number of top scorers penned influential books of recent vintage. For instance, among the top ten, just in the past year, Yong Zhao released Who's Afraid of the Big Bad Dragon: Why China Has the Best (and Worst) Education System in the World; Andy Hargreaves coauthored Uplifting Leadership: How Organizations, Teams, and Communities Raise Performance; and David Berliner and Gene V. Glass published 50 Myths and Lies That Threaten America's Public Schools: The Real Crisis in Education.

As with any such ranking, this exercise ought to be interpreted with appropriate caveats and caution. Given that the ratings are a snapshot of where things stand as we start 2015, the results obviously favor scholars who penned a successful book or influential study in 2014. But that's how the world works. And that's why we do this every year.

A few scholars tended to lead the field in any given category. For those keeping score at home, here's a quick review of the category-killers:

  • More than two dozen veteran scholars maxed out on Google Scholar.
  • When it came to book points, Ravitch, Darling-Hammond, Gardner, Peterson, Cuban, Carol Tomlinson of U. Virginia, Nel Noddings of Stanford, and Martin Carnoy of Stanford each maxed out. Vanderbilt's Joseph Murphy scored the highest Amazon ranking at 19.7.
  • As far as attention in the education press, Ravitch led the pack. When it came to mentions in mainstream newspapers, Ravitch was joined by Darling-Hammond and Orfield.
  • In terms of web visibility, Ravitch, Darling-Hammond, Gardner, and Duckworth were at the top of the heap.
  • The wild-and-woolly world of social media, reflected in Klout scores, was dominated by Ravitch and Lamont Hill.

If readers want to argue the relevance, construction, reliability, or validity of the metrics, I'll be happy as a clam. I'm not sure that I've got the measures right or even how much these results can or should tell us. That said, I think the same can be said about U.S. News college rankings, NFL quarterback ratings, or international scorecards of human rights. For all their imperfections, I think such efforts convey real information—and help spark useful discussion.

That's what I've sought to do here. Meanwhile, I'd welcome suggestions for possible improvements and am eager to hear your critiques, concerns, questions, and suggestions. So, take a look, and have at it.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
