Once upon a time, rankings were limited to the sports pages, where avid fans knew them by heart. That changed when U.S. News & World Report made a name for itself by applying the practice to colleges and universities. High schools soon followed; today at least seven national and local lists exist.
It's only natural to wonder at some point how one school stacks up against others. But I urge caution in jumping to conclusions based on the results, because many factors determine the position of a particular school. As Michael Winerip explained: "Anybody can make up any formula to measure anything, which gives lots of places a chance to be best at something" ("In Lists of Best High Schools, Numbers Don't Tell the Whole Story," The New York Times, Jun. 4). In other words, rankings can be misleading. They don't always mean what they seem to.
One of the serious shortcomings of published rankings is that they don't take into account non-cognitive outcomes. If one of the goals of education is to create lifelong learners, then it's important to determine student attitudes toward the subject matter they've been taught. As I've written before, it's altogether possible to teach a subject well but to teach students to hate the subject in the process. When that happens, it's a Pyrrhic victory. Yet that unintended consequence is not reflected in any list of best high schools in the U.S.
Another problem with rankings is that a tiny change in one of the factors used can produce a dramatic shift in overall position, either up or down. Publishers of lists typically include an asterisk to indicate any such anomaly. But how many readers remember anything except the ranking itself? As Gerald Bracey noted: "Ranks by their very nature can make small differences in scores seem big" (Setting the Record Straight, Heinemann, 2004). His point is that schools often bunch up, and any attempt to spread them out is almost always artificial and arbitrary.
Finally, rankings of schools are tightly correlated with ZIP codes. It's no surprise, therefore, that schools in the suburbs post better academic outcomes than schools in the inner cities. This has been the case for so long that I wonder why so much attention is paid to the various lists that are periodically published. The other obvious point is that schools with selective admissions policies shine. For example, 37 of the top 50 schools in Newsweek's index require some combination of entrance exam scores, grade-point average, state test results and an assessment of writing samples.
So let's treat rankings of schools with a healthy dose of skepticism. Being at or near the top of such lists confers bragging rights, but it's a mistake to believe that much more than that is warranted.