US News rankings: Solid math or empty statistics?

By Charlie Gasner

The Crimson White
University of Alabama

TUSCALOOSA, Ala. (U-WIRE) – At a back-to-school press conference in August, interim University of Alabama President Barry Mason dismissed the importance of the newly released Princeton Review college rankings, which piqued local interest by ranking the University as the nation’s third-best party school.

Mason said he’d like the University’s focus to be on academic rankings “that have meaning,” not student surveys with “flawed methodology” like Princeton Review’s.

Last week, Mason said he was “pleased to announce that for the second consecutive year, the University of Alabama has been named among the nation’s top 50 public universities by US News & World Report.”

The University tied with Ohio University, the University of Massachusetts, the University of New Hampshire and the University of Vermont for 45th place among public colleges, up from last year’s rank of 48. For the second consecutive year, US News placed the University in its second of four tiers of national research universities with significant doctoral programs. Most of the Southeastern Conference’s member schools, including Florida, Georgia, Kentucky, Tennessee and Auburn, fall into this category.

The US News rankings are a source of pride for many faculty members and administrators, who see them as confirmation of a job well done. Margaret Garner, a professor in the department of family medicine, said she thinks the University’s recent move up the US News rankings — it was listed as a fourth-tier school as recently as the mid-1990s — shows that people around the country are finally noticing the good job the University has been doing for a long time.

Cathy Andreen, the University’s director of media relations, said she was extremely pleased that the University was ranked among the top 50 public schools and that the rankings were “a wonderful reflection on our faculty and students’ dedication.”

“It reflects the hard work our faculty and students put into academic endeavors,” Andreen said.

But are the rankings as statistically sound as US News claims? It depends on whom you ask.


US News started publishing college rankings in 1983, and its initial methodology was simply to ask university presidents to rank the top five colleges in each category (national doctoral, liberal arts, etc.). In 1988, according to a University of Chicago Magazine report, US News brought in a statistician to develop a more scientific formula by which colleges could be ranked — but fired the statistician after her formula placed a small seminary at the top of the list. US News hired a second statistician, who came up with a methodology that placed Yale University on top.

The editor responsible for the issue, Mel Elfin, was quoted in the Washington Monthly as unapologetic about his determination to get the Ivy League schools on top of his rankings.

“When you’re picking the most valuable player in baseball and a utility player hitting .220 comes up as the MVP, it’s not right,” he said.

Since then, Harvard, Yale or Princeton has occupied the top spot every year but one — 1999, when the California Institute of Technology came out on top. After that issue, US News recalculated its formula to de-emphasize per-student spending, an area in which Caltech enjoyed a significant advantage, and the three schools resumed their uninterrupted run on top. This year’s edition lists Princeton first, Harvard second and Yale third.


As the coordinator of the University’s Capstone Poll, which conducts local surveys and offers survey consulting services, Michael Conaway is used to dealing with survey data. But Conaway said he was frustrated by the lack of explanation US News provided along with its data.

“What’s more notable about their Web site and report is what they don’t say,” he said.

Conaway said neither US News magazine nor its Web site seemed to explain how it arrived at the current ranking system, or why certain categories were given greater weight than others.

“I did not find any explanation on the Web site as to the reasons they included certain components and not other possible ones, or why they weighted the selected components the way they did,” he said.

A quarter of the ranking was based on peer assessment, but Conaway said he wondered why the assessment was restricted to higher-level administrators.

“Why would you only talk to presidents, provosts and deans of admission?” he said. “Why not, say, department heads?”

Another component Conaway questioned was the measure of “graduation rate,” which accounts for 20 percent of US News’ ranking.

“What does your ‘graduation rate’ tell you about the quality of your graduates?” he said. “It tells you that they graduated.”

As an alternative measure, Conaway suggested counting the percentage of new academic hires from each school’s recent graduating class to give some idea of whether a school is producing quality graduates.

But Conaway acknowledged that coming up with a perfect ranking system is a nearly impossible task.

“I don’t mean to demean their effort. I think they’ve made a great effort,” he said. “And they do tell you that the ranking is just a guide — but who’s going to remember that caveat? How could they possibly pass up the drama attendant to announcing the number one school?

“But sometimes, some number actually isn’t better than no number. You don’t know what this composite score means, because they won’t tell you.”


In 1997, US News commissioned the National Opinion Research Center to write a report on the college ranking methodology. The report’s first criticism was that the US News system fails to acknowledge that “non-traditional students” — students, often found on urban campuses, who take classes for personal enrichment with no intention of obtaining a degree — can skew the rankings.

“We note, however, that the US News ratings are developed for traditional students entering college shortly after high school; that is, students 18 to 24 who attend full time and may have applied to and chosen among several institutions,” the report said.

“Thus, this discussion is about criteria for such traditional students. We believe it is impossible to rate institutions with the same set of indicators for both traditional and non-traditional students. As the proportion of non-traditional students attending higher education institutions grows, US News might want to consider developing a separate rating system and publishing a separate guide for nontraditional students.”

At a UA System Board of Trustees meeting last year, Vice Chancellor of Academic Affairs Charles Nash presented graduation statistics showing the University’s rates to be far above its sister institutions. Sixty-one percent of the University’s 1994 entering class graduated in six years or less, compared to 39 percent at the University of Alabama in Huntsville and 34 percent at UAB.

Nash pointed to non-traditional students as the primary reason for this discrepancy, saying the statistics should not be viewed as a sign of comparative lack of quality at the system’s urban campuses. But US News’ formula regards a non-traditional student who takes classes for a year at a commuter campus as equivalent to a degree-seeking student who flunks out or transfers.

According to the latest US News ranking, UAH’s 25th percentile of incoming freshmen scores a 21 on the ACT and its 75th percentile scores a 27, as opposed to 21 and 26 at the University. UAH boasts a slightly higher percentage of classes with fewer than 20 students enrolled (45 percent to 44 percent) and has almost half as many “lecture-hall” classes with more than 50 students enrolled (7 percent of all classes to 13 percent). Its freshmen are slightly more likely, 28 percent to 27 percent, to come from the top 10 percent of their high school classes.

But UAH ranks in US News’ third tier, while the University sits in the second.


Despite its flaws, Conaway said, the US News rankings can be valuable when the dramatic aspect is removed.

“People who want to use these should look at the component scores,” he said. “The biggest problem is that they’ve tried to come up with a formula to measure academic quality. There seems to be some false precision going on here, something akin to inferring how much space angels need to dance and then rigorously measuring the number of dance-spaces on the head of a pin.”

Students who know what they want to do before going to college, Conaway said, should ignore the rankings entirely.

“They should look at department information, not universities as a whole,” he said. “If I want to get a job doing chemical engineering, I would want to know what percentage of graduates from that program got jobs doing chemical engineering and the average starting salary. This seems simple compared to the U.S. News list, but it’s not clear even here that the measurement we are likely to get actually reflects better academic quality.”

As for that Princeton Review party-school rank, in Conaway’s mind the US News rankings still do a better job “in terms of quality control.” Princeton Review’s data, Conaway said, would seem to be much more easily manipulated than US News’.

But Conaway asked, “Does US News really measure the things it purports to measure?” He said he thinks the component statistics are fine standing alone, but he’s skeptical about trying to tie them into an all-encompassing formula.

“There’s no place in the world where you can find a book where all universities are represented in terms of academic quality,” he said.