Remember when the Mafia fixed the World Series? That scandal so shocked America that it showed up in The Great Gatsby and gave us the heart-rending story of a little boy pleading with “Shoeless” Joe Jackson (one of the cheaters) to “say it ain’t so, Joe.” And of course, we Dodgeball fans have barely gotten over Lance Armstrong’s lies. Now we learn that another beloved American competition has been fixed: the U.S. News & World Report college rankings.
Yes, it turns out that some college administrators have been feeding U.S. News doctored numbers to nudge up their schools on that hallowed list.
Let’s get two things straight: We all love rankings, top ten lists, and contests. And we all know that most of them are kind of stupid. There are some contests where coming in second is a lot worse than coming in first—like presidential elections, and World Wars. (It was small consolation to the Kaiser in 1918 that he had almost defeated England.) But what exactly does it mean when Entertainment Weekly ranks Scarlett Johansson as “hotter” than Megan Fox? Exactly as much as it means when U.S. News rejiggers its annual lineup of colleges to place Princeton ahead of Harvard: nothing, not one thing at all.
And that was the case well before we learned that colleges and universities were inflating their statistics to try to get a higher ranking in the supposedly authoritative U.S. News rankings. The truth is that U.S. News—a news magazine that went bankrupt and stopped reporting news—now makes a huge annual business out of ranking American colleges based on criteria that are totally arbitrary.
Look at the “formula” U.S. News uses to rank colleges:
“Undergraduate academic reputation.” Some 22.5 percent of each school’s ranking is based on . . . surveys emailed to administrators at other schools, inquiring about “intangibles such as faculty dedication to teaching.” So to find out how dedicated the teachers are at Swarthmore, they ask the provost of Stanford.
"Freshman retention and graduation rate." For U.S. News, “The higher the proportion of freshmen who return to campus the following year and eventually graduate, the better a school is apt to be at offering the classes and services students need to succeed.” Or it might mean that a school is timid about what kind of students it admits, and that easy grading makes it very hard to fail. (I went to Yale—just try flunking out of that place. They will do everything short of sending tutors to your dorm room.)
“Faculty resources,” which smooshes together numbers like average class size, faculty qualifications, the use of adjunct and part-time teachers—and even faculty salaries—but leaves out how many classes are taught by grad students with halting English.
“Student selectivity.” This doesn’t measure how smart the students are, but how prestigious the school already is. So a famous school that’s the automatic first choice of valedictorians will far outscore another college whose students have equal SAT scores—simply because it’s more popular among guidance counselors.
“Financial resources.” So schools that charge very high tuition, then plow it back into indoor rock-climbing facilities, will outscore cheap schools that spend their money on books.
“Graduation rate performance.” This is a weird one, where U.S. News looks at how well it bet, in a previous edition, that the school would do at improving its graduation rate—and sees if the school beat the point spread. Huh?
“Undergraduate academic reputation” again—this time based on . . . surveys of high school guidance counselors. So people who have been out of college for many years, who now work in high schools, are providing the “news” about what’s happening in universities? How would they know?
“Alumni giving.” The percentage of alumni who give money to the school. This makes some sense, since it reflects how happy graduates are with what they paid for. A pity it’s the last, least important criterion.
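Strip away the prose and the whole methodology is just a weighted sum of these scores. A minimal sketch of that arithmetic is below—only the 22.5 percent reputation weight comes from the article; every other weight and category name here is an illustrative placeholder, not U.S. News’s actual formula:

```python
# Sketch of a weighted-sum ranking score.
# Only the 22.5% reputation weight is from the article;
# the other weights are made-up placeholders that sum to 1.0.
WEIGHTS = {
    "reputation": 0.225,            # peer/counselor surveys (from the article)
    "retention_graduation": 0.20,   # placeholder
    "faculty_resources": 0.20,      # placeholder
    "selectivity": 0.125,           # placeholder
    "financial_resources": 0.10,    # placeholder
    "grad_rate_performance": 0.10,  # placeholder
    "alumni_giving": 0.05,          # placeholder
}

def composite_score(metrics: dict) -> float:
    """Weighted sum of per-category scores on a 0-100 scale."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# A school scoring 80 in every category gets a composite of 80.0,
# since the weights sum to 1.0.
school = {k: 80.0 for k in WEIGHTS}
print(round(composite_score(school), 1))  # 80.0
```

The point the sketch makes is the column’s own: the output is only as meaningful as the inputs and the arbitrary weights, so a school that games any one category moves the final number.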
What standards should we be using for judging colleges? I can suggest a few: How solid are the school’s “core” or “general education” requirements for every graduate? How rigorous are the course requirements for individual majors—for instance, must English majors study Shakespeare, and history majors the U.S. founding? How much intellectual freedom is there in the classroom? Is political speech free on campus? How sane is the dorm life? How safe are students on campus? These are just a few of the questions that college rankings pros seem never to ask. And yet they are the ones that really matter. Why not ask current students and faculty, confidentially, to rate their institutions, and report what they actually say? Their answers will be fudge-proof, free of administrators’ tinkering and—as I’ve often found—sobering.