1/30/2009 Jill Colford Schoeniger ’86

College Rankings Unplugged

Some college administrators rush to their bookstore on the day U.S. News & World Report publishes its college-rankings issue. Others hustle past. But in either case, they know phones will ring with inquiries from alumni, parents, donors and the media about why their institution has jumped or dropped in the annual survey.

It's a day that John Burness '67 does not miss now that he has retired as senior vice president for public affairs and government relations at Duke University. For three decades at the University of Illinois at Urbana-Champaign, Cornell University and Duke, he spent considerable time analyzing and dissecting rankings, speaking with those responsible for them and explaining the results to his various constituencies.

His take on the whole system? In a word: cynical.

"It's a huge moneymaker for U.S. News & World Report," Burness says. "What they do every year or so is do some tweaking to the methodology. Arguably on their part it is to refine the rankings. But if the rankings remained exactly the same every year, people wouldn't buy the magazine. Why would they? So there is an incentive for the publisher of these different rankings to have different institutions surface each year."

Another way for college-ranking magazines such as U.S. News, Forbes and Money to shake things up—and sell more issues—is to create new categories. U.S. News, for example, created two new rankings for this year's issue: up-and-coming colleges and institutions favored by high-school counselors.

But in fairness to the folks at U.S. News, which published the first annual rankings in 1983, Burness believes they had no idea the rankings system would spawn a cottage industry and change the way people look at colleges. "Some of the people I have talked to at U.S. News & World Report are a little embarrassed that the rankings have taken on a life of their own," says Burness, now a visiting professor of the practice of public policy at Duke.

In a Sept. 5 essay in the Chronicle of Higher Education titled "The Rankings Game: Who's Playing Whom," Burness further explained: "Ours is a competitive culture, and it should be no surprise that many people are interested in such external assessments of the quality of American higher education. After all, students and families spend as much as $50,000 a year to go to college, and it is reasonable for them to want a credible, independent assessment to help guide their thinking."

The biggest complaint colleges and universities have about such rankings is that no system can truly capture the essence or uniqueness of an institution.

The increasing interest in college rankings by parents, prospective students and guidance counselors reflects a culture that seems obsessed with lists: from which CEOs make the most money to which college football teams had the best season.

"We live in a measurement society," Burness says. But he is adamant in his belief that numbers simply do not tell the story when it comes to ranking a college.

Part of his reasoning is that the numbers themselves are not as solid as the publishers would like you to believe. "The fact that you go up a few places or down a few places doesn't really matter. It really doesn't. I've seen that at enough institutions to know," he says. "I find the precision they purport to have is rather silly. Are these roughly the top institutions? Yes, but to do it in the way they do to say this institution is No. 1 and this one is No. 5 is really disingenuous."

He points to the California Institute of Technology to prove his case. "One of my favorite examples is the year that Caltech jumped from eighth place to first in one year," he explains. "You could talk to anyone at Caltech, and, of course, they didn't know what they had done differently in that year than in any other year. They knew they were fortunate to be named No. 1, but they also knew that as soon as the methodology changed, they would probably drop back in the pack of the top 10, which is what happened shortly thereafter."

The Caltech example leads Burness to another of his pet peeves, which is the fact that the rankings come out annually. "The one-year nature of the surveys almost implies that any innovation that might occur within an institution will automatically manifest itself within 12 months. It's just not so."

He points to Franklin & Marshall, where he serves as a member of the Board of Trustees, to make this point. "Take our College House System. By any measure I have seen, it is perhaps the single biggest factor that has changed in recent years on campus. It has changed the way students and faculty look at education, and it has been a significant add-on to the quality of the intellectual life of students," he says. "But you are not going to capture that change in one year or two years or even three years. It has to be around for a little while."

While the inconsistency in the methodology and the yearly nature of the surveys rankle Burness, he is most concerned, as are many of his higher-education peers, that the numbers cannot capture the intangibles of the college experience that are far more important to students.

"I really make the point whenever I talk to students that these rankings are nice and they are useful to take a look at. They give you some sense in a perceptual way of where an institution might be, but don't rely on them," he says. "Go visit these schools, sit in on classes and get a real feel for the place. Ultimately you'll find the fit that makes the most sense for you. I think that is the right advice."

It's the same advice he gave to his own children when they were looking at colleges, pointing out the institutions that are making changes for all the right reasons and not simply trying to play the ratings system to enhance their stature.

"The institutions I really respect are the ones that are driven by their own view of what their mission is and how they best can accomplish it," he says. "I look at Franklin & Marshall and see the kinds of investments it has made to improve the quality of the education it offers to students because it is trying to give students a better education. It is not because it is looking at national rankings."

Ultimately Burness believes there's one simple way to determine whether a college or university is doing its job: talk to its students.

"I recall a wonderful conversation I had at a dinner the night before F&M's Commencement," Burness concludes. "I was talking to four or five students who were about to graduate. I asked them, 'What did you learn about yourself in your years at F&M?' One of the students thought for about a split second, and he said, 'I learned that there is probably nothing that I put my mind to that I can't accomplish.' As a parent or even a trustee, that is exactly what you'd want to hear about a college or university."