For a system designed to quantify our nation’s best and brightest, college rankings are pretty stupid.
Take the annual U.S. News &amp; World Report rankings of graduate schools. The magazine ranks programs in everything from political science to chemistry, providing a list of the top 100 programs in each subject. However, the <a href="http://www.usnews.com/education/best-graduate-schools/articles/2011/03/14/social-sciences-and-humanities-rankings-methodology-2012">methodology</a> behind these rankings is inane and flawed. The rankings rest on a single criterion: “peer assessment surveys sent to academics in each discipline.” Even within the same field, how is a professor in New England supposed to compare the sociology department at a university in Florida with one at a university in California, when he or she may have little firsthand knowledge of either?
The result is a perfect example of circular reasoning. Rankings are determined solely by academics’ perceptions of a program’s quality. Those rankings in turn shape how people perceive the program, and the reshaped perceptions feed into the next year’s rankings. Certain programs are thought to be stronger than others because, well, they’re ranked higher. And why are they ranked higher? You see where this is going.
This year the University chose not to participate in the rankings published annually by Times Higher Education, a British education magazine. Remarking on the decision, President William Powers Jr. told the Faculty Council <a href="http://www.dailytexanonline.com/content/ut-withdraws-status-education-survey-because-approach">last Monday</a>, “We’re okay if we are going to do poorly on academic rankings but if the methodology is designed against a big state research university we often won’t participate.”
The University’s issue with these rankings is that they rely on quantitative measures computed as per-student ratios, which systematically penalize large institutions: divide the same pool of resources by an enrollment of 50,000 rather than 5,000 and the ratio shrinks tenfold, regardless of how strong the institution is. The U.S. News rankings for undergraduate institutions likewise contain several inputs that are divided by the total number of students, such as per-student spending (15 percent of the formula) and alumni giving rate (5 percent of the formula).
These rankings are widely known to influence prospective students, and thus the university administrators seeking to lure those students to enroll. More surprising, though, is <a href="http://www-personal.umich.edu/~bastedo/papers/BowmanBastedo.ResHE2009.pdf">a 2009 study</a> by University of Michigan researchers Michael Bastedo and Nicholas Bowman, which found that college deans’ perceptions of institutional quality were also affected by the rankings and that those rankings “have a strong influence on internal decision making” within colleges.
In a recent blog post on the U.S. News &amp; World Report website, Robert Morse, the organization’s director of data research, addresses critics who claim that “this has drawn attention to the inadequacy of existing funding regimes while others have chosen to shift resources to areas that shape prestige, resulting in a negative effect on social equity.”
However, Morse <a href="http://www.usnews.com/education/blogs/college-rankings-blog/2011/03/24/are-college-rankers-becoming-too-powerful">counters that</a> “the bottom line [is] U.S. News is not running the colleges and does not play any role in making higher education policy at a state or national level.”
According to Bastedo and Bowman’s findings, Morse is dead wrong. If administrators are manipulating data and making internal changes solely to improve their rankings, U.S. News is, in fact, driving the educational policy of this country.
The U.S. News rankings do not measure the educational quality of an American university, nor do they describe how well a college serves its students. Such rankings can tell you how a school stacks up in terms of average SAT scores or graduation rates, but little more. Additionally, the current system gives college presidents and administrators absolutely no incentive to keep tuition costs reasonable. Last month, famed author Malcolm Gladwell published <a href="https://media.scoopreprintsource.com/CondeNast/5695CN_UnvChicagoLawSchool_061111.pdf">his own ranking</a> of American law schools that included cost as an input. Not surprisingly, Gladwell’s rankings looked drastically different from those put out by U.S. News.
The rising cost of college tuition and the growing importance of college rankings over the last 20 years may also help explain another skyrocketing cost: university administrator salaries, which rose <a href="http://www.goldwaterinstitute.org/article/4941">61 percent between 1993 and 2007</a>. A university administrator who can raise a school’s national ranking can justify a six-figure salary.
However, the true culprits behind the U.S. News snake oil scam are the people who buy it. Universities give college rankings clout because a school’s rank has a measurable effect on the number and quality of applicants. The same 2009 study by Bastedo and Bowman found that a university moving into the “Top 50” saw a 3.6-percent decrease in its acceptance rate, allowing it to be more selective. Thus, advancing in the rankings has a tangible effect on the quality of students applying to a school, even if the rankings don’t accurately reflect the education that school provides.
The University is not at all served by participating in rankings that do not fairly evaluate this institution. Perhaps next year Powers and the administration should consider boycotting any rankings that seek to quantify the University based on unfair or biased standards. If they’re going to fix the rules against you, there’s no point in showing up to play.
— Dave Player for the editorial board.