Statistical stand-off

Shabab Siddiqui

The University released a report Thursday evaluating efficiency and graduation rates, its first campus-wide, research-based rebuttal to criticisms of its productivity.

The report was authored by Marc Musick, a sociology professor and associate dean of the College of Liberal Arts. Musick also signed off on a report compiled by the liberal arts deans in July in response to the now-infamous “Seven Breakthrough Solutions” pushed by businessman and former UT professor Jeff Sandefer and the Texas Public Policy Foundation.

The report stacks UT against other public universities across the country using various efficiency measures. It asserts that while much improvement is needed, UT ranks near the top among public universities in many areas, such as six-year graduation rate (13th), percentage of students graduating per tuition and tax dollar (10th) and faculty employed per taxpayer and tuition dollar (2nd).

The report does a fair job of contextualizing the issues that have recently surrounded higher education. It presents UT’s first on-paper rebuttal after months of political rhetoric. It also puts forward the University’s primary argument: that despite its dismal 51-percent four-year graduation rate, UT graduates more first-year students than any other public research university.

However, as in other numerically decorated higher education reports from the last few controversy-filled months, the statistics are paint for a predetermined picture. In its seemingly arbitrary choice of efficiency measurements, the report cites six-year graduation rates, where UT compares comfortably at No. 13, rather than four-year rates, where the University sits at No. 21.

Another efficiency measure in the report adds the annual tuition and state contribution per student and divides the sum by the number of professors to calculate “professor efficiency in dollars per student tuition and state funds.” This measure does not account for factors such as the salaries of administrators, faculty, staff and students, nor for endowed chairs and departmental stipends. Yet UT ranks second in this largely unrevealing statistic.

Perhaps the most confusing aspect of the report is its claim of a causal relationship between the number of tenure and tenure-track professors on campus and median SAT scores. While there is little doubt about the tremendous value these professors can bring to a campus, the link between the sheer number or proportion of full professors and SAT scores is likely only a correlation. SAT scores, which are said to be predictive of a student’s graduation rate, have much more to do with admission standards. UT’s median SAT score of 1165, which is lower than most of its peers, is more likely attributable to the state’s guaranteed-admissions rule. If a high school senior is in the top 7 percent of his or her high school class and is set on attending UT, he or she does not have much of an incentive to do well on the SAT.

The report deftly uses UT’s size, combined with relatively low tuition and low state investment, to present high levels of efficiency. The irony is that the same economies-of-scale argument that serves the report well is sometimes used in the opposite context. President William Powers Jr. explained at a Faculty Council meeting in March how the University tries to avoid surveys and rankings that penalize large research institutions by dividing factors by the total number of students.

As the University carries out its he-crunched-she-crunched statistical battle against number-skewing detractors, the real victims are students. As Musick said in his report: “One of the most important student outcomes for any university is degree completion. Just as important, if not more so, is the need to provide an excellent educational experience that produces high-quality degrees and an academic foundation for their students’ lives.”

In the name of efficiency, let’s not forget education.