The Fannie and Freddie University

by Victor Davis Hanson

PJ Media

It’s More than Just PC

The traditionalist critique of the university — I made it myself over thirteen years ago in the co-authored Who Killed Homer? — was that somewhere around the time of the Vietnam War, higher education changed radically for the worse. Note I am talking mostly about the liberal arts. America remains preeminent in math, physics, the hard sciences, medicine, and engineering, subjects that are largely immune to politicization and race, class, and gender relativism. The top students, and often the hardest-working, gravitate to these fields; indeed, in my general education courses on the ancient world, I often noticed that math and science students did far better than their sociology or anthropology counterparts.

Such excellence in math and science explains why the world’s top-rated universities in all the most recent rankings are overwhelmingly American. (Indeed, liberal arts professors piggyback on such findings and often, in a sense quite fraudulently, point to these polls as if to confirm their own superiority.)

I have spent a great deal of my life in the university, as a student and professor and now as a researcher. Higher learning in the arts and humanities has enriched American life for 200 years. Small liberal arts colleges like Hillsdale, St. John’s, St. Thomas Aquinas, and dozens of others continue to be models of enlightened learning. But all that said, public universities and the larger private institutions have increasingly become morally and fiscally bankrupt. Here are some reasons why.

Monotony of Thought

By 2011 we all know that faculties are overwhelmingly liberal. That in and of itself would not be so alarming if they were not activist as well. By that I mean academics are not just interested in identifying supposed past American sins, but also in turning disinterested instruction into political advocacy, especially along race, class, and gender lines. Rosie the Riveter, the Japanese internment, and Hiroshima all deserve study, but they are not the sum total of World War II. Today’s average undergraduates may know that African-Americans were not integrated into American units during World War II, but they have no clue what the Battle of the Bulge, a B-29, or Iwo Jima were. They may insist that global warming is real and man-caused, but would have trouble explaining what exactly carbon is.

The effect of politicized learning on the quality of education was unfortunate, and cyclical in a strange way. The more “–studies” classes saturated the curriculum, the less time there was for classical approaches to literature, philosophy, language, or history. The more the profile of the student body mattered relative to its preparation, the more these classes had to be watered down, as if thinking the right thoughts could justify the absence of the old rigor.

Deans began quoting the ethnic profiles of the incoming classes, the supposed expanded diversity of the faculty, and their own commitment to various progressive causes, while keeping absolutely mum about the average GPAs and SAT scores of the new student body or the content of the new curriculum. And why not? No provost was ever fired for having fewer students graduate with fewer skills; many were for not “reaching out” to “underrepresented” groups.

A Blank Check

We know all the other pathologies of the modern university. Tenure metamorphosed from the protection of unpopular expression in the classroom into the ossification of thought and the proliferation of the mediocre. Faculty senate votes did not reflect a raucous diversity of thought among secure professors, but were analogous to Saddam’s old plebiscites in their one-sided voting. Tenure created the notion of a select cloister, immune from the tawdry pursuit of money and the neurotic worry over job security so true on the crass “outside.”

Campus ethics and values were warped by specialization in both faculty instruction and publication. The grandee who butchered a graduate class every semester was deemed more valuable to the university than the dynamic lecturer who enthused and enlightened three undergraduate introductory classes each term — on the dubious proposition that the former serially “published” peer-reviewed expansions of his dissertation in journals that at most five or ten fellow academics read.

Not teaching at all was even preferable to teaching very little, as a priestly class of administrators evaded the “burdens” of instruction. The new bureaucrats were often given catchy titles: “Assistant to the Provost for Diversity,” or “Associate Dean for Cultural Studies,” or the mundane “Special Assistant to the President for Internal Affairs,” in the manner of late Soviet apparatchiks or the power flow charts of the more mediocre corporations. Although the faculty was overwhelmingly liberal, it was also cynical, and understood that the avalanche of self-serving daily memos it received from the nomenklatura need not be read. I used to see entire trash cans filled each morning with reams of xeroxed pages, as professors started off their days by nonchalantly dumping the contents of their mail slots. Most of the memos read just like those “letters” congressmen send to their constituents, listing a dean’s or vice-provost’s res gestae and detailing how they were “working for you.”

Lala Land

Self-invention proliferated. Under the system of “faculty governance” (analogous to carpenters assuming the roles of the contractor and architect), curriculum, hiring, promotion, and firing were managed by peers. An article “in progress” or “under review” was passed off by committees as good as published. (And why not? You, in hand-washes-hand fashion, might be on the other end of a faculty committee and need the same life raft someday.) Linda Wilson-Lopez, a third-generation, one-quarter Mexican-American, was deemed as much a victim as if she had just crossed the Rio Grande. Old white guys in their sixties, who were often hired sight unseen in the early 1970s, suddenly demanded diversity hires — with the assumption that when the music stopped in the 1980s they had already found chairs and the new discrimination did not apply to the already tenured. (Had affirmative action involved replacing sixty-something, full-professor white males, it would have had a very different reception.) Proposals for envisioned research on sabbaticals were as common as post-sabbatical reports of actual work were rare.

Careers were destroyed by charges of “racism,” “sexism,” or “homophobia,” but rarely by actually smearing a Mormon in class or by skipping a week of instruction to junket at a conference. All of the above is well known, as hundreds of exposés in the last thirty years have explained quite well why college graduates are both so politicized and so lacking in knowledge and the inductive method. We see them screaming in videos at Occupy Wall Street demonstrations — full of self-pity, it is true, but also in a sense worthy of pity. Nothing is worse than to be broke, unemployed, and conned.

Money is the Game Changer

There is a new element in the equation. Debt. Almost every year, tuition climbed at a rate higher than inflation. It had to. Higher-paid faculty taught fewer classes. “Centers,” run by professors who did not teach and full of new staff, addressed everything from declining literacy to supposedly illiberal epidemics of meanness. Somewhere around 1980, the university was no longer a place to learn, but a sort of surrogate parent, eagerly taking on the responsibility of ensuring that students were happy, fit, right-thinking, and committed. That required everything from state-of-the-art gyms replete with climbing walls, to grief counselors, to lecture series and symposia on global warming and the West Bank. All of that was costly.

To pay for it, the federal government guaranteed student loans and the universities charged what they wished — with the hook that the interest need not be paid until after graduation. For an 18-year-old, taking on debt was easy, and paying it back something to be dealt with in the distant future — especially when the university promised higher-paying jobs and faculty reminded college students that their newly acquired correct thinking was in itself worth the cost of education. There was little competition. Trade schools were still looked down upon, and online instruction was in its infancy.

The result, as we now know, was a huge debt bubble, one of nearly $1 trillion in aggregate borrowing that rivaled the Freddie and Fannie frauds. And yet the debt no longer comes with guarantees that the liberal arts or social science graduate will find employment, either of the sort he was trained for, or necessarily more remunerative than that of the federal clerk or the union tile setter. Starbucks from 7 to 7 each day will not pay off that Environmental Studies degree from UC Irvine.

As the economy cooled, cash-strapped parents increasingly had little money to ease the mounting burdens. What was once a rare $10,000 student loan became a commonplace $50,000 or more in debt. Living at home until one’s late twenties is in part attributable to the mounting cost of college and the accompanying dismal job market — and to the admission that many college degrees are no proof of reading, writing, or thinking skills. (Note as well that the themes and ethos of the university were not “life is short, get on with it,” but rather population control, abortion, careerism, metrosexism, etc., which contributed to the notion that one’s 20s and even 30s were for fun and exploring alternatives, but most certainly not for marrying, having children, getting a job, buying a house, and running the rat race.)

I noticed around 1990 that some students in my classes at CSU were clearly illiterate and yet beneficiaries of lots of federal cash, loans, and university support to ensure their graduation. And when one had to flunk them, an entire apparatus was in place at the university to see that they in fact did not flunk. Just as coaches steered jocks to the right courses, so too did counselors steer those poorly prepared but on fat federal grants and loans. By the millennium, faculty were conscious that the university was a sort of farm and the students the paying crop that had to be cultivated if it were to make it all the way to harvest and sale — and thus pay for the farmers’ livelihood.

How could a Ponzi scheme of such magnitude go on this long?

Lots of reasons. The university was deeply imbued with a faux-morality and a supposed disdain for lucre. “College” or “university” was sort of like “green” — an ethical veneer for almost anything imaginable, without audit or examination (whether a Joe Paterno-like exemption, something akin to Climategate, or the local CSU campus where the student body president recently boasted that he was an illegal alien and dared authorities to act — to near-unanimous support from the university). Since World War II, a college degree has rightly been seen as the key to middle-class upward mobility. That belief was enshrined, and so we forgot to ask whether everyone was suited for college, or whether the college-educated per se were always more important to the economy than the self-, union-, or trade-schooled welder, concrete finisher, or electrician.

If Only They Were as Fair as Wal-Mart …

The “part-timer” or “adjunct faculty” now became a sort of Messenian helot, squaring the circle of universities lacking the resources to meet their pretensions. With dozens of PhD applicants for each liberal arts or social science tenure-track job (graduate schools likewise turned out far more doctorates than were needed, given their own desire for prestige and the smaller load of graduate instruction), universities found plenty of cheap labor. When the full professor retired, his courses could be outsourced to itinerant part-time lecturers for thousands of dollars less per class in salary and benefits. That the faculty hated Wal-Mart and yet treated its campus employees far worse than did the retailing bogeyman was assumed, but never acknowledged. In some sense, those hired in the 1960s and 1970s before the “fall,” like senior California public employees now ready to retire, were the proverbial rat in the snake’s belly that had to make its way out, with the understanding that never again would anything like it make its way in.

And So?

But what cannot go on will not go on — at least for most universities without billion-dollar-plus endowments. The present reckoning is brought on not by introspection, self-critique, or concern for our increasingly poorly educated students, but by money, or rather the lack of it. Higher education is desperately searching for students with cash, loaned or not. And it is, of necessity, panicking, and will ever so slowly start changing. For-profit tech schools, online instruction, and the two-year junior college deliver a cheaper “product,” and no longer necessarily an inferior one, given the nature of the contemporary university curriculum and the values of the faculty.

It used to be that one did not dare go to a DeVry or Phoenix for-profit school for computer certification or accounting, because one would miss out on the rich undergraduate experience, both social and intellectual — best exemplified by the core curriculum of some 50-60 units in the liberal arts and sciences. But if the university serially subsidizes panels about global warming, lauds Palestinian activists, and runs workshops on homophobia (all without balance and counter-opinion), and if its required GE courses, whether so titled or not, are too often little more than the melodramatic obsessions of over-specialized, ranting professors who otherwise would have small audiences, then why spend the money and go through the charade of classically liberal instruction, especially given that the trade school is cheaper and more honestly pragmatic?

Much that was good will fall along with more that was bad. But it is a comeuppance long overdue. With hubris comes nemesis — leading to atê, or ruin.

©2011 Victor Davis Hanson
