Like many others, I’ve begun to worry about the fate of higher education in American society. Having spent most of my professional life in academia, my instinct is to regard the university system as sacred – as Wisdom’s Workshop, to borrow the historian James Axtell’s recent book title. Liberal democracy relies fundamentally on a well-educated citizenry. And modern civilization more generally relies on a significant number of us possessing hard-earned historical perspective on what is true and what is good, and hard-earned scientific perspective on the full reach of human potential. For 800 years, universities have represented one of the most important institutions for fertilizing and cultivating that perspective, and even protecting it against humanity’s tendency toward self-destruction through war, tribalism, and superstition. Despite how radically our society has changed over that span of time, universities have remained critical in supporting our ability to know ourselves and to know the world better and better, and to steward that knowledge from one generation to the next. That is what makes the university system sacred. As modern civilization’s primary workshops of wisdom, universities are designed to safeguard the primary sources of human progress. Any threat to the university system should worry us. Today, there appear to be multiple threats, and the most frustrating thing is that they seem to be mostly self-imposed.
Hundreds of universities across America appear to be systematically rendering themselves defunct by condoning appalling levels of bureaucratic bloat and illiberal thinking. A political rallying cry of late has centered on the unsustainably escalating cost of college tuition and student debt. As Forbes put it in a 2018 feature story, the cost of college education in America has simply skyrocketed. The average American college tuition price has doubled since the late 1980s, rising 8 times faster than average American wages. Student loans now make up the largest share of non-housing debt in the United States, more than both credit cards and car loans. Student loan debt currently sits at over $1.6 trillion. And what exactly do our young people owe so much money for? In many cases, of course, they owe money for having received a legitimately higher education that will liberate their professional growth. But undeniably, a huge chunk of all that money is owed to bureaucrats who have absolutely nothing to do with education, nothing to do with the 800-year-old legacy of modern knowledge.
Take the University of Michigan, for example, which is one of the most popular and prestigious state universities in America, attracting tens of thousands of applicants each year from across the country. The average cost to attend the University of Michigan for out-of-state students is now over $50,000 per year. Where is all that money going? Well, according to recent audits, at least $11 million of it is going to an army of administrative bureaucrats each year. The university provost’s office, for example, has no fewer than 39 senior staff and an additional 26 junior staff on its payroll. The university’s College of Literature, Science, and the Arts houses no fewer than 14 administrative offices that are completely separate and independent of the college’s academic departments, where all of the actual learning takes place. How much do all of these non-academic bureaucrats get paid, and what exactly do they do? Well, again according to recent audits, the University of Michigan’s highest paid administrator who has nothing to do with teaching or research is the Vice Provost for Equity and Inclusion & Chief Diversity Officer. He’s paid just under $400,000 per year. Then there’s the Associate Dean at the Office for Health Equity and Inclusion who’s paid $220,000 per year. There’s another administrator in that same Office of Equity and Inclusion with the job title of Faculty Lead. He’s paid over $218,000 per year. There’s a so-called Special Consultant for Communications and Engagement for Diversity, Equity, and Inclusion Strategic Plan who’s paid over $164,000 per year. There are no fewer than three Assistant Vice Provosts for Equity, Inclusion, and Academic Affairs, each earning $134,000 per year. Even the Business Administrator in the Office of Diversity, Equity, and Inclusion rakes in $134,000 per year. The School of Nursing has its own Diversity and Inclusion Specialist, who’s paid $123,000 per year.
The list goes on and on like this, costing taxpayers and students at the University of Michigan and countless other universities across the country millions and millions of dollars every year. Is this money well spent?
To put these salaries into perspective, the average assistant professor at the University of Michigan, who would normally be responsible for teaching multiple courses per year as well as maintaining robust scholarly research, earns just $65,000 per year. Assistant professors are like the infantry in the university system’s army of educators. They are the gritty, ambitious, up-and-coming teachers and scholars who have arguably the greatest aggregate impact on college student instruction. Is a so-called Chief Diversity Officer really worth six assistant professors? How about fully tenured professors—the generals of the university system’s educating force, the scholars who have often spent decades contributing new knowledge to their fields while teaching upwards of thousands of students along the way? On average, full professors earn an impressive $148,000 per year at the University of Michigan—about as much as one of Michigan’s Associate Deans of Diversity, Equity, and Inclusion—a bit less than the university’s Director of the Office of Multicultural Initiatives.
Beyond bureaucratic bloat, illiberal thinking is seeping into college curriculums at an alarming rate. A 2019 Gallup survey of over 3,000 full-time U.S. college students revealed that nearly 20% believed that colleges should be forced to censor speech that someone might find offensive, and nearly 80% were in favor of establishing safe spaces on campus designed to censor potentially upsetting ideas. Moreover, students in hundreds of American college and university programs are being taught to racialize practically everything, from history and math to space exploration. In many cases, the emphasis of liberal arts education is shifting catastrophically from focusing on what can unite humanity to focusing on what divides us, as more and more professors dwell on what Westerners have done wrong—who we’ve oppressed, what we’ve destroyed—at the expense of teaching students about what the West has gotten right—who we’ve liberated, what we’ve discovered. The very term “Western Civilization” has become taboo. Shakespeare is being cancelled. Every conceivable act of injustice is being excavated from Western history by Western universities in a masochistic frenzy of self-flagellation, even while present-day atrocities elsewhere in the world like the Uyghur genocide in China, the persistence of slavery across Africa and South Asia, and the brutalization of women and LGBTQ individuals in conservative Islamic and Christian communities in the Middle East are obfuscated or even justified on the basis of cultural relativism. Of course it’s important for any society to be self-critical, but the extreme that universities are now taking this to is not wisdom, and it’s certainly not conducive to a more harmonious, self-confident, cosmopolitan society. Instead, it’s dragging us back into the more cynical, suspicious, retributive, zero-sum kind of thinking that oppressed our ancestors for thousands of years—the exact opposite of what universities are supposed to do.
And we’re paying more for this nonsense than ever before.
So… yeah, I’m worried about the fate of our university system. Still, after 800 years, universities nonetheless remain one of our best investments in progress. Despite the worrying trends I’ve highlighted, I think it’s safe to say that the majority of university faculty and administrators and students across America are still in the business of expanding human knowledge for the better. Many of our civilization’s best philosophers and authors and scientists and artists continue to thrive on the historically unique patronage and political protection that universities provide. And, certainly, the longevity of the university system makes present trends, however worrying they may be, seem more ephemeral. As the historian James Axtell puts it in his book, Wisdom’s Workshop, published in 2016, “The university is now more than eight centuries old, but… instead of aging, it has in the last century gained new vigor and proliferated…around the globe. On that basis alone, the latest alarms and hand-wringing over its prospects seem unwarranted.” As a professor at the College of William and Mary with an insider’s perspective, James Axtell’s historical optimism does help give me hope. As his book shows, the university is not only one of the most important institutions of the modern world, it is also one of the most resilient.
Universities as we know them emerged in medieval Europe during the 1100s, when increasing political stability, agricultural productivity, and booming commerce enabled the Catholic Church to invest in a more dynamic educational system that could incorporate the West’s rediscovery of ancient wisdom. As James Axtell explains, existing education dating from the Dark Ages was limited largely to religious observance and Biblical transcription, and needed to be expanded to integrate “the influx of new Greco-Roman and Arabic learning—in philosophy, mathematics, science, medicine, and law—that arrived after 1100…chiefly via Arab scholars and translators in Spain.”
The oldest European universities have murky beginnings despite whatever their boosters may advertise, but Bologna University was certainly up and running in the 1100s, while Paris, Cambridge, and Oxford were granting degrees by the early 1200s. Trying to find the campus of even the most prestigious university seven or eight hundred years ago would have been practically impossible, because university campuses didn’t really exist back then. Aside from monasteries, medieval institutions of higher learning were simply distributed networks of teachers and students who worked out of rented quarters around town. There were no central buildings, no clock towers, no squares, no signs, certainly no gymnasiums. In crowded medieval cities, new students would simply seek out a particular professor at his home or workshop or rented flat. And, for their part, when professors were called on by new students, they tended to engage in a very informal vetting process before deciding whether or not they would help find the student local accommodations. As Axtell writes, professors might “probe the lad’s academic qualifications: was he born…free, a baptized Christian, at least 14 years of age, able to read and understand spoken Latin, and preferably, to write it as well? If he passed, the master might have him sign a parchment matricula and take him under his wing as a member of his academic [family] and supervised inmate of his rented multi-room quarters.”
Only by the mid 1300s did a recognizably academic campus begin to emerge in some of the wealthiest locations around Europe, owing to the necessity of housing, feeding, protecting, and governing growing populations of students who might have traveled weeks away from home to enroll. Libraries were still barely relevant, as accumulating hand-copied manuscripts was often prohibitively expensive. Only after the spread of the moveable type printing press did university libraries become standard. Modern students wouldn’t really be able to recognize a medieval university as such until after the year 1500, when dining and lecture halls, libraries and dorms all began to embody academic life for the first time.
Although the look and feel of universities became more familiar by the 1500s, their fundamental purpose remained definitively premodern for additional centuries, and here’s why: medieval universities were not in the business of pursuing truth. They conducted nothing like research, as we understand that term today, for university leaders and students alike were convinced that the capital-T truth was already known. The Bible supplied all of the knowledge of how the world worked and what humanity’s place was within it, and so the major function of universities until as late as the 1700s in many cases was to deepen their students’ knowledge of revelation, and to enhance their ability to reconcile Biblical teaching with secular concerns. When ancient philosophy began flooding back into Europe via Islamic scholarship, for example, the primary task for most medieval university students was not to learn this new material for its own sake or for the sake of improving European economics, medicine, industry, or culture, but to learn how to reconcile Islamic scholarship with Christian dogma.
To become a full-blown professional scholar with the capacity to teach at a medieval university, most students would need to master the tools of that reconciliation, notably grammar, rhetoric, and Aristotelian logic, along with arithmetic, geometry, music, and astronomy. There was no such thing as history as we know it, nor was there anything remotely resembling modern science or engineering. In the Middle Ages, secretive guilds and apprenticeships independent of the university system were the source of engineering knowledge, while natural philosophy was only indulged in by a smattering of wealthy or well-connected individuals. The best that most university students and professors could do through the 1500s or so was to study Aristotle’s writing on natural philosophy, along with a handful of Arabic scholarship by luminaries of medieval Islam like Avicenna.
Still, as Axtell points out, although medieval universities had a fundamentally different purpose in society, they did not, as it turned out, have a fundamentally different value. “They became repositories of knowledge, ancient and modern, and special workshops of judgement and opinion…Admittedly, they were not incubators of startling new discoveries or practical inventions, as modern research universities are expected to be… As economic engines, they were of modest horsepower and reach… [Yet], what they contributed most to a society increasingly dependent on written documents and devoted to the rule of law was trained personnel for the administration of church and state… [and] beyond their contributions to [administration]…universities enhanced European culture by focusing powerfully and steadily on the life of the mind.” In other words, the university system that was beginning to mature throughout Europe during the 1400s and 1500s institutionalized knowledge in the West for the first time in over a thousand years, empowering Europeans to gain a foothold in the methods of reason that would become so important for the establishment of modern philosophy, science, economics, law, and medicine in the generations to come. Although there would be no institutional science in the West to speak of until the late 1600s, the scaffolding of rational inquiry that science would require was slowly being erected as medieval university students leaned more heavily on reason to navigate Europe’s growing knowledge of the world.
Queen Elizabeth I was critical in securing that function for England’s universities in particular. In 1571, after over a decade of Elizabeth’s growing interest in the institutional development of Oxford and Cambridge, the British Parliament officially established both universities as self-governing corporations, which empowered them to play an even larger role in setting the national tone of a learned society.
Humanism began to infiltrate Oxford and Cambridge right from the outset of Elizabeth’s reign in the mid 1500s. In their attempt to mature beyond the medieval scholastic obsession with hairsplitting argumentation over how to reconcile Christian theology with rediscovered classical philosophy, English university professors subordinated dialectic squabbling to innovations in logic and rhetoric designed increasingly to appeal to reason rather than religious tribalism. Even more explicitly modern-looking was the manner in which Oxford and Cambridge scholars began to systematically rationalize their curriculums according to late-breaking scientific insights. As Axtell points out, “They downgraded the study of metaphysics in general, including Aristotle’s, [and] they also dropped his physics and astronomy from the curriculum because these had been superseded by the recent discoveries of Tycho Brahe, Copernicus, Kepler, and Galileo. Aristotle continued to be read in college tutorials and explicated in university lectures, but faculty were no longer afraid to disagree with him.” Likewise, it was in the beginning of Elizabeth’s reign that English universities replaced Roman numerals with the Arabic number system we know today, in an effort to, as Axtell puts it, “improve students’ receptivity to and readiness for the emerging world of physical science.” By 1600, university students at Oxford and Cambridge were expected to be comfortable enough with Arabic numbers, including the relatively radical mathematical concept of zero, to understand Copernicus’ and Kepler’s mathematical reasoning for a heliocentric model of the universe featuring elliptical planetary orbits. All of these modernizing developments at the university level in England and elsewhere through the late 1500s and early 1600s were reciprocal with the wider Scientific Revolution unfolding across Europe at that time. 
Beyond modernizing mathematics, a new age of scientific instrumentation, which was essential to the Scientific Revolution, blossomed concurrently within English universities, helping to reinforce Europe’s wider cultural shift toward modern epistemology. Faculty and wealthy students alike began to make a habit of purchasing equipment that would aid in studies of surveying, navigation, optics, astronomy, and geography, including scales, compasses, dials, slide rules, maps, quadrants, and globes. Some very wealthy university members even purchased telescopes to replicate Galileo’s bombshell discoveries of moons orbiting Jupiter and Venus’s heliocentric phases, and to see for themselves the lunar mountains and craters that defied 2000 years of belief in celestial perfection. Of course, fueling all of this phenomenal academic development was the spread of printed books in the wake of Gutenberg’s moveable type printing press.
As universities modernized, however, retrogressive politics threatened to pull Europe back into medieval ignorance and chaos. The forces of destruction have had the upper hand against the forces of progress and production throughout human history, simply because it’s always been easier to join the mob than to resist it. Just so, when universities throughout Europe were beginning to accrue the world’s first mass-printed books, religious zealots and robbers alike pounced to burn and steal them. As Axtell documents, “Many college collections, [including] both [Oxford and Cambridge] libraries, were the target of various Protestant and Catholic commissions, iconoclasts, and thieves between 1535 and 1557…Whenever and wherever…manuscripts were found, even precious pre-Catholic Anglo-Saxon documents, they often received ill treatment…Crowds paraded plundered books…through the streets and then burned them…By 1556 Cambridge’s university library had been reduced from 500-600 volumes to just 175.” Then, as now, new knowledge, and, in particular, new efforts to pursue truth despite orthodoxy, were often unwelcome. It would take generations for universities to normalize the institution of libraries stocked with hundreds or even thousands of new books, and upgrade them from gloomy, restrictive caverns of dust into professionally staffed, comfortably furnished cathedrals of learning. By 1600, for example, Cambridge University’s main library consisted of barely over 1,000 books for the first time in its nearly 400-year history. Oxford’s collection of around 16,000 books was much more impressive, representing one of the largest libraries of early modern Europe. Yet, even these numbers are minuscule by comparison with what came later. Today, Cambridge’s main library alone houses over 8 million books, while Oxford boasts over 12 million.
The emerging institutional symbiosis between universities and their libraries was profound. As reading became the backbone of modern learning through the 1600s, the entire culture of learning underwent a massive shift. Prior to the modern age, the personal authority of individual professors and orators, priests and princes was the source of what educated people believed was true. Learning was largely public and entirely oral – attending lectures in the first medieval universities was simply an institutionalized version of how humans had acquired beliefs for literally hundreds of thousands of years. The spread of books fundamentally changed this. As Axtell puts it, “book reading became silent and private…[and] tended to invest authority or ‘truth’ in the text, the printed page, rather than in a speaker’s reputation, social rank, or accent. This shift enabled students to question their lecturers more often than had been usual and to search for their own solutions to intellectual problems… Owning one’s own book and having access to larger and improved libraries also lightened the heavy scholastic burden of memory, allowing for more creative uses of mind.”
Along with books, reliance on scientific instruments and machines for novel insight on what was true continued to blossom throughout the 1600s and 1700s in Europe, and through the second half of the 1700s in America when universities like Harvard, Yale, and Princeton began to hit their institutional stride. Harvard, for example, had been relatively bereft of books and instruments even as late as the year 1700, but by the eve of the Revolutionary War the college managed to raise substantial funds from patriotic private benefactors to purchase teaching materials featuring the latest mechanical philosophy from Isaac Newton, including Harvard’s first classroom telescopes, pendulums, barometers, quadrants, and air pumps. The most famous apparatus collected by an American college in the Revolutionary era, however, was probably Princeton’s beautiful and highly intricate mechanical model of the solar system crafted by the Philadelphia clockmaker David Rittenhouse in 1767, which was widely regarded as a ‘miracle of the age.’ By then, all learned people knew that the Earth and all of the other planets revolved around the sun, but to actually see that reality modeled in motion first-hand through ingeniously machined brass made a life-long impression on aspiring scientists. This widespread Western embrace of new books and new scientific instruments in the 1700s characterized not only the culmination of the Scientific Revolution but a deep cultural shift in Western institutions of higher learning from maintaining traditional beliefs to discovering new knowledge.
Over the course of the 1800s, academic research itself was institutionalized, primarily in Germany, where thousands of aspiring scientists and professors from America went to study between 1850 and 1900. A new consensus emerged in the German universities of the 1800s: the ultimate goal of academia was not to become educated in what was already known; that was just the first step of becoming a professional scholar. The ultimate goal was to undertake original research in a specialized subject to publish new, more advanced ideas and discoveries. As Axtell put it, “Reading widely, deeply, and thoroughly in his field was certainly expected [of a scholar]: he would face a grueling oral exam from several professors if he wanted a degree. But the student was soon ‘encouraged to produce, to try his own powers, to see facts for himself, and then to begin investigation.’ [Over the course of the 1800s], the German universities…put a premium on published scholarship as the coin of their [academic] realm…This emphasis on original scholarship worked its way into the curriculum as the Doctor of Philosophy…which was expected to ‘contribute in some measure, small though it may be, to human knowledge.’” This 19th century German model of academia inspired many of the most prestigious universities around the world to begin emphasizing original research, especially in America, and it is the scholarly ethic that defines modern academia to this day.
The concept of tenure went hand-in-hand with the new emphasis on original research, and was likewise an institutional export from Germany’s best universities. Tenure was a radical concept – it essentially made well-established professors immune from political backlash, liberating them to follow their search for truth wherever it might lead them. For professors who published new ideas that might threaten political, cultural, religious, or even economic orthodoxy, tenure protected their livelihood—they couldn’t be fired for offending people’s beliefs in their pursuit of truth. If a professor published new anthropological or paleontological evidence showing that different ethnic groups of Europeans were much more closely related evolutionarily than people assumed, for example, that might prove highly politically offensive to turn-of-the-twentieth-century German or American culture. But if the professor published such findings according to professional standards buttressed by good faith evidence, he didn’t have to worry that offending people would cost him his livelihood. If a tenured physicist published breakthrough results on a series of experiments in electromagnetism, he didn’t have to worry that threatening to overturn established theory might undermine his professional survival with the gatekeepers in his field. German professors were explicitly freed from those worries because German ministers of education understood that the genuine pursuit of new knowledge required institutional protection. As Axtell writes, “In Germany that freedom was safeguarded by…faculty status as civil servants, and established traditions of university autonomy and faculty mobility. 
It did not…extend to political activity or speech outside the university…But without the faculty’s…essential freedom, an institution…no matter how richly endowed, no matter how numerous its students, no matter how imposing its buildings, [was] not, in the eye of a German, a university.’” It took some time for the rest of the Western world to accept the wisdom of tenure—the American Association of University Professors, for example, officially adopted the concept in 1915, amid World War I-era tensions between loyalty and free speech. War put extreme pressure on university faculty to abandon objectivity, and so the American Association of University Professors bravely determined that the only way to ensure their institutions of knowledge preserved any integrity as such was to implement the system of tenure to shield their professors from a wartime mob mentality.
In Germany, the highest academic principle to be protected was called Wissenschaft, which is often translated into English simply as knowledge or science, but which Axtell points out was more specifically understood to be “‘knowledge in the most exalted sense of that term, namely, the ardent, methodical, independent search after truth in any and all of its forms, but wholly irrespective of utilitarian application.’ Wissenschaft was thus the chief goal and process, the ‘highest calling’ of the scholar.” The extent to which scholars practiced this pure form of the pursuit of truth, and especially the extent to which they published their insights, came to largely determine who deserved the protected status of tenure. By the early 1900s, “publish or perish” was a common admonition to up-and-coming professors. Although America’s best universities had lagged behind their Old-World counterparts for centuries in terms of the quality and quantity of academic research, they raced ahead between the end of World War I and the beginning of the Cold War, when a national ecosystem of research, as Axtell puts it, emerged to support America’s maturation as a global superpower. In America, perhaps more than anywhere else in the second half of the 20th century, universities came to be regarded as “the premier producers and purveyors of knowledge in a world irrevocably beholden to new knowledge for its wealth, health, and wisdom—for the essentials of its wellbeing and progress.”
As Clark Kerr, the former chancellor of the University of California system, pointed out at the end of the 20th century, of the barely eighty-five institutions in Western Civilization established over half a millennium ago that were still in existence, only the Catholic Church, the British Parliament, and some seventy universities remained. That exceptional durability is a testament to universities’ historical importance. Upon their founding in the dark, cold, and foul-smelling cities of medieval Europe, universities provided humanity with a crucial foothold of institutionalized learning and literacy on which most subsequent revolutions in knowledge would rely. Tenure has emerged as one of the most crucial facets of that foothold as a principle of protection against the vicissitudes of politics, but dozens of university leaders are allowing the mob to shatter that protection and persecute professors who don’t toe the right line. Resignations and legal settlements are piling up for the sake of indulging the moral panic of the moment. If university leaders allow this to continue, then the university system may very well render itself defunct within a single generation. For universities to thrive for another 800 years or more, university leaders must rediscover the strength to prioritize knowledge over whatever administrative or political pressure ebbs and flows around its pursuit. For the sake of our civilization, I hope they are up to that task, or else the future may come to feel more like the past than we bargained for.
I’m Brad Harris. So long.