Even if states like Georgia and Iowa succeed in their efforts to curtail or ban faculty tenure at public universities, how many of higher ed’s problems would that solve? It’s true that the hegemony of the woke left on American campuses is far more dangerous to intellectual diversity and freedom of debate than was the mildly social democratic Keynesian consensus that William F. Buckley Jr. criticized 70 years ago in God and Man at Yale. But the American university’s problems go much deeper than the progressive Democratic monoculture among all but a few professors and administrators, or the job-for-life guarantees for even the most ineffective and unproductive scholars. Even if tenure vanished tomorrow, the fatal problems that afflict American higher education would remain: stalled industrialization and managerial bloat.
The term “industrialization” is ambiguous and can refer to two different things. Specialization refers to increases in output made possible by the assignment of different tasks to different workers according to the principle of the division of labor. Mechanization, including automation, refers to the replacement of workers—generalists or specialists—by machinery or software.
In The Wealth of Nations, Adam Smith provided the classic case of industrialization-as-specialization with his example of the pin factory. Smith described a factory in which the work of pin-making was divided into about 18 distinct operations, permitting 10 workers to produce 48,000 pins per day (this was before maximum-hours laws, to be sure!). In contrast, according to Smith, a generalist who performed every step of the pin-making process himself could make at most 20 pins a day, and perhaps not even one. At 4,800 pins per worker per day, the pin factory was therefore at least 240 times as productive per worker as the lone generalist, even though the technology was the same.
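The arithmetic behind Smith's comparison can be sketched in a few lines (the figures are from The Wealth of Nations; the 20-pins-a-day solo figure is Smith's generous upper bound for a lone generalist):

```python
# Smith's pin-factory arithmetic from The Wealth of Nations.
factory_workers = 10
factory_output = 48_000        # pins per day for the whole specialized factory
solo_output = 20               # Smith's upper bound for one generalist per day

per_worker_factory = factory_output / factory_workers  # pins per worker per day
productivity_ratio = per_worker_factory / solo_output  # specialization gain

print(per_worker_factory)   # 4800.0
print(productivity_ratio)   # 240.0
```

The ratio only grows if the solo worker makes fewer than 20 pins; at Smith's pessimistic bound of one pin a day, specialization multiplies output per worker 4,800-fold.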
While specialization achieves productivity growth by dividing tasks among workers in teams and in factories with existing technology, mechanization and automation increase productivity by completely replacing workers with technology. Although we associate factories with machinery, labor-intensive factories like the Venetian Arsenal existed long before the machine-based industrial revolution. And while some technologies may be suited to factory-based specialization, other innovative technologies may empower generalists and the self-employed. For example, the electric assembly line complemented and reinforced the specialized division of labor in automobile factories. But the personal computer and attendant software have empowered self-employed consultants and the like in home offices. As high-tech artisans, they can produce high-quality reports that in the past would have required the work of typists and typesetters and graphic artists.
In the 19th and 20th centuries, industrialization in both senses—factory organization based on specialization within the firm, and mechanization which allows human beings to be replaced by labor-saving technology—has transformed agriculture and manufacturing. Thanks to the high productivity of industrial agriculture, a tiny percentage of the population can feed the entire United States and other countries as well. Since the end of the Cold War, offshoring has allowed companies to replace many American workers with low-wage foreign workers in China, Mexico, and elsewhere, and the role of automation in eliminating other American manufacturing jobs is often exaggerated. Still, it seems likely that most manufacturing will eventually be largely robotized, in the next century if not in this one.
But in the last century, industrialization stalled in the case of the predominantly male “liberal professions” of the clergy, doctors, lawyers, and professors. (K-12 teachers, mostly female, always had lower status in the United States.) These professions were not called “liberal” because their practitioners were mostly on the political left (though nowadays they are). “Liberal” derives from the Latin word liber and originally meant “suitable to a member of the tiny free citizen elite,” in a society in which most people were slaves or lowly craft workers. These occupations were liberal because they combined high status and good pay and were appropriate for the children of the bourgeoisie and aristocracy. The liberal professions were artisanal, in the sense that their practitioners were generalists who were often self-employed, like blacksmiths or cobblers, but with much higher prestige and incomes.
In the modern industrial era, the members of the liberal professions have fought desperately against two threats to their high social status and autonomy. The first has been the threat that the overproduction of doctors, lawyers, and professors (we will ignore clergy) will cause practitioners to sink down into the mass of poorly paid, low-status artisans and industrial workers. The second threat to liberal professionals has been the displacement of these high-status but old-fashioned artisans by innovative organizations that can provide consumers with equal or better services much more cheaply, with some combination of factory-style specialization and technology: Medicine Inc., Law Inc., and Higher Education Inc.
To keep their fees and wages artificially high in an era of mass democracy and universal suffrage, the medical, legal, and academic professions around 1900 engaged in a wave of cartelization, forming self-regulating professional associations to forestall direct government regulation. At the same time, they limited the number of Americans who could practice these professions by requiring expensive four-year undergraduate educations for admission to newly established medical schools, law schools, and doctoral programs. To further shrink the pool of economic competitors, the professions imposed generalist licensing exams requiring knowledge of many subjects on doctors and lawyers, though not on professors. Meanwhile, professional regulations, mostly written and enforced by the guild organizations of the liberal professions themselves, made it all but impossible for private-sector entrepreneurs to modernize the provision, and lower the cost, of medical, legal, and academic services.
Both of these strategies to save the liberal professions—keeping fees high by limiting entry, and maintaining the work style of the autonomous artisan rather than the factory team worker—have been failing over the last few decades. Law schools and Ph.D. programs, motivated by the desire to keep tuition revenues flowing, have turned out far too many lawyers and Ph.D.s for the few well-paid jobs that are available, creating a proletariat of underemployed, low-wage, often part-time lawyers and academics.
In medicine and law, the old model of the generalist who is self-employed or belongs to a partnership with a handful of others is being replaced by factory-style organization. In Europe, most physicians have long been salaried employees of hospitals. But it is only in the last generation that most American physicians have become salaried employees of hospitals and other organizations, rather than independent practitioners. Meanwhile, large law firms are growing into law factories. From the perspective of the old-time country lawyer this is a tragedy, but it is progress from the point of view of the consumer who buys manufactured goods made by factories, not self-employed blacksmiths who keep their own hours.
Tenured university professors are among the last of the old, generalist, liberal professionals in the United States. They belong to a dying species. Already, according to some estimates, three-quarters of American undergraduates are taught by “non-tenure-track faculty” (the number is somewhat lower but still high when for-profit colleges are excluded). Most of these “NTTs” are poorly paid graduate students or Ph.D.s who live from contract to contract, often teaching at several colleges or universities in a single semester, while making less money than public school teachers, plumbers, and electricians. Ironically, for an institution whose members are mostly progressive Democrats, the American university resembles a Victorian sweatshop, with a small group of privileged workers and managers lording it over a swelling mass of exploited, underpaid, nonunionized proles.
Given the shift of the teaching load from well-paid tenured professors to low-wage adjuncts and lecturers, you might expect tuition bills to go down for American students. After all, most of their classroom instructors are making next to nothing! Instead, tuitions at American universities have for decades risen faster than both inflation and economic growth.
Where is all the money going? Faculty compensation has not risen significantly in the last generation, thanks to the cost savings made possible by the exploitation of low-wage adjuncts and lecturers. Some of the money from rising university costs is going to fancy buildings and recreational facilities, but most of it fuels “administrative bloat”—overpaid university presidents and other top officials, whose excessive salaries are partly justified by their need to supervise an expanding caste of university bureaucrats with make-work jobs.
The surplus bureaucrats found on today’s college campuses come in two major forms—retinues and diversity bureaucrats. The retinue phenomenon has long been familiar to students of bureaucracy: The more people below you whom you supervise, the more money you can demand to be paid as a manager. So you get a ratchet effect, in which university presidents who aspire to be paid like CEOs or Wall Street bankers get raises for supervising a proliferating number of vice presidents, and the vice presidents in turn can claim raises by multiplying their own retinues. Deans create jobs for deanlets, who then claim they are overworked and need deputy deanlets.
A more recent phenomenon is the growth of DEI (diversity, equity, and inclusion) jobs in bloated university administrations as a concession to left-wing identity politics activists. The expansion of the diversity bureaucracy on American campuses in the last decade or so has been amazing. In 2018, Mark Perry, a professor of economics at the University of Michigan, calculated that the university at which he taught had nearly 100 diversity administrators, of whom more than a quarter made more than $100,000 a year.
Like all bureaucrats, whether public, private, or nonprofit, administrators and supervisors tend to come up with unnecessary tasks to justify their salaries. At many universities, in order to be considered for salary increases in annual merit reviews, professors are now required to have authors on reading lists who are racially and sexually (not intellectually, religiously, socioeconomically, or geographically) “diverse,” as determined by the latest U.S. national census.
Needless to say, having bureaucrats label reading list authors by “race” is disturbing in itself. During the Third Reich, the Nazi legal theorist Carl Schmitt suggested that a “J” be put next to the name of all Jewish authors in scholarly citations, to warn Aryan scholars and students of Jewish bias. Will American university professors in their annual salary evaluations be punished by university diversity bureaucrats for having too many Jews on course curricula?
And what about gender and sexual orientation? How university diversity bureaucrats, assigned to evaluate professors in annual salary reviews based partly on the number of nonmale, nonheterosexual authors on their reading lists, will manage to ascertain which authors, many of them long dead, were straight, gay, lesbian, or nonbinary is a mystery. The DEI curriculum review committee’s notes on William Shakespeare, who wrote passionate love sonnets to a man, might read like a report to J. Edgar Hoover’s FBI: Shakespeare, William. Married with children, but with suspected LGBTQ+ proclivities. Assign to Approved Nonbinary Authors Quota as well as Cisgender White Male Heterosexual Authors Blacklist.
The contemporary American university is an enormous Kafkaesque bureaucracy teetering on top of a small Dickensian sweatshop. If we don’t count the sports teams and the research institutes, the university consists of preindustrial artisans, the instructors, divided between a small and shrinking group of elite tenured artisans and a huge and growing number of impoverished apprentices with no hope of decent jobs—with all of the artisans, affluent and poor, crushed beneath the weight of thickening layers of middle managers.
Apart from useful research, most of which could be done just as well in independent institutes, the product of all but the most prestigious American universities consists of diplomas which are rendered progressively more worthless each year thanks to credential inflation. According to the Federal Reserve Bank of New York, the underemployment rate for recent college grads—that is, the percentage working in jobs that do not require a college diploma—was 40% at the end of March 2021. True, workers with college diplomas tend to make more than those without them—but at least some of the premium comes from Starbucks baristas with B.A.s pushing high school graduates into even worse jobs.
In a productive economic sector, labor-saving technology and/or the factory-style division of labor result in what might be called the virtuous circle of industrialism: Prices for consumers fall, wages for workers rise, and the ratio of managers to productive workers stays the same or shrinks. In the American university, however, technological stagnation, artisanal production, and administrative bloat result in rising prices for consumers, falling wages for the majority of productive workers (nontenured instructors) and more and more bureaucrats per worker over time.
I have had the privilege of teaching at different times as a graduate student instructor, nontenured faculty member, or professor of practice at Yale, Harvard, Johns Hopkins, and the LBJ School at the University of Texas at Austin, which I will depart by choice in the summer of 2022 in order to write full-time. My conclusion, as an outsider with one foot on campus, is that the practices of the producers of higher education and the needs of its consumers are no longer compatible. It is in high school, not college, that American students need to learn skills that allow them to obtain good jobs, innovate, and build wealth, with the help of employer-provided training when appropriate. For most young Americans it is a waste of time and money to spend four years being socialized into the weird woke lingo and thought patterns of a small national oligarchy that the vast majority of Americans can never realistically hope to join.
The abolition of tenure by public and private universities alike would be a step in the right direction. But by itself it would not replace the pre-modern academic sweatshop with a modern enterprise; indeed, it might result in an increase in the number of underpaid adjuncts and lecturers teaching swindled students at overpriced schools. Still missing are the entrepreneurs who can revolutionize post-high school teaching in America, the way that Henry Ford revolutionized automobile manufacturing with the assembly line—and also the politicians determined to reform laws and regulations to permit radical entrepreneurship in this decrepit and imploding sector.
I don’t know how to radically reform U.S. higher education. That is a challenge for the future Jeff Bezos or Elon Musk of higher ed. But I am pretty sure that whatever succeeds the failed multiversity will include much greater specialization and interdependence. There might be team teaching tailored to individual students, along the lines of the “team medicine” used by hospitals and clinics to assemble ad hoc teams of medical workers for individual patients. There also needs to be deployment of technology that increases productivity in more imaginative ways than simply filming 19th-century-style lectures and putting them on the web, letting professors boost their teaching scores by entertaining bored students with animated PowerPoints, or writing software that grades multiple-choice tests.
The next American university—but stop right there. Maybe there shouldn’t be a next American university. Maybe instead there should be an industrialized, high-tech higher education sector that looks nothing at all like the American university in its present, dying form. In the 21st century, there is no reason other than inertia for vocational training departments, scientific and engineering research institutes, blatantly ideological and partisan political disciplines like gender studies and ethnic studies, and buckraking sports teams to be part of the same gargantuan nonprofit organization, with hedge-fund-like endowments and expensive urban real estate exempt from taxation.
Why not just break up Big U? Why not apply the logic of antitrust to the bloated, wasteful nonprofit academic sector? Let corporations do their own vocational training, like McDonald’s with Hamburger University, or contract with free-standing trade and professional schools. Let research be done by independent science and engineering institutes, with apprenticeships for young scientists and engineers. Let gender and ethnic studies departments be left-wing nonprofits; there will be plenty of liberal and leftist donor money for them. Let the former college sports teams go professional and let their professional players unionize.
The American university has existed in its modern form as a dysfunctional conglomerate that devours ever more of society’s resources with ever worse results only since World War II. Now in its 70s, it is not long for this world. Rather than keep it alive on a morphine drip of student and taxpayer money, we should let it pass away and be replaced by institutions which, in truly innovative ways, use technology and teamwork to lower prices, raise wages, and keep the number of managers to a minimum.
Michael Lind is a Tablet columnist, a fellow at New America, and author of Hell to Pay: How the Suppression of Wages Is Destroying America.