
Who Should Pay for College?

Making students and their families pay small fortunes for employment that is not guaranteed is pure sadism. Hamburger University shows us a better way.

by Michael Lind
June 07, 2021
The McDonald’s Corp. Hamburger University employee training facility in Chicago (Christopher Dilts/Bloomberg via Getty Images)

As part of its American Families Plan, the Biden administration wants Congress to spend $109 billion for two-year colleges like community colleges, along with $39 billion for two free years of college at “minority-serving institutions,” $80 billion for Pell grants and $62 billion for college completion and retention efforts. A case can be made that higher education, like primary and secondary education, should cost nothing to students and their families. Likewise, steering many students away from four-year colleges to two-year community colleges may be a good idea. But before Congress pours more money into America’s dysfunctional system of postsecondary schooling, we need a national debate to answer the question: Who should pay for higher education, and what kind of education should they pay for?

The United States currently suffers from an oversupply of college graduates. According to the Federal Reserve in 2020, 41% of recent college graduates work in occupations that do not require a college diploma. A 2021 study by Burning Glass found that—of the 4 in 10 college graduates who are underemployed in their first job—two-thirds will still be underemployed five years later. Of those, three-quarters will be working in noncollege jobs after a decade. In plain terms, roughly a quarter of all graduates remain underemployed five years out, and about one in five a full decade after graduation.

The overproduction of college graduates in the United States is hardly a cost-free endeavor, especially for the recipients of those excess credentials. Mass producing college graduates far in excess of jobs that require diplomas is not only a personal setback in many cases—delaying adult life, marriage, and family formation—but also a diversion of personal, family, and national resources into the dead-ends of crippling student debt and wasted taxpayer subsidies.

So who has an incentive to control costs by lowering the oversupply of college graduates? Not students, who are urged by parents and teachers to believe that they should be ladder-climbing winners, not losers—even in a highly saturated college graduate labor market. Not universities, which can make money from plentiful student loans and government grants even if many of their students do not complete their degrees or find jobs. Not lenders, whose business model is to make student loans guaranteed by the government—regardless of whether degrees in philosophy, women’s studies, or postmodern cultural theory make their recipients richer or poorer.

Nor is there any incentive for elected leaders to tell voters the truth about how badly their money is being wasted. That’s because, when it comes to higher education policy, one party—the Democrats—has been captured by the higher education industrial complex, and seeks to keep pumping taxpayer dollars into the pockets of university administrators and professors to train students for jobs that in fact require only a high school education or a few weeks of on-the-job training.

All but a few American students just want diplomas that will help them get jobs. Useful degrees mean vocational or professional training, whether the vocation is nursing aide or brain surgeon. Even many liberal arts majors, today as in the past, go on to graduate or professional programs in law, business, or medicine anyway.

With that in mind, we can ask the question again: Who should pay for higher education?

There are three possible answers: students, taxpayers, and employers. To figure out which of these should pay for higher education, I suggest we first answer three other questions: Who benefits the most from higher education? Who knows the most about what kinds of higher education are needed? And who has the greatest incentive to avoid excess costs?

The answer to all three questions may surprise you: employers.

Throughout most of history, people learned trades and professions through apprenticeships with employers. In the late 19th century, however, the northeastern white Anglo-Saxon Protestant (WASP) oligarchy refashioned schools that originated as Protestant seminaries, like Yale, Harvard, and Princeton, in the image of Oxford and Cambridge—institutions whose purpose was to socialize a homogeneous national ruling class. Around the same time, the bureaucratization of the professions resulted in the migration of elite professional instruction from the offices of practitioners into specialized law schools and business schools. Schools of architecture, public policy, and communications were created even more recently.

With the exception of schools of architecture, which still offer undergraduate degrees, most professional schools in the past century have made a bachelor’s degree a prerequisite for admission. The effect (and probably the intent) of increasing the time and expense of professional education in this way was to protect well-born WASPs from competition in the most prestigious and remunerative professions from their northern white Protestant country cousins and bright but lower-income Jewish and Catholic Americans, to say nothing of Black and Hispanic Americans.

Another consequence of replacing apprenticeships for professions and trades with post-college professional schools and post-high school technical schools has been to deepen the social distinction between genteel professions and vulgar working-class trades. The social prestige factor may explain why, among all Americans in 2019 who were engaged in any kind of post-high school education, four-year institutions accounted for 66% of undergraduate enrollment, while two-year institutions like community colleges accounted for only 34%.

In the absence of other reforms, pouring money into the community college system is not the answer either. Turning out even more community college graduates to compete with America’s oversupply of four-year college graduates for a dwindling number of well-paid jobs might only worsen the problems of underemployment and excess credentialism.

Instead, it would make more sense to shift responsibility and payment for most postsecondary education back to employers. They benefit the most, they know the most about the skills needed in their sectors, and they have the greatest incentive to avoid wasting resources on unnecessary credentials.

Consider the benefits of postsecondary education. Employers benefit directly from workers with desirable skills. By contrast, workers benefit only indirectly. For most of them, education is the means to the end of a job, and a job is the means to the end of income. Nurses who win the lottery can quit their hospital jobs. But hospitals cannot do without nurses.

The benefit of postsecondary education to society as a whole is also indirect. Society has an interest in dynamic firms, government agencies, and nonprofits. But taxpayers have no interest in multiplying the number of citizens with vocational skills who can’t find employment in the relevant vocations.

Employers also have the greatest knowledge of the skills needed in their industries in the present and in the foreseeable future. By contrast, students, parents, teachers, and the high school counselors who advise them can only guess what skills will be in demand in the U.S. job market decades hence. When I was in high school in the 1970s, some guidance counselors advised students that the jobs of the future included data entry processing—typing holes into punch cards that were fed into ceiling-high IBM computers—a well-paid clerical occupation that was eliminated by technological advances only a few years later. Nor is the government any better than employers themselves at forecasting skill shortages.

But how do we address the problem of cost control? Named after former Secretary of Education William Bennett, “Bennett’s Disease” refers to the alleged tendency of government subsidies to inflate prices in higher education and other subsidized sectors. Scholars debate the degree to which subsidies (rather than other factors) explain why university tuitions have outstripped inflation and economic growth for decades in the United States. Another factor may be the decline in wages for once well-paid working-class occupations, which results in more competition for the college diplomas that give access to a dwindling number of well-paid jobs.

In some cases, employers may benefit from having a “reserve army of labor” in the form of surplus college graduates competing for low-wage, noncollege jobs—but only if somebody else pays for it. No rational business would spend its own dollars on what often amounts to a four-year-long residential basic aptitude test. The answer to the third question above—who has the greatest incentive to avoid waste and excess costs?—is therefore also employers.

Most occupational training in our advanced industrial society, including for professions like law, medicine, and business, should be organized and chiefly funded by employers, with the help of the state. This conclusion might seem surprising. But as we have seen, this was the historical norm. Even education in the “learned professions” like law and medicine was based on apprenticeship until after the Civil War.

This is not an idea that most employers themselves will welcome. Employers naturally prefer to select workers from a large pool of applicants with exactly the skills they need—skills already acquired at the expense of someone else, the worker or the taxpayer. This is a classic case of businesses privatizing the profits while socializing the costs.

Now let’s consider what a higher education system funded and organized by employers might look like. If employers, who benefit the most from the skills of their workers, were required to pay the full costs of postsecondary education for their workers, there would be a closer match between needed skills and training. There would also be far less money wasted on useless diplomas or certificates.

Admittedly, there is one issue with employers paying to train their own workers, and it’s a big one: the free-rider problem. Rational employers will hesitate to train workers who, on the completion of their training, might quit and take their skills to a rival firm. The solution to the free-rider problem is for all of the firms, government agencies, or nonprofits in a particular sector to pool their resources in joint training schemes from which any firm within the sector can benefit. The federal or state governments can play a role by incentivizing or requiring all firms in a sector to take part in the training scheme and by matching their contributions with taxpayer funds.

Should students pay for at least part of their own educations? Hell no. It is and always has been immoral to make students, even from the richest families, pay for postsecondary vocational or professional education. The reason is simple: Most people in modern industrial societies are wage earners. To survive without depending on government welfare or private charity, they must sell their labor to employers. To tell most adult citizens that they must live by selling their labor, but that they must also inherit or borrow the money to purchase a credential permitting them to sell their labor in particular trades—even when the immediate beneficiaries of their skills in those trades are their employers—is sadistic.

In an ideal world, firms, agencies, or nonprofits—including legal services corporations, medical services corporations, and educational services corporations—would hire people right out of high school. Rather than hiring from outside for different positions at different levels, firms should be incentivized to pay repeatedly for the specialized training of long-term employees to help them climb up the intrafirm career ladder. With successive rounds of training paid for by hospitals, some talented nursing aides could become nurses, and a few of them could eventually become doctors. The most accomplished paralegals could work their way up to become senior attorneys within the same legal services firm.

A good model for this kind of employer-sponsored postsecondary education already exists: Hamburger University, the training facility for restaurant managers operated by McDonald’s Corp. in Chicago.

Benefits? The McDonald’s chain benefits from having skilled managers, so the company, along with its franchisees, trains managers at its own expense. Most students pay nothing.

Skills? The McDonald’s company presumably knows exactly which managerial skills the recipients of Hamburger U’s bachelor’s degree in Hamburgerology actually need.

Cost control? The student body is limited and mostly chosen from McDonald’s employees who show signs of promise as potential managers of McDonald’s franchises.

Unlike conventional universities, lenders who make student loans, or politicians who peddle higher education as a magical cure-all, McDonald’s has no financial or political incentive to train hamburger managers in excess of the number of hamburger establishments that need to be managed.

We are not doomed to live forever with the costly and wasteful educational mistakes of the 20th century. We should pressure four-year institutions to downsize faculties that turn out underemployed or unemployed graduates. Tax breaks for colleges and universities could be conditioned on renting or refurbishing parts of their campuses for use by entry-level and mid-career vocational training programs supervised by sectoral trade associations. Fewer sociology and theater arts majors, more nursing aides and robot repair technicians.

Forget Harvard and Yale, America’s knockoffs of aristocratic Oxbridge. If we want a more egalitarian and efficient society that serves the interests of workers, employers, and taxpayers alike, Hamburger University shows us the way.

Michael Lind is a Tablet columnist, a fellow at New America, and author of Hell to Pay: How the Suppression of Wages Is Destroying America.