In the late 1990s, Michael Lind wrote about different “republican” epochs in American history: the Anglo-American republic (from the Founding to the 1920s), the Euro-American republic (1920-1965), and the multicultural American republic (1965 to the present). Lind hoped the fourth American republic would be a transracial one based on the idea of what he called “liberal nationhood”—classically liberal in respect to human rights, individual property, free speech and markets, but also liberal in the more statist, New Deal-like economic sense.
If the United States is to survive as a unified state under its present configuration, something like Lind’s idea of liberal nationhood will have to coalesce. But there’s one big problem with constructing a new concept of American-ness centered around liberal ideals, no matter how one defines them: Americans hate each other.
As we approach the 50th anniversary of George McGovern’s landslide defeat at the hands of Richard Nixon, America’s “cold civil war” shows no signs of letting up. If anything, it’s about to reach a boiling point. Between Donald Trump’s election, the near-nationwide riots last summer, the media’s grotesque gaslighting regarding the violent activities of antifa and Black Lives Matter, the events of Jan. 6, and a continuing belief across large sections of the American right that the presidential election was in some way stolen, things are not going well in the United States.
A short list of American civil sectors operating in a state of total inadequacy and on the verge of revolution-inspiring dysfunction would include health care, higher education, housing, job training, water (and agriculture), infrastructure, and public transit. In other words, nearly every service and institution on which the majority of citizens depends.
Without question, replacing Trump with President Joe Biden represents a restoration of America’s ancien régime, but for how long? Whether due to incompetence, a lack of strong leadership, or genuine political hurdles, the Biden administration will almost certainly enact only a small fraction of its purported agenda prior to next year’s midterm elections. Then there is the question of who exactly this agenda would actually benefit.
Less than 200 days into the Biden experience, Democrats have shown no sign of actually mobilizing legislation to restore the right to collectively bargain—the so-called PRO Act—just as Obama sat on his hands regarding its predecessor labor legislation, the Employee Free Choice Act. Democrats have likewise done little to pass a substantial federal minimum wage hike, throwing up their hands and pretending to be powerless in the face of the Senate minority leader. (This should come as no surprise, considering the Biden administration, like the Democratic Party itself, is the home of America’s corporate and bureaucratic elite.) The administration’s negligence and failures will eventually enrage the Sanderistas on the left as well as the burgeoning pro-worker, statist political right. As a result, what’s left of Democrats’ working-class base will support just about anyone in 2024 other than Vice President Kamala Harris. Another Trump presidency isn’t just possible, it’s the way things are trending, and if Trump does take over the presidency again, the lid will come off this country. The summer of 2020 showed that Weimar Republic–level street violence between warring extremists is very much on the table in today’s United States.
Perhaps a version of Lind’s liberal nationhood was possible back when Barack Obama took office in 2009. But Trump’s 2016 election should have illustrated the extent of Obama’s failure to reconstruct New Deal-level social safety nets, move the country past race, and mount a sustained attack on corporate power. Instead of FDR 2.0—which most of America was ready for by 2008—the country witnessed the birth of woke corporatism, with Obama as condescending frontman. The result was the entrenchment of racial division and racialist doctrines along with the exponential growth of corporate power and oligarchical wealth.
So if the dream of rebuilding America around notions of liberal nationhood is likely a mirage; if postmodern multicultural nationalism has clearly failed; and if rebuilding a more hard-edged version of U.S. nationhood around Reagan and Trump-style nostalgia for the Anglo-American past remains repugnant—what, then?
The answer is separation: Red and blue America finally accept that for the past 60 years they’ve grown irreconcilably apart, and that they are in fact separate nations—each worthy and deserving of independence from the other. Dissolve the United States, I say, and start an American Union that works more like its European counterpart. The breakup, in fact, has already begun.
In June, Florida Gov. Ron DeSantis sued the federal government and the Centers for Disease Control over COVID-related restrictions preventing the cruise ship industry from reopening. DeSantis also recently sent Florida state troopers to the U.S.-Mexico border in Texas and Arizona at the request of the governors there. Not long after, South Dakota Gov. Kristi Noem followed suit. In the same month, five rural Oregon counties voted in favor of joining Idaho. The plan for a “Greater Idaho” also includes six counties in Northern California that fancy themselves as a breakaway “State of Jefferson.”
A few weeks later, 11 heavily armed men wearing tactical gear were arrested in a standoff with police on Interstate 95 in Massachusetts. The group holds to the “Moorish sovereign ideology,” believing that the United States has no right to force them to adhere to any laws. One of the members could be heard challenging the fictions of America’s racial pentagon, declaring, “I’m not ‘Black,’ I’m not ‘Hispanic,’ I’m not ‘Latin-American.’ ... I am a Moors national.” Another member of the group proclaimed America’s 13th and 14th Amendments “fictitious entities.”
In 1962, the historian Daniel Boorstin warned that the United States had entered a dangerous new era defined not by reality, but by what he presciently called “the image.” The tremendous growth of television and film in the 1950s had distorted the purpose of media, leading to the creation of what he called “pseudo-events” and false flags intent on deceiving the public. Rather than coverage of critical information and narratives of cultural examination—which a liberal democracy requires to survive—the American public was being manipulated by made-up news and frivolous mythmaking. Worse, the public was complicit in its own deceit. Americans wanted their national storytellers to manipulate them. They yearned not for reality, but for the image.
Boorstin didn’t live to see how hand-held digital delivery systems would transform America’s youth into amateur exhibitionists and the rest of us into expert voyeurs, but the very concept of social media matches the essence of Boorstin’s image thesis. We still call it “social” even though everyone knows these platforms are the most socially destructive technologies released into the world since the CIA helped distribute crack cocaine to American cities in its effort to fund Contra fighters in Nicaragua in the 1980s. In a remarkably short period of time, Silicon Valley tycoons have transformed the entire country into device addicts. Today, more than 85% of the U.S. population owns a dopamine-addiction stick (also known as a smartphone) that they spend between three and six hours a day looking at.
The marketing of the smartphone represents the greatest instigation of mass addiction in world history. Predictably, it has brought with it socially deleterious consequences that rival those of the Opium Wars. Silicon Valley’s massive data-mining efforts don’t just create “filter bubble” echo chambers that sire political division based on petty grievances and clickbait sold as news; they also provide the means to manipulate the nation’s “consumers” (a group we used to refer to as “people”) into impulse-buying items they often don’t need and routinely cannot afford. Often the items are manufactured in China, destroying the American industrial and retail sectors that employ—or should employ—millions of Americans. Hence the only sizable employers left in many American towns are prisons, Amazon distribution centers, and fast-food outlets, with the unemployed and underemployed being shuttled around the corners of this new American labor triangle.
On a national level, the centi-billion-dollar social media industry is one of the biggest contributors to our national demise, routinely allowing sociopathic halfwits to organize online mobs to enforce groupthink and destroy the lives of others at a speed and frequency unfathomable 20 years ago. Social media’s detrimental effect on our political culture should be obvious, yet Americans exhibit nowhere near enough skepticism regarding the encroaching role of corporate technology in their lives—regardless of whether they are being constantly surveilled by multinational companies or turned into human data farms and spied on by government agencies.
The tough reality is that America, as a country, isn’t very smart. Individually, Americans are fun, creative, and hardworking, but together ... well, we’re pretty moronic, generally intolerant, and habitually lack curiosity. Even our “educated” classes often come across like children dressed in oversize costumes, delivering stilted dialogue in a middle school production of Richard III. Not convinced? Smoke some of America’s now-increasingly legal cannabis and watch nearly any viral anti-vaxxer video or public footage of a schoolboard meeting on critical race theory. Try to keep a straight face. I dare you. No one, left or right, has any idea what they’re talking about. It’s all a bunch of smug adolescents masquerading as adults, battling it out in front of the cameras, excited to be part of the national image of the day.
In the mid-1990s, around the same time Lind wrote The Next American Nation, the Harvard political scientist Robert Putnam published “Bowling Alone: America’s Declining Social Capital,” one of the most widely read academic essays of the era, later expanded into a bestseller respected and cited by academics, the rarest of feats in American publishing.
In his work, Putnam documented the many troubling ways modern life was eliminating civic engagement and healthy social interactions that used to come about through participation in groups like bowling leagues and organizations like the Masons or the Rotary Club. Americans were bowling more often but—how poignant—they were doing it alone. In the 1990s, Americans were still joining groups, but not ones that provided any close social contact. Rather than joining the Elks Club, Americans in the 1990s were participating in organizations with more activist missions—like the Sierra Club or the NRA—that required little besides paperwork and a mail-in donation. Rather than cavorting with their neighbors in meeting halls where they broke bread together, danced, and spoke to people with differing political views, Americans were spending far more time isolated and alone.
The release of Apple’s iPhone was nearly a decade away when Putnam began writing about the decline of America’s “social capital.” Still, by the end of the Clinton era, it wasn’t hard to see the future: a society plagued by dopamine-device addiction, obsessed with the self, and yearning for social attention, but with no clue how to acquire it in a healthy or sustainable way. Technology and suburban living had transformed leisure into an insulated experience taking place inside American households: children frozen in front of the Nintendo, teenagers sitting in their rooms with headphones on, and parents glued to the television set.
In his new book, The Upswing, Putnam charts the course of America’s sense of community and civic-minded togetherness over time, and discovers that “bowling alone” is nothing new. According to his findings, American society exhibited weak social, political, and cultural ties throughout most of the 19th century until about the mid-1890s. At the end of the Gilded Age, American culture gradually started to turn away from its selfish “I” focus. By the 1920s, the positive trends became distinct. The country moved away from a selfish, individualistic orientation toward a more egalitarian-minded one—cooperative in everyday behaviors, cohesive in expression, and characterized by a more altruistic mindset.
Then, during the early 1960s, nearly all the positive markers of American togetherness started to shift back again in the opposite direction. Americans were no longer recreating on front porches where they could converse with passersby and interact with neighbors. Instead they began living in detached suburbs with large, fenced-in back yards separating them from their neighbors, with whom they stopped speaking. When the job market demanded it, Americans packed up, moved across the country, and repeated the cycle. Trust in the government fell off a cliff. People no longer trusted the companies they worked for, or even their coworkers. Many began feeling hostile toward those with differing political ideals and social mores, who they imagined were responsible for the perceived lamentable state in which the country found itself.
By the 1970s, all the data points Putnam and his research team tracked—social, political, economic, and cultural—had turned downward in a pattern every bit as dramatic as the “upswing” the country had witnessed over the previous 80 years. Putnam uses an “I–we–I curve” to refer to the rise and fall of American togetherness between the 1890s and the present.
Like most in the field of political science, Putnam is not a prognosticator. But he appears to believe that because the progressive and New Deal eras coincided with the birth of America’s brief “we” orientation, a new progressive era would be just the trick to jump-start a new upswing in American togetherness. Robert Reich, secretary of labor in the Clinton administration, maintains a similar, more economically focused position; it’s a standard line of thought for boomers and early Gen Xers of the establishment left. Barack Obama and the entire oeuvre of his presidency were the embodiment of this mindset (which, given what happened in 2016, should tell you a lot about how it went).
As a proud prognosticator, I’d like to offer up a different thesis from the “I–we–I” curve Putnam found: that the precipitous rise and fall of American cohesion was a brief accident of history in an otherwise selfish, narcissistic, liberal-individualist “I”-oriented nation, to borrow Putnam’s term. The decades of American cohesion experienced mainly between 1920 and 1960 were an anomaly; the success of Franklin Roosevelt, the New Deal, and the “liberal consensus” that briefly followed were merely the result of Roosevelt’s unique political genius and the tailwinds of winning two world wars while most of Eurasia was reduced to rubble.
The Cold War brought about even more anomalous, unifying circumstances for Americans, both social and economic. The “fight against communism” convinced American corporations to keep paying New Deal-level taxes and provide better employee benefits, but only for a short while. The Cold War also persuaded more rural, independence-minded Americans to tolerate “big government” in order to “beat the reds.” But when the Cold War ended, those groups went right back to their more traditional anti-statism under the Reagan coalition. Clinton and Obama operated on the same neoclassical liberal wavelength as the GOP, but with less talk about “family values,” which greatly pleased elites.
Altogether, the historical contingencies of the 20th century have concealed the dramatic failures of the progressive era and both political parties to address the rot at America’s core: a combination of liberal individualism, atomistic narcissism, and Reaganite social Darwinism that primarily benefits corporate titans at the expense of the aspirational delusions of the poor.
What’s more, the specific dates of Putnam’s historical curve of American cohesion align almost perfectly with the country’s legislation on immigration restriction—a fact that neither the political scientist nor his readers want to think about too thoroughly. The first American restrictions on immigration started in the 1880s and 1890s, and piecemeal changes were made throughout the aughts and 1910s. By the early 1920s, harsh and highly limiting restrictions were put in place even as Anglo America culturally destroyed German America. The eradication of German culture in the United States established firm ethnopolitical expectations for all Americans going forward: Conform to Anglo individualist political norms, speak English, and think like a Protestant—or get the hell out.
These nationally dominant WASP norms defined and held together the so-called “Greatest Generation,” which created unparalleled wealth and opportunity for their boomer children. In the 1960s, prosperous young boomers began dismantling this Anglo-Protestant system of norms, but offered little of substance to take its place. In the rapture of what Tom Wolfe called America’s “Third Great Awakening,” self-righteousness disguised itself as “social activism” and new-age enlightenment. Spoiled elites—highly unappreciative of the New Deal’s populist approach and the decades of labor battles that created the pressure for change—turned a blind eye while corporate America worked diligently to return the American economy to Gilded Age levels of inequality and worker disdain.
The idea that America’s inherent hyperindividualism was concealed during the “we” period of 1890-1960 by historical contingency is supported by Putnam’s own major study on ethnic diversity and its effect on community togetherness from 2007. Given Putnam’s progressive political commitments, he seems to have initially conducted the research hoping to prove that more immigration and ethnic diversity strengthens American communities. But what he found was nearly the opposite.
According to “E Pluribus Unum: Diversity and Community in the Twenty-first Century,” greater diversity of ethnicity and national origin coincided with weaker cultural ties and greater social conflict in nearly every location and circumstance that Putnam and his team examined. Putnam struggled for years with how to present the research. He eventually published the findings in a foreign journal.
“What we discovered in this research, somewhat to our surprise, was that in the short run the more ethnically diverse the neighborhood you live in, the more you ... everyone ... all of us tend to hunker down and pull in,” Putnam later summarized in an interview.
Yet despite the data he and his research team found, Putnam insisted that diversity and more immigration benefit American society as long as we overcome certain behaviors and attitudes—a claim based on the progressive assumption that people’s natural reactions can be conquered in the long run through deliberate effort and ideology promoted, presumably, through education.
The fact that diversity apparently reduces social outgoingness for most people—and greatly increases social anomie—illustrates something wrong with the Anglo-liberal individualist project, namely that it is largely at odds with the nature of the human (and animal) enterprise. The idea that there are aspects of humanity and nature itself that resist the “fixes” of new social constructs is something progressives, with their devotion to often one-dimensional understandings of science and Enlightenment rationalism, have a lot of difficulty accepting.
No matter how it’s framed, though, Putnam’s findings on diversity and social cohesiveness contrast with the woke ideals that now dominate the country’s elite class. The woke crowd would have us believe that constantly focusing on racialist differences—whether real or imagined—will light the path toward a just American society. But if we examine the lessons of Putnam’s research without the filter of contemporary progressivism, what we find is a lucky country whose otherwise destructive cultural tensions and character flaws have been cantilevered by blessings of history outside its control.
Last month, I spoke by phone with a political operative who, under the pseudonym Chris Rhodes, runs the blog Red State Secession, where he advocates, among other things, for the secession of Southern Illinois. “Some of my clients are liberals and would be pretty upset,” he explained, never giving me his real name. In our conversation, Rhodes described how seriously some conservatives plan to push back against a political imbalance they feel gives them “little or no voice at all.”
When it comes to universities, the legacy media, Hollywood, and Big Tech, it’s hard not to see where Rhodes is coming from. But in terms of elected representation, the notion that red state conservatives do not receive adequate voice is one from a parallel universe. The states of Wyoming, Montana, North Dakota, South Dakota, Idaho, Utah, Nebraska, and New Mexico have 16 senators between them, despite combining for a total population smaller than that of Los Angeles alone. Meanwhile, the state of California—with a population nearly 3 million larger than all the provinces in Canada combined—is provided with just two senators. New York is similarly underrepresented at the federal level.
Many countries in the world have bicameral legislatures like the United States, but the upper houses of the United Kingdom and Canada—the House of Lords and the Canadian Senate—can usually only debate, amend, or delay legislation; in practice they have almost no ability to block it, a daily concern in U.S. governance. Of the wealthy countries with upper chambers holding binding votes, most operate in a far more representational manner, such as the Italian Senate, whose seats are apportioned roughly by population. Nothing in modern liberal governance is more outdated—and more inherently anti-democratic—than the structure of the U.S. Senate, whose members until 1913 were chosen by state legislatures, a good indication of the Founders’ less-than-democratic intentions.
It’s a testament to the power of cultural myth that the United States has never witnessed a sustained movement to overhaul our intentionally oligarchical system of government. The fact that there’s never been any major effort to abolish the Senate or to rearrange the Supreme Court—both of which represent massive checks on democracy—largely results from the culture of Constitution-worship most Americans blindly accept without a second thought. In the history of the world, few elites have pulled the wool over the eyes of their citizens longer than the American Framers have in their creation of “a machine that would go of itself,” as scholar Michael Kammen refers to the U.S. Constitution.
Perhaps American elites refrain from challenging the Constitution, generation after generation, because in the end, they admire the Framers’ design of the “most exclusive club in the world.” Coastal leftists like Nikole Hannah-Jones can dye their hair the color of a tomato frog and offer up the pose of a punky radical, but in their backgrounds and approach, the “anti-racist” crowd is cut from the exact same cloth as the authors of the Federalist Papers: educated elites contemptuous of democracy and all the rebellious messiness it inevitably brings. The now out-in-the-open alignment between the social justice set and business elites helps explain why red-state populists loathe Pulitzer Prize winners who try to control—from the comfort of democratically unaccountable private institutions—what children learn in public school.
In my conversation with Rhodes, he laid out two different scenarios for how rural secession in America could work. In states dominated by blue urban centers, more rural counties could break off, join with other rural counties on their borders and form new, larger rural states, such as a Greater Idaho. That’s scenario No. 1. In scenario No. 2, Rhodes described the possibility of rural counties declaring independence together—maybe as one country, maybe as separate countries, or maybe as separate states. This makes little sense, but such are the considerations of a desperate and frustrated movement.
“There’s an enormous pool of conservative resentment that hasn’t been expressed yet. The pandemic put many people out of work. If there’s another economic collapse [like 2008] many of these people will have nothing to lose,” Rhodes told me. That prediction seems a bit more realistic.
“There’s widespread belief on the right that the election was stolen,” Rhodes continued, repeatedly reminding me it’s a position he agrees with. “There’s a growing sense that there’s no point in participating in federal elections. Our goals for governance are not compatible with the blue people. We accept that now.” Rhodes then gave me the cellphone number of the leader of the Greater Idaho movement and emailed me a long list of secession-related materials. “Given what happened last summer, I’m personally amazed at how restrained the American right has been thus far,” he said.
Disturbed, I reached out to Michael Lind, hoping he’d provide some answers, even if they weren’t palliative. Over the phone, I told him about my prediction that a breakup of the country was inevitable in the medium to long term. He politely disagreed.
“Basically, all the cities even in the red states are liberal, and if you drive out a few miles, the suburbs surrounding them are red,” he told me. “Unless you have an archipelago of cities connected by underground tunnels, or a Berlin airlift, secession doesn’t work. In terms of class and culture there is a secession going on, though. That’s where we’re headed with the melting pot: You’re going to have a working-class melting pot that’s frozen out of power and lives out in the boondocks. Then you’re going to have an urban college-educated elite melting pot [with all the power and wealth].”
“I have a lot of skepticism for the future of [America’s] fourth republic at this point,” Lind admitted. “The first one was dominated by the Southern planters until the 1860s. Then it was run by railroad lawyers and corporate attorneys until 1932. Really, the only time working-class interests were represented in American politics significantly was in the New Deal era. We’ve been moving back in the oligarchic direction ever since. ... I’m very optimistic about race, but not about economy and politics.”
“If secession won’t work, what happens if we don’t solve our economic and political divide?” I asked him.
“We become Peron’s Argentina,” he answered, without missing a beat. “It’s where I think we’re going to end up anyway. We’re showing signs of it ... your capitalists are mostly a rentier class, living off unearned income from banking, agriculture, finance, or stock sales ... elites tied to the military and a nationalist coalition of interests. The working class is immobile. It cannot move at all.”
The American future Lind and I discussed was increasingly Latin American in character—a state of affairs where the threat of instability and violence is perpetual. Left and right are rarely meaningful distinctions. Political parties and movements merely represent the facades of elite factions tied to various competing parts of the corporate and security states. Dysfunction and power entrenchment rule the day, and there’s little hope of individual mobility across class lines. Maybe underground tunnels aren’t such a bad idea after all.
B. Duncan Moench (@DuncanMoench) is Tablet’s social critic at large and a scholar of political thought and American character studies.