A human skull sits on a wooden shelf in a Philadelphia museum cabinet with a one-word label: “LUNATIC.” It is one of hundreds of human skulls amassed in the early 1800s and affixed with descriptions by Samuel Morton, a professor at the Academy of Natural Sciences of Philadelphia who sourced the artifacts from grave robbers and army surgeons, the latter sending packages from the battlefield in the name of scientific advancement. In his laboratory, Dr. Morton filled the skulls with lead shot, which he then poured into measured beakers to determine their volume. He published his findings in leading journals, his skull measurements presented as evidence to support his claim that the “lunatics,” along with essentially any race that was not white, possessed smaller craniums and less intelligent brains—and thus were categorically inferior to himself and the rest of the white, leisured classes of Philadelphians.
Hailed then for his groundbreaking scholarship on the taxonomy of human merit, Morton is now an infamous exemplar of a generation of race scientists who abused the authority of medicine and research as a veneer to justify profitable slave ownership. In her new book, Atlas of AI, Kate Crawford, a professor at the University of Southern California, is less concerned with the present-day legacies of Morton’s racism than with the broader social implications of “technical systems trained on that data”—namely, how yesterday’s institutional abuse of medical expertise to subjugate minorities informs the present misuse of technology against a much wider proportion of the population. Though the degree of suffering may be less, the scale of influence is exponentially greater.
Artificial intelligence technologies and systems of machine learning are deployed by powerful elites to protect their own wealth and status. Crawford shows how Morton’s skull tags and data-driven storytelling have found a more sophisticated successor in our current social arrangement, which is dictated by “a logic that has now thoroughly pervaded the tech sector: the unswerving belief that everything is data and there for the taking.” As she writes, “The field of AI is explicitly attempting to capture the planet in a computationally legible form. This is not a metaphor so much as the industry’s direct ambition.”
Beyond the pollution, the plundering of rare earth minerals, and the myriad other finite physical resources consumed to create the raw machine capacity that powers the great, invisible cloud in the sky, Crawford is concerned about the human infrastructure that is vital to the function of artificial intelligence technologies. Again, there’s the material element, with criminally underpaid armies of digital serfs employed through Amazon’s Mechanical Turk program, physically clicking away to sort images of red tricycles and stop signs to feed the algorithms of autonomous vehicles.
And then there’s something more abstract and perhaps more pernicious—the human infrastructure of our emotions, behaviors, ideas, gestures, and bodily motion. Some of this human behavior is sifted from material given over voluntarily on social media. Other data sets capture how we talk and what we look like in various states of emotion and are then distilled into computational equivalents by an explosively expanding surveillance technology sector. Without such banks of computationally distilled human expression, some of the more novel applications of artificial intelligence would simply not be possible.
Major banks and “AI giants like Amazon, Microsoft, and IBM have all designed systems for affect and emotion detection”—helpful tools that can do things like predict criminal behavior with surveillance cameras before it happens, matching a person’s public expressive behavior to the facial expressions that are believed to be statistically common among known criminals. Such systems can be equally valuable for sorting and labeling a candidate pool of potential hires; places like Goldman Sachs have for several years now used an “AI system to extract micro-expressions [and] tone of voice” in recorded job interviews, which allows them to measure the personality expressions of aspiring bankers against those commonly found across the bank’s best revenue-generating employees.
It’s not exactly clear yet what the full social implications of applied artificial intelligence will be. In the corporate hiring environment, where new candidates are selected for how well they mirror a company’s top performers, the potential for what will soon seem like quaint racial and gender biases blossoms into an even wider discriminatory filter, squeezing the personalities and quirks of idiosyncratic expression selected by elite institutions into the smallest acceptable range of human behavior. Some evidence on the ethics of this kind of abstraction can be gleaned from military applications, where AI has long been heralded as a “force multiplier,” and where it has in recent years helped shift the debate from the general morality of widespread overseas drone strikes to the upgrading of algorithms that murder with acceptable levels of “precision and technical accuracy.”
Some of us will no doubt gladly trade occasional news of suspect corporate practices and unfortunate algorithmic victimization for the continual improvements to our quality of life promised by AI. What could anyone really do, anyway, to avoid being implicated in the influence of our artificially intelligent society? Over the past 30 years of work, with a 15-year hiatus from writing fiction while he produced scripts and humor books that are actually funny, the novelist Mark Leyner has been one of the more entertaining provocateurs asking that question, satirizing the bizarre and increasingly obscene ways we use technology to organize our society, and us.
In his most recent novel, Last Orgy of the Divine Hermit, Leyner continues with the depraved, hyperbolic style and plot construction that made him as famous as an experimental novelist could be in the ’90s—featured on magazine covers, interviewed by The Paris Review back when that still meant something, sitting on late-night talk show couches, playing the role of a cult-comic figure for those who soaked up his irreverent, confrontational narrative personas. Part of the sensory thrill of a Leyner novel is how those confrontational positions manifest as a high-velocity assault on celebrities, cultural totems, taboos, and anything that sniffs of commerce, technology, or money. It amounts to something of a dare for the reader, testing her willingness to be complicit in his pugilistic skepticism (slightly softened in his more recent work) that nearly any human endeavor tangled up with our modern marketplace of cultural goods and technology products could have any merit at all.
Last Orgy of the Divine Hermit is perhaps the most explicit of Leyner’s fictions in how clearly it foregrounds what he believes does have value. In previous novels, the direction and target of his generosity were coded, buried in layers of postmodern tricks—and not always successfully communicated. Here, it’s on the surface, almost sentimentally by Leyner’s standards, in a loving, if unusual, relationship between a father and daughter, a connection that persists despite the constant barrage of external forces, including the deteriorating physical condition of the father, that threaten their bond.
Their story is twisted up in an ethnography they’re writing from a bar in Chalazia, a fictitious Eastern European nation known for its extremely violent street gangs, including the “fanatical offshoot of the Chalazian Children’s Theater.” These awful criminals are prone to dramatic dance flourishes as they dismember their victims and fling their eyeballs against the glass of the very bar where the ethnography is being composed.
As the violence and “savage psychopathology” bubble at the door, the story of the father and daughter, as well as the people documented in the ethnography itself, presents itself to the reader in bits and pieces of dialogue interspersed with the conversations of other bar patrons who are there on father-daughter night. Some of the bar’s patrons are related, while others are “role-playing” for the fun of it—reciting passages of the Chalazian folktale of a father sharing a symbolic story about his impending death with his daughter, a narrative that is told from the exam room of a New Jersey optometrist by a patient who is reading off an eye chart.
Confused yet? This entire high-wire performance is punctuated by incessant riffs, jokes, and non sequiturs in psychedelic patterns that seem unique to Leyner’s own style of fiction. “It is very important to me, maybe excessively important to me, that what I do is unlike anything else,” Leyner has said. It’s an ambition he often accomplishes, one that has driven Leyner’s cult following and was part of what accounted, I suspect, for David Foster Wallace writing in a letter to Jonathan Franzen that he felt “searingly envious of” Leyner.
Yes, Leyner can be tedious to read, and the value proposition of his manic, hyperbolic jump-cutting prose can sometimes seem questionable. Yet those same idiosyncrasies and annoyances make his project seem all the more vital in the surrounding desert of American letters, which has become increasingly tame, homogenized, and soberly lifeless since 9/11, particularly as technologies and artificial intelligence systems continue the swift march of categorizing everything into discrete labels for the purposes of data manipulation and monetized outputs. Of the bar patrons, real and role-playing, the optometrist’s patient reads:
The numerous spoken-word karaoke screens stream an array of variant versions of the folktale, which can differ wildly from one another (some fairly radical alterations to the original have been attributed to algorithmic methods like predictive text and auto-complete). … At any given moment, any pair is likely to be reading something entirely different from any other pair (producing the Bar Pulpo’s signature din of roaring babel), occasionally all the readers will coalesce into a short-lived synchronization, having randomly fallen upon a passage common to all iterations, so that for a brief time, everyone is declaiming the same lines in unison, reading exactly the same thing in perfect choral uniformity.
As chaos of an unusually theatrical sort waxes outside, and intrusive algorithms continue to manipulate the nation’s baroque yet distinctly personal folktale, strangers and those hiding behind avatars can still fall, at Leyner’s bar, into a shared moment of accidental synchronicity, even while remaining glued to their screens.
Sean P. Cooper is a staff writer at Tablet and co-editor of The Scroll, the magazine’s afternoon newsletter. For alerts about his work, sign up for his newsletter here.