
Anorexia and the Holocaust

Descendants of survivors are sometimes afflicted with a unique pain of their own

by Julie Goldberg
July 21, 2021
Isip Xin

If you look underneath the tweets of Eugenia Cooney, a popular fashion YouTuber whose increasingly thin appearance in her videos has garnered intense public attention, you’ll see replies rife with Holocaust comparisons. Some tell her she is “like an emaciated Holocaust victim” or indeed “skinnier than actual Holocaust camp victims”; others offer quips such as “You’re cosplaying Holocaust victims?” or “oh wow is it Holocaust memorial week already?” Under one particularly alarming photo, one user writes “Jewish girl after Holocaust colorized 1945.”

As it happens, the “Holocaust survivor” comment and its myriad variants abound across not only Eugenia’s, but also any visibly anorexic girl’s social media feeds. One wonders whether the comment is intended for the anorexic girl herself—to shock her into recovery or shame her for promoting harmful “thinspo” content—or if it is posted for a like and a retweet at the girl’s expense. Critiques of the grotesque comparison are rarely elaborated past the point of “You can’t say that!” Our difficulty in pinpointing precisely why this is grossly inappropriate raises the question of where our concern actually lies. Is it the fragile psyche of the anorexia sufferer we are worried about? Or is it the inviolate memory of the Holocaust we seek to protect? Might it be a mix of the two?

What is more surprising is how frequently the Holocaust survivor comparison is made in medical guidebooks and scholarly journals, and even in the firsthand accounts of anorexia sufferers and their families. When it is not being lobbed as an insult, the term “Holocaust survivor” and its variants are often used as a barometer of the severity of the illness. In comparing the anorexic individual’s body to that of a person who has been subjected to the most abject human suffering imaginable, the speaker makes it clear that this body is in grave distress and in need of intervention.

While these comparisons are less frequent in contemporary texts, they are far from uncommon in those of the later 20th century, particularly in the 1980s, when scholarly as well as popular interest in anorexia proliferated. A father’s cautionary tale, published in a 1984 issue of The Guardian with the headline “The trouble with a middle-class brat,” describes the titular “brat” as looking “increasingly like a concentration camp victim, an impression she heightened by having her hair almost shaved off.” A year later, Dr. Ira L. Mintz referred to anorexia patients as “self-made concentration camp victims,” implying, at once, that they are victims of their own minds as well as a kind of counterfeit of “the real thing”—appropriating the physical characteristics of one who has suffered tremendously without enduring that same degree of suffering themselves.

The stereotype of the anorexic individual, indeed, is the girl who has been brought up with all the comforts and privileges one could desire, facing no hardship from the external world. She elects to persecute herself, and thus enacts an ironic inversion of the “self-made” success.

In her seminal, if now outmoded, 1985 text, The Hungry Self, Kim Chernin cites the frequency of Holocaust comparisons in the families of sufferers. She provides testimony from a mother likening her 20-year-old daughter to “those pictures of people just liberated from Buchenwald,” as well as another woman who “can’t bear to look” at her younger sister, as she “looks like a death’s head, like one of those people the Nazis tried to starve to death.” For these family members, the comparison bespeaks estrangement from the girl they once knew. It is an attempt to communicate what they feel in their hearts—that she is simply not herself.

Claiming that we rely on liberation photography to grasp the “full horror” of what took place, Chernin points to the expediency of this “most tragic and dramatic imagery” in identifying the starving, and consequently suffering, body. Referring to “the protruding bones, the sunken faces, the enormous eyes and distended bellies that mean starvation, whether imposed or self-afflicted,” her description of the emaciated body points to starvation’s depersonalizing effects. The anorexia sufferer’s illness becomes her defining feature; she is no longer herself. This shorthand saves one the trouble, and perhaps the despair, of describing each feature separately.

What these comparisons and their critics seem to disregard, however, is the possibility that it is not always a coincidence that anorexic individuals can be said to resemble Holocaust survivors. For those with ties to the Holocaust, this resemblance merely makes legible an involuntary inheritance of a body impressed with pain. Decades of research have shown that Holocaust-related trauma is passed down to the second and third generation and has untold psychological effects. A 2015 study from Mount Sinai Hospital, for example, found that the children of Holocaust survivors, compared to Jews whose parents lived outside of Europe during World War II, carried distinct epigenetic alterations to a gene associated with stress and post-traumatic stress disorder. The answer to whether this traumatic inheritance might articulate itself through disordered eating habits can be found in clinical studies, population-based surveys, and firsthand accounts alike, with the latter sending the most unequivocal message—that Holocaust trauma has affected successive generations’ relationship with food and their bodies in complex and life-altering ways.

In Granddaughters of the Holocaust: Never Forgetting What They Didn’t Experience, Nirit Gradwohl Pisano speaks to a number of third-generation women, many of whom comment on the treatment of food in their homes, and several of whom recall struggling with varying forms of disordered eating throughout their lives. While 29-year-old Miriam only tentatively mentioned some “weird eating habits” in high school, 22-year-old Briana was more explicit about her struggles with anorexia; interestingly, she spoke most extensively about her parents’ reaction to it rather than her own experience. Her dad, she felt, “wanted to talk about it ... but not really talk about it ... but he wanted to make me eat … He didn’t want to discuss the problem underlying it; he just wanted to fix it on the surface.” Her mother was similarly in denial and, she suspects, “also felt guilt about it, which didn’t help with the guilty situation.”

This hostility toward addressing the root causes of the issue seems to be common in second-generation parents of anorexia sufferers. In her account of her 30-plus years doing psychoanalytic work at the Tavistock Clinic, Marilyn Lawrence notes that she frequently dealt with cases of anorexia in third-generation Holocaust survivors. One patient, raised in “deliberate conscious ignorance” of her family’s past, developed a severe case of anorexia as a teenager, which Lawrence describes as “a terrible spectacle to witness for a family that had barely survived starvation.” The patient also reported experiencing irrational feelings of hatred and resentment toward her living grandmother, even though she was at least superficially ignorant of what the other side of her family had endured. Her parents’ reticence, an instance of the phenomenon Dr. Yael Danieli has dubbed the “conspiracy of silence,” seems to have aggravated rather than mitigated her intuitive sense of the pain and suffering within her family. If the tragedy of the past is necessarily a burden passed on to subsequent generations, then it seems it is a burden that gets heavier, not lighter, when it is suppressed.

It is not uncommon for this expression of trauma to essentially skip a generation. Studies on traumatic inheritance, including on Holocaust trauma and disordered eating, have long made note of this pattern, finding, for example, that “[second-generation] mothers who are more Holocaust-exposed have daughters who are more eating disordered.” In other words, the third generation’s likelihood of developing disordered eating habits was determined not by their own degree of knowledge of their grandparents’ suffering, but by their mothers’. Though the second generation was more familiar with their family’s past, it was the third generation that made this painful legacy manifest through extreme measures, such as the emaciation of the body.

In a 1989 clinical review of Holocaust survivor families, Carolyn Quadrio encountered this generation-skipping effect in a grandmother-granddaughter dyad. Their case is particularly striking for the fact that the grandmother, a direct survivor of the Holocaust, developed anorexia herself after the war. Plagued with the memory of seeing the rest of her family ushered to the gas chambers, she attributed her survival to the sacrifices they made, as they each gave her a bit of their daily rations, since she was the youngest and seemingly the most vulnerable. She developed anorexia in her adult life as a kind of compensation for the extra nourishment she did not feel she deserved, and which she believed had cost her family members their lives. While this survivor’s daughter maintained an at least ostensibly healthy relationship with food, her granddaughter proved prone to the same food-refusing tendencies, as if restaging her grandmother’s pain; the effects of trauma, it seems, were not diluted but were rather ossified with the third generation.


Those who have suffered the loss of a loved one to starvation, whether that starvation be the primary (as with anorexia) or secondary (as with cancer) symptom of their illness, have been liable to reduce their own bodies to a comparably emaciated state as a means of reincorporating the lost object. As Jackson and Davidson propose, “their emaciated bodies bear their ‘self-guilt,’ a pervasive sense of discomfort for simply being or existing.”

Even if an eating disorder does not appear to stem from knowledge of or wounds incurred by the Holocaust, when a disease like anorexia does develop in the second or third generation, whatever the cause may be, the sufferer is bound to experience redoubled feelings of guilt about subjecting themselves to the bodily deprivation and torture that their ancestors were lucky to survive. As a study by Michael Friedman showed, second- and third-generation anorexia sufferers habitually reported experiencing intense feelings of guilt and anguish.

Another case of anorexia in a direct survivor, and of the second generation’s fraught relationship with their family’s past, was that of Judith Heyman and her daughter Ruth Joseph. After fleeing Nazi Germany alone at age 12, with nothing but her family’s cake tin in tow, Heyman found a home in England through the Kindertransport. Four years later, she was informed that her parents and sisters had all been shot in Riga. She had survived, but she found that perhaps mere survival was all she could hope to manage.

Nonetheless, she married and had a child, to whom we owe most of our knowledge about Heyman. In her memoir, Remembering Judith, her daughter Ruth Joseph recounts the experience of watching her mother waste away with anorexia, all while rapidly gaining weight herself, as if to neutralize her mother’s loss. Describing herself as a “surrogate stomach,” she ate extravagantly in order to please her mother, who watched her “like a kind of eating voyeur, licking her thin tired lips, her mouth dry with exertion …” If Heyman was doomed to reinscribe the trauma of her family, her daughter at least represented the hope for a future free of such fetters. Joseph internalized this responsibility; this meant fulfilling her parents’ hopes for her by literally filling herself up.

Joseph’s testimony illuminates the precarious position in which second-generation survivors find themselves. The children of survivors, as Joseph insists, “are also suffering … [we] still hurt and want to know why.” While the second and perhaps even more so the third generation is spared the blunt, abject suffering of the trauma itself, they are afflicted with a unique pain of their own. Describing the third-generation patients she treated as “feeling filled up with death, even though for their parents they consciously represent the hope of new life for the family,” Chernin identifies the paradox at the heart of the second and third generations’ struggle. Looking back on the past from the distance at which it can take on metaphorical meaning, they are left uncertain which identities they can lay claim to, while also burdened with the responsibility to inject new, untainted life into the family.

This is quite the charge: to “never forget,” but also not to become a slave to memory, to transcend the past by virtue of one’s duty to it. It’s a balancing act that one second-generation survivor describes as a “schizophrenic kind of thing.” For many, their mental health struggles are only exacerbated by a sense of guilt over their suffering, considering how comfortable their lives—whose existence they owe to their ancestors’ survival—appear. As Adam Kovac shared in Tablet, “I didn’t just feel like I’d let down my long-deceased grandma and grandpa—I felt like my self-indulgence was a betrayal of their legacy.” The second-generation subject of a 2015 case study of treatment-resistant obesity, whose excess flesh was identified as the source of his psychological suffering, reported his lifelong feelings of guilt over suffering despite having a superficially easy life, and yet still “not feeling as guilty as [he] should.”

For these second- and third-generation survivors, joy feels at once undeserved and yet the only proper recompense for their ancestors’ pain. “It also means that your life does not belong to you,” elaborated Michelle. “You don’t have the right to be satisfied.” It is not difficult to see how this renunciation of one’s own right to live can drive one to self-starvation, how this guilt over being satisfied, in an emotional sense, can translate into guilt over being satisfied in a physical one.

Under a controversial blog post, which took the stark approach of simply alternating photos of Holocaust victims and anorexia sufferers in succession, one commenter, whose grandfather was the only one of his family to escape the Holocaust, argued that this grim collage only made explicit an already present link in the minds of third-generation survivors like herself. “I am also an anorexic/bulimic,” she writes, “and I have spent a long time battling with the link between seeing pictures of how my relatives were forced to be, and what I enforce against myself.”

Where the Holocaust has had a role to play in anorexia treatment is not through photographs of starved bodies, which can be used in any number of ways, many of them toxic, but through decades of research into the effects, both physical and psychological, of starvation. The term “calorie” as a unit of heat wasn’t coined until the early 19th century, and it wasn’t until the 20th century that it rose to prominence as a means of measuring one’s dietary intake. After conducting a series of experiments in Leipzig and Berlin during the 1880s, and going on to develop a method of measuring caloric content that now bears his name, the chemist Wilbur Olin Atwater returned to the States and shared his findings in a series of articles in The Century Magazine. In these articles, he simplified the science into colloquial terms, making this new understanding of food-as-fuel and body-as-machine accessible to the public. He shared that Americans consumed significantly more than Europeans, a discrepancy that could be corrected, he explained, through attentiveness to “fuel-values,” which he defined as “the measure of the power of the food eaten to yield heat to keep the body warm and strength to do its work ... expressed in calories (heat units).” The distinction between what was “enough” and what was “excess” was no longer based on the body’s own signals, but on precise science.

This measurement of caloric needs allowed for the emergence of a science of productivity, which could determine, for example, precisely how much a body needed to be fed, and how much energy it could afford to expend, without slipping from a state of malnourishment to death-dealing emaciation. Before the calorie was incorporated into the dieting lexicon, it was utilized and often exploited by nations and institutions to maximize productivity with minimal investment in sustenance. “Caloric tables could be used to estimate rations for cities, armies, or even nations. Military rather than hygienic necessity made the calorie an international standard measure of food,” Nick Cullather summarizes.

Throughout World War I, the United States government encouraged citizens to ration foods like meat, wheat, and sugar, with future President Herbert Hoover calling on “his police force—the American woman” to institute “Wheatless Wednesdays” and experiment with alternative recipes, which he dubbed “Victory meals.” Ordinary citizens (and women, as domestic food providers, in particular) were thus called on to serve through abstention, sacrificing their own nutritional needs—or, at the very least, their own wants—for the wartime project. Restaurants in New York City even began to display calorie counts on their menus—now a common and even mandatory practice at many food chains.

Calorie counting was thus promoted as a form of patriotism; exceeding the Food Administration’s recommendations was a kind of moral failing, a display of self-indulgence at the detriment of a carefully calibrated war effort. Despite the dubiousness of a link between the calories eaten at an NYC café and the reserves available for the war, individual eating habits accrued symbolic meaning, with personal willpower now articulated as a micro-expression of the mighty moral fortitude that the U.S. would wield to triumph over its enemies. Posters making persuasive demands such as “Food Will Win the War ... Waste Nothing” reinforced this ethos of personal responsibility. Meanwhile, Dr. Lulu Hunt Peters’ Diet and Health: With Key to the Calories, one of the first truly influential texts of this emerging subgenre of self-help—that is, help with weight loss and dieting—hit shelves in 1918, selling over 2 million copies and remaining a top 10 bestseller through 1926. The first weight-loss book to advocate for calorie counting, it did so quite unyieldingly, hitting its readers with commands like: “Hereafter you are going to eat calories of food. Instead of saying one slice of bread, or a piece of pie, you will say 100 calories of bread, 350 calories of pie.”

Once World War II arrived, the reign of the calorie had taken firm root in American and European culture alike, and the Nazi attack on food supplies for annexed populations throughout Europe was based on the same meticulous arithmetic that Hoover used to buttress the U.S. war effort during WWI. Calculating rations for the Warsaw Ghetto, Nazi doctors concluded that if they restricted daily intake to 700 calories per person, it would take only nine months to starve the entire population to death. This is not to say that the dominion of diet culture can be considered in any way comparable to the genocide of 6 million Jews, but rather that as this quantitative approach to food consumption (which fueled disordered eating habits) was being popularized in the U.S., it was also being weaponized, with devastating consequences, against the Jewish people.

It was in response to the quite literally “calculated” Nazi plan to starve the Warsaw Ghetto population to death that Dr. Israel Milejkowski, the public health director of the Judenrat, led his team of Jewish doctors on their landmark study of “the hunger disease.” Examining over 100 patients, the doctors identified myriad previously unrecorded physiological symptoms of starvation. The study was also the first of its kind to anticipate the potential dangers of what we now call “refeeding syndrome,” a term commonly invoked to describe the dangers of reintroducing food too quickly in the initial stages of anorexia recovery. Their suspicions were confirmed at the end of the war, when alarming numbers of those who had been liberated from the camps died suddenly in the initial days and weeks of their freedom.

Perhaps what is most astounding about the Judenrat’s work is that the doctors themselves were living under the same conditions as their patients; they, too, were starving, and only one survived the war. Amid this suffering, however, they saw the opportunity to study the symptoms of this horrific “hunger disease,” so their discoveries might be utilized in the treatment of future victims. Milejkowski’s final note to his fellow doctors, whom he addresses as not only his colleagues but his “companions in misery,” ends with the following prophecy: “You are a part of all of us, slavery, hunger, deportation, those death figures in our ghetto were also part of your legacy. And you by your work could give the henchman the answer, Non Omnis Moriar. I shall not wholly die.”

Milejkowski was correct—their legacy has lent them a form of immortality. We can at the very least acknowledge how foundational their work, and the work of others like them, has been in developing successful treatment strategies for anorexia sufferers and starvation victims worldwide.

And while most are unacquainted with the Warsaw Ghetto hunger study, many more are familiar with the Minnesota Starvation Experiment. As the name implies, this was an experiment on the effects of starvation (or “semi-starvation,” as lead researcher Ancel Keys termed it) on the body. In 1944, as Allied forces liberated Nazi-occupied Europe and began to grasp the scope of the emaciation its populations had suffered, previously unconsidered questions arose about how to assess and treat such extreme starvation. The researchers at the Laboratory of Physiological Hygiene at the University of Minnesota sought to answer these questions by taking the (even then) controversial approach of enlisting 36 young, healthy men in a 13-month study centered on a six-month semi-starvation diet, allowing them to observe the effects, both physical and psychological, of faminelike conditions. The participants were not random volunteers, however, but conscientious objectors from the Civilian Public Service (CPS), all of whom had an active interest in putting their bodies on the line for the good of mankind. “We were full of idealism [...] Everyone else around us is pulling down the world; we want[ed] to build it up,” said one conscientious objector, Samuel Legg, when interviewed in 2004.

As another participant, Marshall Sutton, stated, “I wanted to identify with the suffering in the world at that time. I wanted to do something for society. I wanted to put myself in a little danger.” It was an act of service as well as of solidarity; the young men who volunteered their bodies clearly expressed a desire to prevent future suffering, to contribute to peace rather than further violence.

Among the study’s many significant findings was that the participants developed an obsession with food to the exclusion of all else; one man recalled amassing nearly a hundred cookbooks over the course of the experiment. It also precipitated an understanding of how anorexia can progress so quickly and how starvation can actually beget further starvation, as the stomach shrinks in response to a restricted diet and cannot accommodate what would otherwise be a “normal” intake.

Though the study’s findings were groundbreaking, they weren’t published until 1950—too late to be applied to the immediate treatment of World War II starvation victims. This came as a great disappointment to many of the conscientious objectors, who had sacrificed their physical and mental health to be of use to the Allied effort. “We had hoped to have an effect on the world hunger situation following the war … [but] the experiment was a little late,” lamented participant Earl Heckman. It might provide some belated consolation, however, that the study has been extraordinarily influential in the development of eating disorder research and treatment methods.

Formally titled The Biology of Human Starvation, the 1,385-page study has been cited over 4,600 times, with over 2,000 of those citations coming from texts focused on anorexia nervosa. The study inaugurated decades of research into the psychological effects of starvation, raising the question of whether such effects can even be addressed before the patient is sufficiently weight-restored, and critically brought to light the myriad comorbidities, such as anxiety, depression, and OCD, that often accompany the disease. It also played some part in removing the stigma from the disorder, as some of the sufferer’s less palatable attributes—irritability, fatigue, self-centeredness, for example—came to be understood not as character flaws but as psychosomatic symptoms of starvation.

Both Milejkowski’s and Keys’ starvation studies have had an immeasurable influence on eating disorder research. Yet, though strides have been made, the link between Holocaust trauma and disordered eating remains understudied. Moreover, this link is often underrecognized in the treatment process, as doctors seem quicker to describe their patients as concentration camp victims than to inquire into their family histories and religious identities.

The Holocaust survivor comparison, like any cliché, has been abraded with repeated use. Yet for those to whom the archetypal “Holocaust survivor” is not an anonymous figure but a parent or grandparent, with a name and a face, such a comparison is fraught with guilt and anguish. For the conscientious objectors who answered the call to starve “that they be better fed,” its superficiality devalues the sacrifices they made.

In continuing to regard the Holocaust as a rhetorical tool rather than a lived reality, we abandon a considered treatment of its horrors, the details of which are increasingly forgotten, misstated, and muddled. Meanwhile, misconceptions about anorexia—that it is the province of upper-class cishet white girls, a fit of juvenile obstinacy, merely an extreme diet rather than a psychological terror—still abound.

My point is that while surface-level comparisons between the Holocaust and anorexia are indeed crass and trivializing, we also do ourselves a disservice when we act as if the two have nothing to do with one another. Ultimately, the individual in the throes of anorexia does resemble a Holocaust survivor. The self-sameness of starvation’s physical presentation is unsettling because it effaces starvation’s psychological dimensions, but it is also what has allowed for the transference of medical knowledge from World War II starvation studies to anorexia nervosa research. This history is erased, and the suffering of the Holocaust is trivialized, each time the comparison is invoked simply out of convenience.

Julie Goldberg is a New York-based writer currently completing her master’s degree at the University of Cambridge. You can read her short fiction in Hobart, The Los Angeles Review, and Minola Review.