Protesters call for businesses to sever their relationships with Donald Trump as they demonstrate outside the site of a new hotel owned by Trump. (Reuters / Yuri Gripas)
In Lysistrata, the women of Athens give up sex to end a war. There is a version of this story that would see this as little sacrifice on women’s part. The enjoyment of sex, we’re so often told, was a male province until very recently. Aristophanes knew otherwise. His Athenian women were just as lustful as the men. When Lysistrata tells everyone her plan, after all, she’s hardly finished before she has to yell, “Why are you turning away from me? Where are you going? Why are you all pursing your lips and shaking your heads?”
I thought of that scene when fault lines quickly appeared among the women looking to organize a large-scale demonstration against Donald Trump on January 21. At first, the dispute was over taxonomy: The first name for it was the Million Women March, but there had already been a Million Woman March of women of color in Philadelphia in 1997. That problem was quickly dispensed with: The thing was rechristened the Women’s March on Washington. But then there were squabbles over the homogeneity of the leadership of the march. That, too, was quickly taken care of: There are now three women of color who are co-chairing the event with other organizers. In the meantime, though, some small amount of damage was done.
That women don’t instantly align with one another should not be surprising. If the election taught us nothing else, it was that “women” don’t have a prima facie claim on one another’s loyalties. Whatever the strengths and failures and, well, active sabotage waged against her campaign, Hillary Clinton did not get the majority of the votes of white women. She commanded over 90 percent of black women’s votes, but that statistic served only to exacerbate the sense that women were not in anything together. In fact, women should already know that their “solidarity” is a fragile thing: History and experience have given us reason enough to know it.
The earliest women’s protests happened spontaneously, before there was even a reason to call political action “organizing.” We do not know exactly how what the English historian Thomas Carlyle called the “Insurrection of Women” got started in Paris on October 5, 1789. We know that a man named Stanislas-Marie Maillard marched in front of the women and called himself the march’s leader. We also know that Louis-Philippe Joseph, the scheming Duke of Orléans, was suspected of being behind several popular protests of 1789. But many historians believe the procession was spontaneous: seven thousand Parisian women marching on the elaborate palace at Versailles because they had no bread to feed their children.
“Men know not what the pantry is, when it grows empty; only house-mothers know,” Carlyle wrote of the Versailles marchers. He was accidentally making a key distinction, there: Even back in pre-revolutionary France, there was a line to be drawn between “all women” and “house-mothers.” In bringing their complaints to Versailles the women were, in some sense, going up against another woman: Marie Antoinette, her excesses a scapegoat for every bad thing the ancien régime stood for. No one would have mistaken the queen’s interests for those of her subjects.
What solidarity there historically was for protesting women, then, was based in homemaking and child-rearing. The phenomenon of the food riot comes up again and again in world history. The foodstuffs in short supply change through the ages: In Virginia, during the Civil War, it was meat the women were after, eventually stealing 500 pounds of bacon. In Russia, during World War I, it was a shortage of sugar that set off a riot. Through time, and across the world, though, it’s often the women who lead the charge: In Nigeria, in the 1920s, it was the potential taxation of the market women that sparked a heroic—and successful—revolt against the colonial authorities known as the Women’s War.
In the America of 2017, the lines of femininity are not so clear cut. From the perspective of the left, the main points of alliance are on questions like pay equity and reproductive rights. Important as those may be, they lack the primal appeal of food in the bellies of starving children. They also dissolve into thin air across partisan lines. Resistance to Trump, as a goal, has the potential to finally get American women out of the holding pattern of their divided opinions.
There is hope for that: Women have more often won out by arguing with each other than they like to remember. Among themselves, for example, the suffragists agreed on very little. Carrie Chapman Catt was an institutionalist, wanting to work within established lines. Alice Paul was a shit disturber, wanting to hold demonstrations like the ones she’d observed in England. Frances E. Willard, the head of the Women’s Christian Temperance Union, clashed repeatedly with Ida B. Wells, who pointed a finger at the racist appeals Willard frequently made in the name of arguing for women’s suffrage.
That fractious history can be traced right down through the 1970s. We remember the second-wave feminists as a monolith now, but they did not feel that way about themselves. When Jo Freeman wrote a well-regarded essay on “trashing” in the movement, she also admitted that disagreement, conflict, and opposition “are perfectly ordinary phenomena which, when engaged in mutually, honestly, and not excessively, are necessary to keep an organism or organization healthy and active.” When Audre Lorde wrote a letter to Mary Daly arguing that her tract Gyn/Ecology did not include any women of color, Lorde implored Daly, “This dismissal stands as a real block to communication between us. This block makes it far easier to turn away from you completely than to attempt to understand the thinking behind your choices.”
In other words, the arguments about representation, tactics, and rhetoric that we see in our own time have a long history. If we knew the history of these movements a little better, we might remember that. The occasional Hollywood film or popular book has not quite managed to surface the way that women’s protests have always been fragile and fractious alliances. But perhaps the march on January 21 will serve to remind us that there is something to marching together that exceeds all of our many, reasonable disagreements.
Food riots have broken out in Haiti, Egypt, Mozambique, and Bangladesh. In New York, eight million people look into the refrigerator wondering what to eat.
Rice prices soar on the international market as shortages trigger unrest among people for whom rice is a dietary staple. The packaging of the rice on my pantry shelf tells a story about pristine fields and rare cultivation. “To keep our rice select, we inspect each grain. . . . This may seem like a lot of extra work to you, but we care.”
“Care” is one of the fluctuating words of our time. As CARE, it is an international rescue organization; as demotic speech, a matter of whim and interest; in official talk (as of health care), it essentially means nursing or medicine. In one sense, I could care less if I have rice. In another, I care (and care for myself) a great deal: I put all kinds of worry and concentration into whether I will have white rice, brown rice, basmati, arborio . . . Does brown rice leave more of the fibrous husk (for health)? Is polished rice more suitable to the Cambodian cuisine I’ll cook tonight (for experience)? And should I have salmon, which, with its omega-3s, is said to be good for my brain, heart, and mood? Or tofu, with its cholesterol-lowering soy protein, its isoflavones and selenium? Thus from taste or “choice” we stray outward to a vantage from which the wrong choice at dinner looks like death, where care becomes ambassador for compulsion.
Two generations ago, progress in the realm of food split along dual tracks. For those who don’t yet have enough, the goal remains to gain plenty by any technical means available. For those whose food is assured, the task becomes to re-restrict it. This second movement has been underacknowledged.
Having had our food supply made simple, we devote ourselves to looking for ways to make it difficult. The more we are estranged from the tasks of growing and getting food, the more food-thought pervades our lives. It is a form of attention that restores labor, rarity, experience, and danger to food’s appearance (its manifestations in the market and at the table) and its refusal (our rejection of unfit foods, our dieting). This parallels the new complication of other phenomena of bodily attention, specifically modern exercise and sex. It will be objected that the care for food is a fascination only of the rich; this is false. Stretching from high to low, the commands to lose weight, to undertake every sort of diet for purposes of health, to enjoy food as entertainment, to privatize food-care as a category of inner, personal life (beyond the shared decisions of cooking and the family dinner), have communicated new thought and work concerning food to the vast middle and working classes of the rich Western countries.
I think there is something wrong with all this. Underlying my opposition is a presumption that our destiny could be something other than grooming—something other than monitoring our biological lives. Many readers will disagree. Their disagreement is only legitimate if they are prepared to stand up for a fundamental principle: that what our freedom and leisure were made for, in our highest state, really is bodily perfection and the extension of life. One of the main features of our moment in history, in anything that affects the state of the body (though, importantly, not the life of the mind), is that we prefer optimization to simplicity. We are afraid of dying, and reluctant to miss any physical improvement. I don’t want to die. But I am caught between that negative desire and the wish for freedom from control. I think we barely notice how much these tricks of care take up of our thinking, and what domination they exert.
The reason to eat food is no longer hunger. There is now no point in your day when if you were to go without a meal you would fall into physical jeopardy. You “get hungry,” to be sure, but probably from birth to death never go hungry, though enough people in the world still do. This is a change in life.
Confusion arises around any need that never gets fully activated. We know we have to eat because otherwise we will die. We direct our thoughts to the activation-point, the terminal condition, even though we’ll never approach it. We act as if we are under compulsion for decisions none of which are determined by this need—as if our provisional necessity were a fatal necessity. We have to eat; but we don’t have to eat anything in particular, so extensive are our food choices, any of which is sufficient for life. (Traditional societies always existed subject to conditions of scarcity; choice was circumscribed when one’s food was given by whatever would grow.) We have to eat; but we don’t have to eat at any given moment, so regularly do we eat, so lavish are our meals. (Though of course you get hungry just hours after eating—from sitting, or waiting, or being. Hunger is recalibrated in half-inches where once it was measured in yards.)
It’s now considered possible for some of us to get ourselves to states in which eating once again feels medically necessary, even with respect to the timing of daily meals. “My blood sugar must be down.” “Remember to stay hydrated.” You become “hypoglycemic,” that is, lacking in sugars. You become “dehydrated,” that is to say, thirsty. You reach a point where you get lightheaded, sick, unhappy without your food.
Truer states of privation can be achieved through exercise. (A kind of confusion is deliberately maintained between thirst, which activates in shorter timespans, and hunger, which is slow-building and diverse in its fulfillment—so that a glass of orange juice becomes a palliative to thirst that also contains, in the solids and sugars and vitamins and tangible substance, a kind of food.) But you don’t even have to wear yourself out to feel changes inside you. Enough of us monitor ourselves this closely in daily life. We become patients in a hypothesized emergency room, in which we move as specters.
These reduced states, and the ability to identify the feelings of them, go with a degree of discernment and class distinction. Lower-class people get hungry, and “we” get hypoglycemic. The redevelopment of biologically necessary hunger is considered morally superior to its widespread alternative, the lazy hunger of an addiction to abundance. The poor say they want lunch. Don’t believe them. Put aside the hunger of these people who endlessly crave “junk foods” and “drug foods,” fats and sugars—the obese overeaters, one of the last classes of people it’s socially acceptable to despise. The exalted need of the momentarily dizzy armchair athlete is counterpoised to the cravings of the obese underclass.
Historically, the modern project of food has always been associated with an end to scarcity. Before modernity, the multiplication of food meant that the supernatural had entered the mundane: as with the miracle of the loaves and fishes, Jesus’ carbs and proteins. Not until the 18th and 19th centuries did plenty come into view as a practical possibility. Its realization crept up on theorists by gradual developments. Malthus declared it impossible that agricultural production’s carrying capacity (arithmetically increasing) could ever catch up with the level of population (geometrically increasing) less than two decades before capacity proved him wrong. Malthus wrote in 1798. The last major “subsistence crisis” to strike the Western nations en masse, according to the agricultural historian John D. Post, occurred in 1816–17. (The starvation conditions in 1845–47 in Ireland and parts of Central Europe because of potato blight he disqualifies as an isolated last catastrophe.) Periodic famine would no longer be a recurring feature of Western life, though it had been a basic condition for all human societies since the early Holocene.
The early years of the 20th century ushered in a second transformation. The embrace of agricultural mechanization spurred a transition from the mere end of famine to the permanence of plenty, even over-plenty. The United States led the worldwide change, which took place from about 1915 to 1970. The application of machine power, specifically the arrival of tractors, made labor hours drop despite ever larger harvests. Crop yields increased per amount of acreage cultivated, massively so with the introduction of hybrid strains of corn. Bruce L. Gardner, reanalyzing most recently the agricultural and economic data for the period, notes that US farms produce seven times the amount of food they did in 1900, while having shed two-thirds of their laborers. “As late as 1950, food consumed at home accounted for 22 percent of the average US household’s disposable income. By 1998 that percentage had been reduced to 7,” while we manage to eat more.
We live far enough after the period of the modernization of food to condescend to its achievements. The technical achievement of super-abundance led to a predictable but short-lived celebration of technicized food itself: a commercial fetishism of the techniques of freezing and refrigerated transport, Swanson dinners and Birds Eye vegetables, and a lust for the re-engineering, preservation, and shelf stability that made Cheez Whiz and the Pringle out of smooth Wisconsin milk and bursting Idaho potatoes. Corresponding to modernization was an early modernism of food, a recognizable trajectory through attitudes well known to us from painting or design or writing. Postwar modernization theory held that modernizing was exclusively an economic-technical achievement, one which stood apart from the sorts of aesthetic regimes that succeeded one another in progress in the arts, but it was not so. “Food science” represented a moment of human progress, recognized and regnant, transformed into culture. When mid-century food technics are satirized today by right-thinking people as kitsch—Tang, fish sticks, and Wonder Bread—a moment of utopian progress is reduced to folly, as can happen with so much of the naïve ecstasy and radiance of all products of the machine triumphant.
Today we participate in a late modernism and even a postmodernism of food. We witnessed, after the triumph of a previously unquestioned project, a characteristic latecoming struggle around the nature and direction of progress. First, in the late 1960s, came reactions against the inhuman technical character of food science and “agribusiness.” Critics in this phase pitted themselves against consumer capitalism. This initial reaction was romantic and primitivist, associated with the counterculture and the movement “back to the land,” just a few decades after productivity gains had led an agrarian population to leave it. It brought a call to the East for mystic authenticity in the culture of “health foods”— tofu, brown rice, yogurt, seaweed, wheat germ, made from the live spirits and microbes excluded in industrial processing. (These were the parts that were said to live and germinate, against an antiseptic modernist technics of death: the Bomb and pasteurization were made by the same culture. The historian Warren Belasco has extensively documented both the actions and the imagination of the early food counterculture.) This counterculture, too, introduced its own counter-technics of food-medicine, in remedies either Western but crankish and eccentric (like the chemist Linus Pauling’s early championing of Vitamin C, a scandal in its time), or Eastern against the West (macrobiotics, acupuncture).
Soon a more flexible capitalism proffered a new set of options which allowed the dissolution, or simply the side-by-side juxtaposition, of opposites, and a new field of cooperation. The standard of “health” perhaps foreordained that the dropouts would deliver themselves back into the hands of experts. Proselytizing pioneers of the counterculture became the arbiters, physicians, and best friends of an expanded version of the Western capitalist culture they believed they had critiqued: figures like Andrew Weil, MD, the “healer,” and the organic “growers” of Cascadian Farm. The conceit of magic pre-existing “natural” remedies and supplements and minerals (red wine, chocolate, “fiber,” “antioxidants,” et cetera) was brought under the mantle of medical testing and food enhancement. Food scientists and processors ceased to fight their former opponents, as they were licensed by the counterculture to formulate new concoctions and mine new markets evaluated, not by opposition or refusal, but along the common metric of health. The post-modern moment can be identified whenever the tug of war between “scientific” food progress and “humane” food reaction produces more business for both sides. In our moment, the options exist in plural, and the most self-satisfied individuals graze from two troughs, the scientifically fashioned and the organically-romantically grown, with the same rationale of “health” for both.
(Food science, to make this clear, was not intrinsically evil nor flawed, and the food counterculture was not just about optimizing toothsomeness and health, but about opposing an established order wherever it had become complacent. Whenever utopians present the substance of their wishes, a fair number of their dreams do come true, though sometimes in forms other than the ones they had anticipated. The Victorian prophet Winwood Reade in 1872 predicted that three inventions would transform the world (the wording comes from Lewis Mumford’s later summary): “a fuel substitute for coal, aerial locomotion, and the synthetic composition of food.” Each came to pass or was obviated within a hundred years. Small-scale petroleum drilling had already begun in the 1850s and 1860s and was sufficient as a coal substitute until recently. The Wright brothers made their successful experiments in aerial locomotion in 1903. The synthetic composition of food didn’t become necessary because of the revolution in productivity. Rather, “synthetic” food became possible in another way—not synthesized ex nihilo but through the chemical refashioning of foods that existed in “natural” form. Reade concluded with two “further triumphs . . . left for man”: “The extinction of disease and the achievement of immortality.” We must recognize that the 21st-century enterprise of health is an open attempt at one (the end of disease) and a covert search for the other (immortality). Health today cannot be understood apart from its refusal of mortality—not through the discovery of a fountain of youth, but through the bargain that, if men and women will obey health guidelines and regulate themselves as they are told and buy the right products for care, they can lengthen their life spans without any absolutely fixed term.)
The contemporary transformations of food are associated with a new impersonality imparted to the field—making of eating a “hobby,” one pastime among others.
New food entertainments have changed the character of the tradition devoted to cooking and dining. Food generates constant discourse. It has given interest in food an increasingly abstract character, as a “spectacular” function of food can be divided off from its practical, gustatory function. We are learning to take our foods at a remove. First, the contemplation and nutritional analysis of our foodstuffs becomes a semi-autonomous “scientific” sphere independent of any particular meal or mouthful. Second, our entertainments create a standpoint of satiety or disinterest from which we can contemplate food without hunger and find pleasure in that contemplation.
We’ve had an explosion of “food writing,” as the bookstore category is renamed from “cooking” to “food.” We have memoirs in food, novels with recipes, high literature which expands to absorb a “canon” of 20th-century food writers from A. J. Liebling to M. F. K. Fisher to Ruth Reichl. Chronicles appear of a single comestible through history, cod or salt. The newsstand purveys magazines that range from the scientific purism of Cook’s Illustrated, to the aspirational luxury of Gourmet, to the academicism of Gastronomica. The Food Network delivers twenty-four-hour TV programming devoted to cooking and eating: interminable specials on barbecue, semi-celebrities peddling the delights of chipotle. (In Harper’s a few years ago, a well-meaning critic devoted himself to exposing that channel’s programming as “gastropornography,” trapped by the same leveling action between food and sex that makes all of our basic bodily desires into just one thing.) And late at night on TV, between the other paeans to desperation (“Been in an accident?”; “Foreclosure problems?”), are the pitches for miracle metabolism supplements, fat-burning capsules, and colon cleansers.
It may seem odd to think of food warnings and diet plans as entertainment, too. Certainly they’ve taken on the same spectacular character, though, and offer a linked way to spend your time. Should we eat wild or farmed fish; is chocolate healthy, and is red wine? Last month, the magazine Health featured an advertisement for its website: “Trying to figure out what fish or vegetable is safe to eat this week?” It wasn’t a joke. Nor was it just about contamination—it concerned a weekly shift in knowledge. The whole idea of food “news” announces the end of thousands of years in which there couldn’t be such a thing as “news” in food. On one channel, we have competitive eating, broadcast as sport; on another, a weight-loss game show, The Biggest Loser. On one, how an automated factory makes Ho Hos; on another, the nutrition report. The point is not that we’re “schizophrenic” about food, as some say—celebrating gluttony and advocating dieting. The point is that although we’re collectively amused along two separate tracks, both may have a common meaning and, perhaps, purpose. Their meaning lies in making food “discursive” along every axis. Their purpose? They may constitute a more fully integrated system at the level of social regulation, underlying what look like contradictory temperaments and local interests.
Kant’s aesthetics required aesthetic contemplation to be disinterested. A viewer must not take the aesthetic object to have a use, and must not conceive of the object’s physical desirability. Beauty must not be confused with attractiveness. A painted bunch of grapes should not inspire hunger. One should not think how well a painted horse would ride. In principle, real food should never be aestheticizable under this regime, because it will always be seen as an object to be enjoyed. Part of what is astonishing about the present order is that it does make food available, at times, for something more like disinterested aesthetic appreciation. And this allows the step backward from immediacy that perhaps lets us think of our mortality, our bodily incarnation in its journey toward death, as likewise groomable, accessible to recipe—and preparation, and taste—if not yet subject to an absolute choice of when we die.
The most modern and elite of our eaters find that careful discriminations, taboos, and rigorous exclusions still lead down both paths without contradiction: toward the totally engineered and compressed vitamin pill; and toward the organic, sourced, inherited, unmodified “whole” food—not “made” but harvested, not altered (in this imagination) except by joyful labor. You can eat your PowerBar, product of an engineering as peculiar as any the world has known, and wash it down with unpasteurized unfiltered cider pressed by Mennonites, and on both fronts, you find it good.
The food imagination of our moment is different than we think, and needs to be excavated, category by category.
Food dreams: A friend of mine sighs and says he wishes he didn’t have to eat. He wishes he could take a pill that would cover all of his physical hunger for two weeks, say, or a month. Then he would only have meals when he wanted to, purely for pleasure: he would be completely delivered from the bodily responsibility to consume food. Another friend wishes for a magic food that could be eaten all the time, in satiating quantities, in different flavors, that would require exactly as many calories to chew and digest as it contained. This is the rumor about celery—that it has just as many calories as must be burned to process it. “Celery-food” would be calorically null. Then she would not have to monitor what she ate; then she could eat as large a quantity of food as she wanted, to gratify any hint of hunger, without it being incorporated into her body as weight.
Interior visualization: We’ve learned to picture our insides; and the pictures get better and better. “Beautify your insides”: an advertisement to young women for Metamucil, illustrated by a presumably gut-emptied fashion model complacently reclining against a drape. In forensic television like House and CSI, we see first the sick patient in the hospital bed, or the expired corpse on the slab, diagnosed by the team; then a computer graphic of the insides of the stomach or heart wall or liver in churning color; then the modifications to show the new pathological state: ugly, green-infested, bacteria pinging like lottery balls. We want, ourselves, never to approach that bad state. Never to be corroded. At such times, we really do want our insides to look beautiful. We have somehow seen our own healthy stomachs, hearts, or livers, and feel a longing for them, as we learn what has gone wrong for some fictional person.
Food fears: When a third friend is about to eat a food that has fat, especially meat fats or hydrogenated oils, he imagines the interior arteries of his heart becoming clogged with a yellow-white substance, like margarine or petroleum jelly. When he eats calories or fats, he imagines individual particles entering shrunken fat cells in his belly and sees them stretch and become oblong. When he eats meat, he imagines it passing through his colon with a rough texture that scrapes the walls, roughening them, to make them susceptible to cancer. He conceives an evil superfat, beyond palm oil, soybean oil, and trans-fatty acidic frying oil, that can spread from food into every cell, hardening the arteries, clotting as plaque, making him obese.
Dogma of total effect: We’ve moved toward a dogma of total effect when it comes to food. There is nothing that goes into your body, on this view, that doesn’t permanently affect its makeup. Nothing fails to be incorporated. “Everything that goes into food goes into you,” runs the frightening apothegm of one health-food advertisement. It is as if the somatic record registered all the adulterations and processes of our foods even more than the ingredients. Perhaps, too, the type of labor that went into them—even the intention with which that labor was done. It’s like the idea of a perfect fossil record or perfect calculation of what you consumed over your whole life, as if your body grew or aged differently based on every single item to pass your lips. Not for us, the quite reasonable supposition that the majority of what we eat doesn’t change us; that human beings have been digesting so much for so long that they eat disparate foods with identical outcomes.
Healthy taste: Taste is conditioned by ideas. The physiological base emerges in infancy and is shaped by one’s “native” cuisine. Sweet, salt, sour, and bitter are said to be physiological universals. Subsequently taste is conditioned by will and effort. The savors of coffee, bitter chocolate, wine, beer, anchovies, brine, sharp cheeses, Brussels sprouts, all have to be learned with intellectual and physiological effort—for some people with more difficulty, for others less. The taste of health is one of these. A healthy taste will be the aggregate of all the carrots, apples, breakfast bars, protein shakes, fruit smoothies, chewable vitamin C tablets, tempeh, green tea, and zinc lozenges you’ve ingested. It is a rubbery ball of taste, added to and modified by each new item that’s “healthy” that crosses your lips.
The most confusing foods to taste now are those that are bad for you but delicious and those that are “natural” or rare but flavorless. You learn to taste artificial grape flavor as “cheap” or chemical-like. With “refined sugar” or fried food, you say, “I enjoy having a Coke and onion rings, but I get a headache afterward”—and then you get the headache. Taste also becomes a product of comparison and decision. At a highway rest stop, I wanted a hamburger, but I bought a turkey sandwich. The roll was cardboardy, the turkey metallic, the lettuce white; but underneath it all, because of the self-denial and the act of choice, it had a peculiar flavor—it tasted healthy. When you eat the supermarket tomato that tastes terrible, it is “terrible”; when you bite into the heirloom tomato that happens to be tasteless and watery, you adjust it to taste “real.”
The Jetsons: When George and Jane Jetson wanted to make a meal, they entered the kitchen, reached under the counter, and pulled out a little pellet, a pill. They put the pellet in the food oven, closed the door, opened it again, and now it was a steak. The pellet always became some very basic food—a plump turkey, a bowl of mashed potatoes, or an ice cream sundae with a maraschino cherry on top, for Elroy. Something unmarked; something common. And something whole.
This vision of an ideal food future managed to get our real food future wrong in exactly the right ways. We have experienced a bifurcation of food along Jetsonian lines. The Jetsons had a pellet. Now we eat the pellet. We call it a PowerBar or a multivitamin pill. Every “nutrition facts” statement on a label reminds us that our foods are assembled of thousands of elemental sub-parts, fat and protein and carbohydrate and vitamin and mineral parts—more pellets that displace the whole. We have constituent foods, fundamentally.
On the other hand, we have provenance foods: foods with names and locales, that passed through particular hands to take their current forms, and carry something of their geographic and personal attachments with them anywhere they go. We don’t get turkey and mashed potatoes. Instead, Vermont organic fingerling potatoes with Sardinian olive oil, and a free-range grain-fed turkey raised cruelty-free. Food with origins, and sourced foods that carry with them a certain way of life and experience. What you’re doing when you’re eating these foods is also eating Vermont and Sardinia, and the labor performed in those places. Or, more genteelly, you’re imagining what sort of lifeways, which you experience vicariously, go on in such places of origin. It’s not labor but a kind of bliss, with fish leaping into nets and grains bending to the scythe. Perhaps you even imagine what goes on in the consciousness of the animals in the lands where the turkeys range free.
Gourmet vs. foodie: The gourmet of past decades wed himself to a single place: Western Europe, and more particularly France. He learned to cook a single alien cuisine, French, and his time and attention went toward two basic activities, cooking and importing. He cooked a limited palette of dishes and learned a set sequence of techniques. The gourmet knew foods that lent themselves to travel: wine, cheese, sausages (or charcuterie), pâtés, later coffees and chocolates. He would have a shop in his wealthy town that sold these items ("the gourmet shop"; the "wine and cheese shop"), and as an "expert" he could match wits with the experts who worked and shopped there. Julia Child exemplified the old gourmet.
The foodie differs in having the whole world at his fingertips. There is no one other region. There is the globe. If his cookbooks are European, they gravitate first to Europe’s warm and “savage,” uncivilized places: Provence, Southern Italy; then quickly slide to brightly illustrated new books carrying him to Turkey, Morocco, Vietnam, India. No single tradition exists for him to learn, no singular importers to patronize, but an ocean of ingredients that wash up on his shores—in the high-end supermarkets, which pretend to adventure among wild foods (the one nearest the n+1 office is called “Foragers”; Trader Joe’s represents the supermarket as a colonial trading station), in the old gourmet shops which survive on sufferance, and in the ethnic groceries, where the “ethnic” food may even include the foodie’s native, childhood cuisine. The foodie wades out and swims in possibility. And then, surprisingly, many a foodie will deliberately restrict his range. He begins to set rules or laws for himself that make the quest for food harder and the thinking more complex. Undiscovered foods only; “authentic” restaurants only, or kitsch diners, or barbecue joints; organic food only; local or farmers’ market food; historically reconstructed food; raw food or slow food only. A foodieism even exists of carnivorousness, or disgust: eating body parts that have become disreputable or rejected. Anthony Bourdain, traveling five thousand miles in business class to eat a sheep’s eyeball, is one type of foodie hero, the authenticity-, nature-, and experience-devouring buccaneer who acknowledges disgust only by offending cautious philistine eaters. The rules or laws of his restrictions may be contradictory, operating in different food spheres, yet the true foodie can keep several going simultaneously, or slip from one regime to another. Not everyone undertakes the path of restriction, or follows it rigorously, but enough do, and the trait is essential.
The gourmet was always close to the snob. He wanted to be an aristocrat and identified with tradition. The foodie comes after the eradication of tradition. He is not like an aristocrat, but like someone who has stumbled into obscene wealth by what seems to him happenstance—as, judging globally, any of us in the rich countries has stumbled into wealth by the luck of where we live, and into food-wealth by our system of cheap overabundance and our access to all the migrant cuisines that shelter in America and Europe. There is no food we can't access. There is no food, moreover, that can't be further enchanted by our concentration, restriction, choice, and discrimination between better and worse specimens. We add the value of our intellectual labor, our "finishing" of the world's raw materials. Foodieism is a natural hobby for first-world professionals, ostensibly taking up the world, but referring back to the perfection of the enriched, corporeal self.
It would be dishonest to suggest that we have not had critiques of the new food order. They come in two forms, and can represent a kind of fraternal warfare. First, we have exposés of our “fast food nation” and brilliant critiques of the national addiction to corn syrup. These come from the partisans of “nature,” who fight industrial or mass food. Second, standing against them, the chef-, kitsch-, or ethnic-food worshipping gourmands occasionally strike back. They accuse the nature-lovers of tampering with pleasure, acting like killjoys, or speaking condescendingly from a position of wealth (from which the gourmands also speak, but never mind). It is a specimen situation of what Bourdieu called the “law of mutual lucidity and reflexive blindness” among practitioners of critique.
The most notable voice from the side of the gourmands may be UCLA sociologist Barry Glassner. His strategy in his book The Gospel of Food (2007) is to let nutrition scientists contradict each other in such a way as to create a chaos of knowledge. In this chaos, the best solution for eaters—until better information is available—may be to eat what you most enjoy, which, for Glassner, happens to be expensive Southern California restaurants in the contemporary foodie mode.
The most important voice in the "nature"-oriented critique of food is the journalist Michael Pollan. His The Omnivore's Dilemma (2006) seems to be the Silent Spring of this decade. It takes its great moral gravity from the orientation with which Pollan starts: that of the public good, the good of the environment, even the interests of animals. He does the investigative work to convincingly establish a critique of corn overproduction caused by the US's celebrated mechanization of agriculture and planting of hybrids. The consequence of reliance on a single crop and the necessary introduction of artificial fertilizers has been land damage. Tractorized corn farming causes overexpenditure of fossil fuels in an artificial economy sponsored by the state (leading to carbon pollution and fuel depletion). It makes for the cheapening of feed for food animals, particularly beef cows and broiler chickens, whose expanded ranks add wastes to the environment, while their removal from dependence on grassland allows humans to abuse them in ever more confining pens. The subsequent lowering of the price of meat leads humans to eat too much flesh, because they can afford it, and thus suffer new health problems. Meanwhile, the processing of large quantities of corn into too cheap a form of "unhealthful" nutrition, in the use of corn syrup to sweeten and calorify just about every processed food—sweetening foods beyond what is needful to the palate—creates obesity and ill health by a second route. And corn is further processed into all sorts of food-products (fillers, binders, emulsifiers) which are not easily found on their own in nature.
The difficulty with Pollan’s argument is that he winds up moving from the large-scale phenomena of environmental effects to the small-scale phenomena of human body composition and health. One of the shocking moments in his book describes a chemical test that proves how much of the average American body has been built from molecules originating in corn. It’s a scene made for horror, and more than one person I know who has read it developed a visceral disgust for corn products in foods. (As with Upton Sinclair’s The Jungle, a novel about labor abuses in the meatpacking industry, with which Sinclair famously “aimed for America’s heart and hit it in the stomach,” Pollan’s discoveries may often lend themselves less to solutions for the collective good than to immediate, personal self-protection.) It’s worth remembering that this kind of use of corn was one of the great dreams of progress in food: a single source of nutrition that could feed everyone, or be broken down and recreated in any form—a dream of humankind since the time of God’s manna. Pollan’s critique can become purely privatistic (in the worries about individual weight gain and disease from meat and corn syrup) or turn into a hostility to progress and the widespread, easy provision of food itself. His book goes on to treat the large-scale success of organic farming, which he believes has cost the organic movement its soul; be that as it may, Pollan’s only successful alternative, a “grass farm” where all livestock farming depends on the successful management of grass, seems to represent a model that simply can’t be scaled up for a multitude of eaters. It depends to a high degree on charismatic genius simply to run from day to day.
After this, The Omnivore’s Dilemma becomes essentially self-directed. Can Pollan forage food? (He learns to pick extremely expensive mushrooms.) Can he kill his meat? (He hunts a wild boar.) How will he enjoy his book-ending meal, a dinner so maddeningly elite (the fava beans don’t help) as to make one momentarily sympathetic to Pollan’s most petty critics—those who say that he defends “nature” in the form of food, not for the good of mankind, but to justify the sort of egotistic luxury trade that pays the rent on a thousand Whole Foods Markets.
Whether or not one admires the path Pollan takes, one thing is clear. In his solutions, and in his reasoning about first principles, he is anti-progressive. He no doubt votes for Democrats; supports the right causes; has the right attitudes. He defends “nature” but moves quickly from outer nature (trees, earth) to an inner nature (or “human nature”) which he believes is fixed in crucial respects. The invariant parts of human nature, for Pollan, are rooted in our evolutionarily acquired attitudes to just such things as food. As is true for much other conservative thought about human nature (conservative with a little c), Pollan has a fundamental belief that past practices are likely to be superior simply because they were the past, “our” past. They must suit us in some deep way. He holds an underlying preference for the “nature” to be found beneath the vanity of reason. The line of thought in his short In Defense of Food (2008) intensifies this feeling. Pollan gets away from contemporary ever-changing studies of nutritional science (the material Glassner employed to sow doubt) by appealing to his evolutionary model, in which human beings are adapted to eat many very different traditional diets, so long as their diet is not the new “Western diet” of 20th-century invention. He seems to believe, against present food culture, that adopting any traditional diet of his grandparents’ generation or earlier would lead us now, when combined with modern medicine, to live longer than we will on our own current diet (post–World War II Western foods). He has much to say in this book that is more honest than what other mainstream figures will allow themselves to say—for example, that food anxiety is beneficial to “the food industry, nutrition science, and—ahem—journalism” but not to eaters. 
But Pollan’s philosophical commitment to “tradition” as his truest guide, and the imputation that reason is always inadequate to decide anything “complex,” like human dietary or social practices, distorts his thought in a particular way.
The most progressive food philosophy of the present day, in the strict sense of progress as opposed to conservation, is vegetarianism. On the basis of reason and morality, it calls for a wholesale change in the way that human beings have always eaten, and a renunciation of a central part of the human dietary past and all its folkways. Vegetarianism has not given up on utopia. The point in The Omnivore’s Dilemma at which Pollan must lose the goodwill of many of his readers, I think, is in his discussion of vegetarianism. I speak as a non-vegetarian. (Non-vegetarian because I am immoral, not because I think there is any good argument for carnivorousness. It is not true, what some philosophers say, that genuinely to hold a belief is necessarily to act on it. I hope I will learn to be moral, by and by.) Everything in Pollan’s book concerning meat animals and their damage to the environment points to the conclusion that we should stop eating meat in any serious way. It is at this point that Pollan’s traditionalism splits with his hope for social change, and one sees which comes first. He will not stop eating meat, because humans have always eaten it.
Non-dogmatic vegetarians are often the eaters who seem to have the greatest stake in progressive food production and food chemistry while still paying attention to the socially harmful forms of overproduction that critics like Pollan identify. For them, Quorn (protein made of fungus) and Boca Burgers (protein made of soy) and multivitamin pills all have their place in a rational and future-oriented diet, just as heirloom lentils do, or quinoa, or Thoreau's beloved beans.
Pollan really possesses, as his true endpoint, not social change for the sake of the social, but change for health. Health may be expanded to include what our medicalized culture calls “quality of life”—satisfaction of the palate, crumbs of happiness, dinner table sociability, and Pollan’s sort of natural piety—but it never ceases to be health. Its fulfillment may require progress in science and progress in medicine. But it is always, fundamentally, progressive in an antisocial way. Though it claims to purify and strengthen the body politic, health has nothing to contribute to (horizontal) solidarity and democracy. It always leads individuals back into themselves, as those selves try to meet the (vertical) demands of experts. Any focus on optimization and purity has a way of contravening the attitude of democracy. Democratic imagination desires that which is unlikely, unfitted to itself, unfit; it incorporates the sick and unknown not just for the sake of justice, but for a reckless joy.
Is there anything I know that Pollan and Glassner don’t? I know that each of them can only escape the system of flawed nutritional science by looking for more and different science. And yet if one were really to get out of this system, one would have to embrace a will to discover a different source of value.
The rules of food, of sex, of exercise, of health, give us ways to avoid facing up to a freedom from care which we may already have within reach. This would be an accomplished freedom from biology, lacking nothing, which we simply don’t know what to do with. What if life were not really a possessive commodity that came in quantities that you gained or lost by your efforts? What if there were no further overcoming of some obstacle (disease, mortality) still to be attained, and we are now in the era of life assured and made free? If so—well, how should we act?
What is health? It is stored care.
It is good foods intaken, converted into a kind of currency, with your body as piggy bank. It is vile careless pleasures kept from the mouth, rejected.
It is a set of predictions about the future of your body based on correlations from others’ lives. It is a set of attitudes and feelings about your body as it is now, based on introspection, feeling, combined with chancy outside expert information.
It is putting in the hours of exercise to keep your heart pumping far into your old age. It is not bad, nor good. It is not assailable, nor is it the only truth.
I can’t help but want to live longer. I also want to live without pain. This means I want “health.” But when I place myself at a point within the vast constellation of health knowledge and health behaviors, I can’t help but feel that these systems don’t match up with my simple projects of longevity and freedom from pain. There is something too much, or too many, or too arbitrary, or too directed—too doom-laden, too managerial, too controlling.
The ultimate quarry and ultimate obstacle in any different way of thinking about food is this concept of health. How would one truly get outside of the rules of the game? By rejecting health as a goal, and choosing some other reason for living?
We have no language but health. Those who criticize dieting as unhealthy operate in the same field as those who criticize overweight as unhealthy. Even those who think we overfixate on the health of our food call it an unhealthy fixation.
But choosing another reason for living, as things now stand, seems to be choosing death. Is the trouble that there seems to be no other reason for living that isn’t a joke, or that isn’t dangerous for everyone— like the zealot’s will to die for God or the nation? Or is the problem that any other system than this one involves a death-seeking nihilism about knowledge and modernity, a refusal to admit what scientists, or researchers, or nutritionists, or the newest diet-faddists, have turned up? . . . As their researches narrow the boundaries of life.
Health is our model of all things invisible and unfelt. If, in this day and age, we rejected the need to live longer, what would any one of us rich Westerners live for instead?