“It’s normal now for people to say, ‘I can eat a steak, but the cow’s brain? That’s the most extreme thing I’ve ever heard,’” says health coach and offal cookbook author Ashleigh VanHouten. “Throughout history, this was never a question. It wasn’t like, ‘In what culture was the brain eaten?’ Everybody ate all of it.”

Traditionally, many cultures have regarded animal brains as a delicacy, and not just because of an ethos of zero waste. Brain’s texture and flavor—rich, fatty, and delicate—are unique, and as cuts of meat go, this one is relatively hard to come by. Not only does each animal yield just one brain, “it is extremely difficult to cut through a skull,” says food writer Janine Farzin, who demystifies organ meat consumption on her blog, Offally Good Cooking. “You really see how well-protected our brain is.” The labor-intensive cuisine of elite Roman banquets included stuffings and soufflés made with pig’s or calf’s brains, among them one memorable recipe that flavored the organs with roses. At the Tao Heung Museum of Food Culture in Hong Kong, a diorama of an Imperial Chinese feast features raw monkey brains perched atop slices of cucumber.

The brain’s nutritional content is unique, too: It’s rich in several nutrients that are essential for brain health. Farzin explains that the organ contains high levels of choline, serine, and B-vitamins, and when it comes to omega-3 fatty acids, “you can find them in seafood, and you can find them in brain. It’s the only place in the animal that you can find them” in high concentrations. As a result, brain is believed to have played a crucial role in the diet of prehistoric peoples in landlocked regions with limited access to seafood.

In many parts of the world, people never forgot the value of eating brain. In South Asia, stir-fried maghaz is especially associated with the Muslim festival of Eid, prepared separately from the meat when a goat or sheep is slaughtered for the holiday feast. Mexicans use brains as the filling in tacos de sesos. Italians make cornmeal-dusted frittelle di cervello, and the Minangkabau people of Sumatra are known for gulai banak, brains stewed in coconut curry. The Chinese tofu pudding dòu fu nǎo, literally “tofu brains,” is brainless (it’s named for its silky texture), but pig brains are commonly consumed in Sichuan hotpot and other Chinese dishes.

Yet for many in the modern United States, there’s something uniquely unpalatable about eating brains, a squeamishness that goes back only a few generations. Before the mid-20th century, Americans treated the brain like any other cut of meat, especially in areas where livestock animals were raised. At least one company, Rose, still markets canned brains soaked in milk (soaking in milk, which draws out the blood, is a typical first step in preparing brains). Scrambled eggs and brains was once a classic American breakfast pairing, appearing in Fannie Farmer’s influential 1896 cookbook and many others. “When [brain] is lightly cooked and pan-fried, it has a very similar texture to scrambled egg,” says VanHouten, who included a recipe for the dish in her own cookbook, It Takes Guts. “Mixed together, you barely even taste [the brain]. It’s just adding a little bit of richness to your eggs.” Farzin points out that you can use brains the same way as egg yolks, even in custard-based ice cream or an emulsified “brainaise.”

Many brains from American livestock are exported to Mexico, where they are a delicacy.

By the late 20th century, the American palate had decisively shifted away from brains due to several historical factors. The perfecting of the commercial deep-fryer increased consumer preference for crispy, crunchy textures, not easily achieved with pillowy-soft brains. And after scientists established the link between cholesterol and heart disease in the 1950s, an increasingly health-conscious public started to shy away from fatty foods. In 1977, with the publication of the report Dietary Goals for the United States, the U.S. government officially endorsed a low-fat diet as the healthiest choice, and numerous cookbooks and lifestyle magazines followed suit. A quarter-pound of beef brain, for all its nutritional value, also contains more than 1,000 percent of the recommended daily intake of cholesterol. VanHouten describes brain as “not the thing that I’m going to be cooking the most frequently,” emphasizing that it should be consumed in moderation rather than avoided outright.

Another factor was a change in attitudes toward organ meats in general in the United States and some other Western countries, such as the United Kingdom. In the postwar boom of the late 20th century, economic prosperity and the intense industrialization of livestock farming made cuts of meat once considered choice and rare, like steaks, increasingly available to consumers. While red meat became, in the words of food anthropologist David Beriss, “the symbol of American success,” organ meats came to be looked down on as the food of poverty and struggle. Offal was also increasingly associated with immigrants, including earlier arrivals whose descendants had since prospered and joined the red meat–eating mainstream. “Our particular form of upward mobility is to leave everything behind in order to not be what we were,” American cookbook author Betty Fussell told The New York Times in a 1993 article on the decline of offal-eating outside of immigrant communities. Far from the days of Fannie Farmer’s brains and eggs, today the United States is the world’s largest exporter of edible offal.

Farzin recalls how during the 1950s and ’60s, her mother grew up eating offal in a Portuguese immigrant family, but pressure to eat like her American peers affected how she viewed her cultural cuisine. On a family visit to Portugal as a teenager, Farzin’s mother was aghast to see “brains on the menu, three different ways” at an opulent hotel restaurant. “It was the fanciest place my mom had ever been in her life,” says Farzin. “She was so disgusted, because she was like, ‘Wait, I thought that we ate this food because we were poor?’”

By the late 20th century, despite the best efforts of open-minded celebrity chefs like James Beard, who served brains and eggs to luncheon guests, and Julia Child, who recommended different cooking times for lamb, pork, and beef brains, many Americans took for granted the idea that brains were not something “civilized” people ate. Eating brains was even deployed as a racist trope to depict foreigners as exotic and savage. As recently as 1984, Indians were depicted feasting on freshly decanted monkey brains in the film Indiana Jones and the Temple of Doom, a garbled misrepresentation of a (probably never common) practice in ancient China.

In this shock-horror context, monkey brain served as a stand-in for the ultimate off-limits delicacy: human brain, still the favorite snack of many monsters in popular culture. Since 1975, Dungeons & Dragons players have been fighting off the octopus-headed, brain-sucking illithids, or “mind-flayers,” one of the game’s best-known original creations. In 1985, just a year after Temple of Doom, filmmaker Dan O’Bannon invented one of the most familiar tropes in modern horror: the brain-hungry zombie. While the undead eating the living is a concept as old as Gilgamesh, it wasn’t until O’Bannon’s cult horror-comedy The Return of the Living Dead that they were said to specifically crave gray matter. As one zombie in the film explains: “It takes away the pain of being dead.”

This Sumatran brain dish is also called gulai otak (gulai being an Indonesian style of coconut curry).

But there have been times and places where eating a human brain was not only acceptable, but customary. “I don’t see any reason why if you were cannibalizing someone, for whatever reason, you wouldn’t consume the brains,” says Bill Schutt, emeritus professor of biology at Long Island University Post and author of Cannibalism: A Perfectly Natural History. In Europe, prehistoric human remains from 100,000 years ago show signs of cannibalism, including cracked-open skulls from which the brain was extracted. Well into the modern era, cannibalism as a cultural practice has often included brain-eating, with one especially notable example being the Fore people of Papua New Guinea. Their custom of brain-eating had uniquely devastating consequences that illustrate another reason for the modern American aversion to brains: the fear of disease.

Schutt explains that in the 1950s, some 200 people, or “about one percent of the population of the Fore, were dying every year from this horrible disease, and there was all sorts of speculation about what was causing it.” The Fore called it kuru, from a word for “trembling,” for one of its key symptoms, but it was also known as “laughing sickness” because sufferers lost control of their emotions, as well as motor skills and other key brain functions. The debilitating condition was incurable, fatal, and affected women and children eight or nine times more often than men, leading some Fore to fear that kuru would spell their extinction.

As Western scientists struggled to find a medical cause, Fore customs came under scrutiny. Since the mid-19th century, the Fore had practiced transumption, or funerary endocannibalism, eating the entire bodies of their dead as part of mourning rituals. “By consuming their own dead,” an early 20th-century anthropologist wrote of the Fore, “they incorporate them into themselves, and so lessen not only the sorrow, but even the idea of loss.” Fore men often abstained from human flesh or ate only certain parts, wary of the dangerous spiritual energies of the deceased, but women were believed to absorb and neutralize these energies, making them the primary participants in transumption. For the deceased to pass on properly to the afterlife, a 2008 study reported, “the entire body had to be eaten and, for this reason, the women ensured that the brain was all eaten.” As the brain was considered a delicacy, mothers frequently shared it with their children. In some Fore communities, brain was even believed to help a child’s development.

Scientists now believe that the initial case of kuru was a spontaneously developed prion disease, which then spread to Fore women and children through endocannibalism of brain tissue. Prion diseases are named for the misfolded brain proteins that are believed to cause them. “The protein itself is so good at aggregating that it basically functions as an infectious particle,” explains Jonathan Drake, a neurologist at Brown University. Amyloid proteins in the brain stretch into sheets as part of their normal functioning. But in cases of prion disease, “they tend to stick together,” says Drake. “And that ends up being bad for the organism, because then the protein can no longer do what it’s ordinarily meant to do.” Clumps of stuck-together proteins cause pileups, killing healthy tissue and filling the brain with holes.

Drake explains that a malfunctioning prion is “so infectious that if you just come in contact with this [prion], it can get into your body and cause the same problem, just like a virus would,” regardless of whether the meat has been cooked. It can take years for the effects of a prion disease to build up enough for symptoms to emerge. Endocannibalism was already in decline among the Fore by the time kuru was medically documented, and they had ceased the practice by the end of the 1950s, but new cases of kuru were still being reported decades later in people who had eaten human brains as children. Sources differ as to the exact date, but by 2010, the last known victim of kuru had died.

This was the first, but not the only, time in the 20th century that the ugly specter of prion disease reared its head in connection with brain-eating. By the 1970s, with the mass industrialization of meat production and consumers losing interest in organ meats in the U.S. and U.K., undesirable animal parts were being rendered down into MBM, or “meat and bone meal,” which was used as a protein supplement for living cattle. Then, “in 1974, there was a huge explosion at one of these rendering plants in England,” Schutt explains, for which solvent chemicals were blamed. In the aftermath, Schutt says, many such plants “changed how they processed down cattle, and they got rid of some of the solvents that they were using.” The unintended consequence was the loss of a safeguard: those solvents may have been destroying infectious particles, and without them, contaminated nervous tissue could survive rendering and end up in cattle feed.

In Italy, brains are also combined with eggs in soufflé-like dishes that resemble ancient Roman recipes.

The lack of solvents and the practice of feeding brain tissue back to cattle combined to create an outbreak of prion disease in British cattle in the 1990s. Technically called Bovine Spongiform Encephalopathy (BSE), for the spongy holes it leaves in the brain, the affliction was popularly known as Mad Cow Disease. All meat from an animal infected with prion disease can potentially carry infectious prions, albeit at lower levels than the brain. While most known prion diseases remain within a species, BSE was unique in that it jumped from infected beef to humans. This resulted in at least 200 human deaths from a variant of the human prion disease known as Creutzfeldt-Jakob disease (CJD). Schutt notes that “pretty early on” in the outbreak of BSE, scientists drew parallels to the phenomenon of kuru among the Fore.

In the immediate aftermath of the BSE outbreak, 4.4 million potentially infected cattle were slaughtered in the U.K. alone, and many nations revised their safety regulations. Today, MBM is no longer used in feed for ruminant (cud-chewing) animals such as cows and sheep. It’s still an ingredient in some commercial pet food, with the caveat that the brain and other especially risky parts are not included. And since prion diseases often do not manifest until adulthood, it is now illegal in many countries to sell the brain of an adult cow for food. In the United States, only a few cases of BSE have ever been identified in cattle, and these are believed to have developed spontaneously. However, exports of U.S. offal and other beef products declined sharply in the years immediately following the BSE outbreak, reversing what had been a long-term upward trend.

The lasting reverberations of the BSE panic have resulted in what VanHouten calls “some outdated misconceptions around the relative safety of eating brain.” Brain still shows up as a shock or gross-out food on social media, with one vlogger going viral in 2022 for eating what he claimed was raw beef brain.

Modern proponents of eating brain emphasize that the organ is not inherently dangerous as long as precautions are followed. “When I’m purchasing [brains], it’s from farms that I’ve been to, and farmers that I trust,” says Farzin. VanHouten also speaks of the importance of getting fresh meat from healthy animals sold by reputable sources. Though there is some concern that a different prion disease (the ominously named Chronic Wasting Disease) may spread to humans through meat from wild North American deer, most people today don’t need to worry. Thanks to increased safety regulations and reformed feeding practices for livestock, such conditions are now extremely rare. For evidence, we might point to the many people around the world who eat animal brains today with no apparent ill effects.

Why haven’t we seen more prion disease outbreaks like kuru and BSE among brain-eaters? Schutt offers a possible answer. Most modern humans, he explains, “probably have some type of immunity, because in ancient history, our ancestors were exposed to this disease through cannibalism.” A study published in 2003 with involvement from kuru researchers put forth the theory that prion disease epidemics had been widespread in Paleolithic societies. Its evidence was a gene variant, found in many modern human populations, that provides some protection from prion diseases; the study suggests this variant was passed on as a survival trait when our prehistoric ancestors took ill from eating human brains. Schutt points out that endocannibalism was “relatively new” among the Fore when they began suffering from prion disease. It’s possible that kuru represents a tragic echo of what befell the first of our ancestors to crack open a human skull.

Strong flavors, extra steps of preparation, and the physical resemblance to body parts can make organ meats challenging for American meat-eaters, but in many countries, eating offal simply comes hand-in-hand with eating animals. Chef Ange Branca, who worked on an offal-focused dinner series in Philadelphia called Anatomy Eats in 2022, explained the approach towards consuming animals in her native Malaysia: “We have to learn how to use everything, because you paid for it. It’s not going to go into the trash can,” she said at one of the dinners.

In recent years, a growing number of food writers and recipe creators, like Farzin and VanHouten, have been working to bring animal organ meats back into the spotlight. Both find that the nutritional benefits of organ meats far outweigh the extra preparation involved in making them taste delicious. Furthermore, says VanHouten, if you can accept the eating of animals, eating every part is “the most ethical, reasonable, and sustainable way to do it.”

For better or for worse, people have been eating brains since prehistoric times, and while it may not be as widespread a custom as it once was, it’s one that shows no sign of stopping. “It’s more of a mental hurdle than an actual physical safety concern,” says VanHouten. “It looks like a brain. And you have to get yourself over that.” But if you can, you might even discover that you like it.
