Did cooking make us human?
A review of Richard Wrangham's "Catching Fire: How Cooking Made Us Human".
Humans are a remarkably capable species. Collectively, we’ve built (and toppled) civilizations, created works of art, and flown to the moon. But how did we get here?
Scholars of human origins have lots of hypotheses, none of which are mutually exclusive. These include: genetic changes (either sudden or gradual), tool use, cultural adaptations, meat-eating, and more. In Catching Fire: How Cooking Made Us Human, anthropologist Richard Wrangham suggests another possibility: the advent of cooking.
Wrangham brings a variety of sources of evidence to bear on this “Cooking Hypothesis”, ranging from anatomical changes in early hominids to studies of the human digestive system. Like many accounts of human evolution, it’s hard to outright disprove (or verify) this theory, but I found the evidence compelling and I think it’s worth considering alongside the numerous other accounts. I’ll describe that evidence in more detail throughout the post, but the key insight is given by Wrangham in the book’s introduction (pg. 14):
Cooking increases the amount of energy our bodies obtain from our food.
That is, applying heat to food1 generally makes it easier to extract calories: it’s a kind of “externalized” digestive system. As Wrangham argues, cooking food led to all sorts of knock-on effects in human history, with impacts on human anatomy and culture. In some ways, we can even think of modern food processing techniques as a natural continuation of this process of ever-increasing caloric efficiency—with all its attendant benefits and challenges.
In case it’s not clear, I think the book is worth reading for anyone interested in human evolution—or, indeed, anyone interested in the importance of food to human culture and biology. My goal in the remainder of the post is not to summarize the entirety of the book, but rather to focus on the evidence in support of three key claims that Wrangham makes in the book: first, that cooking makes food easier to chew and digest; second, that doing so frees up both time and energy for other things; and third, that cooking probably began almost two million years ago and contributed to the transition from Homo habilis to Homo erectus. I’ll conclude by discussing the relevance of the Cooking Hypothesis to contemporary debates about ultra-processed foods.
A final caveat: I’ll be presenting the evidence as it’s presented in the book—in some cases, I’ve read the published academic articles a claim is based on, but I’m not an expert in the field and I haven’t done an exhaustive or comprehensive review of the literature myself.
Claim 1: Cooked food is easier to chew and digest
Non-human animals seem to get by just fine on raw food—is it really the case that cooking food makes it easier to eat?
At least for chewing, it’s relatively straightforward to get an intuition for why this is true. For example, it’s much harder to eat a raw potato than a boiled one: that’s because cooking breaks down the fibrous cellulose walls of the potato’s cells, which makes it easier to break apart with the teeth. Applying heat also gelatinizes the dense starch granules stored inside plant cells, causing them to swell and eventually burst—this is especially important for digestion, but it also makes starches easier to chew. Cooking also tenderizes meat, reducing the amount of chewing required (and thus the time involved in eating). A 2003 paper by Wrangham and Conklin-Brittain contains a particularly illustrative example of a chimpanzee chewing its meal (an infant baboon) for nine hours. Based on this and other examples, the authors estimate (optimistically) a rate of caloric intake of approximately 400 calories per hour—which, given the caloric needs of early Homo erectus (about 2200-2500 calories per day), would require roughly 5-6 hours of chewing per day.2 That doesn’t leave much time for actually catching and preparing the meat.
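To make the arithmetic explicit, here’s a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (the 400-calorie-per-hour chewing rate and the 2200-2500 calorie daily budget); the variable names are mine, not the authors’.

```python
# Back-of-the-envelope check of the chewing-time estimate above.
# The 400 kcal/hour rate and 2200-2500 kcal/day needs are the figures
# quoted from Wrangham and Conklin-Brittain (2003); variable names are mine.

CALORIES_PER_HOUR_CHEWING = 400        # optimistic intake rate while chewing raw meat
DAILY_CALORIE_NEEDS = (2200, 2500)     # rough daily needs of early Homo erectus

for daily_need in DAILY_CALORIE_NEEDS:
    hours = daily_need / CALORIES_PER_HOUR_CHEWING
    print(f"{daily_need} kcal/day -> {hours:.1f} hours of chewing per day")

# Prints roughly 5.5 and 6.2 hours, matching the ~5-6 hours cited above.
```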
Ease of digestion is harder to assess directly than chewing. One approach is to study ileostomy patients, i.e., individuals who’ve had their large intestine removed. There are two distinct processes involved in digestion: the first takes place between the mouth and the end of the small intestine, while the second takes place in the large intestine. That first process is what produces the calories that are most useful to us, which means we can estimate how much energy a food provides by measuring its ileal digestibility—that is, how much of the food remains by the end of the small intestine (the ileum). Researchers can thus compare the ileal digestibility of cooked vs. raw food. As Wrangham writes, many cooked starches are quite easy to digest (pg. 58-59):
The percentage of cooked starch that has been digested by the time it reaches the end of the ileum is at least 95 percent in oats, wheat, potatoes, plantains, bananas, cornflakes, white bread, and the typical European or American diet (a mixture of starchy foods, dairy products, and meat).
In contrast, raw starches are much harder to digest (pg. 59):
Ileal digestibility is 71 percent for wheat starch, 51 percent for potatoes, and a measly 48 percent for raw starch in plantains and cooking bananas.
The reason for this is, as mentioned above, a process called gelatinization. This is what happens whenever we cook starch in the presence of water, including baking bread, producing pasta, or even thickening a sauce (e.g., making a roux with butter and flour). Wrangham writes (pg. 60):
The more starch is gelatinized, the more easily enzymes can reach it, and therefore the more completely it is digested. Thus cooked starch yields more energy than raw.
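To get a feel for what those digestibility gaps mean in caloric terms, here’s a rough illustrative calculation in Python. The digestibility fractions are the ones Wrangham quotes; the 100 g portion size and the ~4 kcal/g energy density of starch are my own assumptions, and the sketch ignores whatever energy the large intestine later recovers through fermentation.

```python
# Rough illustration of what the ileal-digestibility figures above imply
# for energy yield. Digestibility fractions are the ones Wrangham quotes;
# the 100 g portion and ~4 kcal/g figure are assumptions for illustration.

KCAL_PER_GRAM_STARCH = 4.0   # approximate gross energy of starch
PORTION_GRAMS = 100

ileal_digestibility = {
    "cooked potato": 0.95,   # ">= 95 percent" for most cooked starches
    "raw potato":    0.51,
    "raw plantain":  0.48,
}

for food, fraction in ileal_digestibility.items():
    kcal_absorbed = PORTION_GRAMS * KCAL_PER_GRAM_STARCH * fraction
    print(f"{food}: ~{kcal_absorbed:.0f} kcal absorbed by the end of the ileum")

# cooked potato: ~380 kcal; raw potato: ~204 kcal; raw plantain: ~192 kcal
```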
Cooking also seems to make proteins (such as those found in eggs and meat) easier to digest. One study compared the ileal digestibility of raw vs. cooked eggs in ileostomy patients, finding that digestibility was higher (between 91-94%) for cooked eggs than for raw eggs (around 51%).3 This is because of denaturation of the food proteins, in which the internal bonds of a protein weaken and ultimately cause the molecule to open up, losing its three-dimensional structure—which in turn exposes the contents more directly to digestive enzymes. As Wrangham notes, applying heat isn’t the only way to achieve denaturation: you can achieve similar results by adding acid or salt, or by drying the meat.
A related piece of evidence comes from an early 19th-century case study conducted on a man named Alexis St. Martin, who suffered a terrible gunshot wound that left him with an open cavity in his stomach. William Beaumont, an army physician, treated St. Martin and helped him recover—then used the misfortune as an opportunity to learn about digestion. Because Beaumont had direct access to the contents of St. Martin’s stomach, he could measure how long it took for different foods to be digested. Beaumont experimented with different food preparations, and found that St. Martin’s stomach had an easier time digesting food that was cooked, tender, and finely divided.
Beaumont’s finding conforms to our own dietary preferences for soft, tender food—particularly meat. Cooking meat breaks down collagen, the main protein in connective tissue, until it takes on a texture akin to jelly. This tenderizing process makes meat easier to chew and digest, and it also seems to make it more pleasurable to consume.4 Here, some readers (myself included) might note an apparent contradiction: it’s true that people like tender preparations of meat, but doesn’t that often involve cooking it less (like rare meat) or not at all (like steak tartare)? The resolution is that applying too much heat can have a countervailing effect on the quality of cooked meat, making the muscle fibers tougher and drier (e.g., as with an overcooked steak). As Wrangham notes (pg. 75):
Bad cooking can render meat hard to chew, but good cooking tenderizes every kind of meat, from shrimp and octopus to rabbit, goat, and beef. Tenderness is even important for cooks preparing raw meat. Steak tartare requires a particularly high grade of meat (low in connective tissue) and the addition of raw eggs, onions, and sauces. The Joy of Cooking recommends grinding top sirloin, or scraping it with the back of a knife, until only the fibers of connective tissue remain.
One other piece of evidence comes from studies of humans existing entirely on raw food, whether for ideological reasons (e.g., raw foodists) or because they did not have access to cooked food. Wrangham dedicates a whole chapter to this evidence, surveying various studies (e.g., the Giessen Raw Food Study) that suggest a raw diet leads to substantial weight loss and in many cases “chronic energy deficiency” or even reduced menstruation5. Raw-foodists sometimes report persistent hunger even after eating large quantities of raw food, consistent with the argument that the digestive system is simply less effective at extracting calories from raw food. Similarly, survivors of shipwrecks or other accidents who are forced to subsist on raw food for extended periods of time generally lose substantial amounts of weight and also report constant hunger—again, even after eating large quantities of raw food. While there are societies with diets that purportedly consist largely of raw food (e.g., Inuit diets), Wrangham suggests that this evidence has been overstated: even in these societies, meat was generally boiled where possible (with the exception of blubber, which was sometimes spread over meat like butter), and raw food was primarily relied upon as a snack when on a hunt.
Altogether, I found this assemblage of evidence convincing. There are clear chemical mechanisms by which the application of heat should make both starches and meats easier to chew and digest. This is corroborated by studies on ileostomy patients, and by the experiences of individuals subsisting entirely on raw food. From an epistemological perspective, it’s a compelling triangulation of the claim, combining evidence from biochemistry, medicine, and anthropology.
Claim 2: Cooking frees up energy
This claim is related to the first. Cooked food is easier to digest, which unlocks additional calories and also “frees up” energy we might have used in the digestion process. Wrangham writes (pg. 57):
The mechanisms increasing energy gain in cooked food compared to raw food are reasonably well understood. Most important, cooking gelatinizes starch, denatures protein, and softens everything. As a result of these and other processes, cooking substantially increases the amount of energy we obtain from our food.
One clear consequence of this fact is that eating cooked food is associated with weight gain in both humans and non-human animals. We’ve already seen evidence of this in the comparison between humans eating cooked diets and those eating raw diets, but there’s extensive evidence suggesting that non-human animals also gain weight when given soft, cooked food. This has been demonstrated in rats, snakes, salmon, and even insects.6 These animals are clearly not adapted to cooked food, but they enjoy the caloric benefits nonetheless. In times of caloric scarcity—which probably accounts for most of human history—this is a huge advantage.
Another potential advantage is that cooking frees up calories that can in turn be used to power the brain, which is both extremely energy-hungry and probably a (perhaps the) major factor in the apparent success of the human species. Human intelligence has a number of advantages, including sophisticated planning, creative use of tools, and managing complex social relationships.7
Here, Wrangham builds on a theory proposed in the 1990s by Leslie Aiello and Peter Wheeler called the “expensive-tissue hypothesis”. Human brains are particularly greedy for glucose (pg. 109):
For an inactive person, every fifth meal is eaten solely to power the brain. Literally, our brains use around 20 percent of our basal metabolic rate—our energy budget when we are resting—even though they make up only about 2.5 percent of our body weight.
This proportion of energy expenditure is higher than in other primates (about 13% on average) and other mammals (about 8-10%), raising the question of where all this energy comes from. Aiello and Wheeler rule out the possibility that humans simply use more energy overall—humans and other primates have pretty similar basal metabolic rates—which suggests that there must be some kind of trade-off occurring somewhere. Specifically, between the gut and the brain: just as animals vary in brain size, they also vary in gut size—and crucially, animals with smaller guts tend to have larger brains (pg. 112-113):
Aiello and Wheeler estimated the number of calories a species is able to save by having a small gut, and showed that the number nicely matched the extra cost of the species’ larger brains…Big brains are made possible by a reduction in expensive tissue.
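To put rough numbers on that comparison, here’s a small illustrative sketch in Python. The 20%, 13%, and 8-10% shares are the figures quoted above; the ~1600 kcal/day resting budget is an assumed round number for illustration, not a figure from the book.

```python
# Illustrative comparison of brain energy budgets. The 20%, 13%, and 8-10%
# shares are the figures quoted above; the ~1600 kcal/day resting (basal)
# budget is an assumed round number, not one from the book.

RESTING_BUDGET_KCAL_PER_DAY = 1600

brain_share_of_bmr = {
    "human": 0.20,
    "typical primate": 0.13,
    "typical mammal": 0.09,   # midpoint of the 8-10% range
}

for species, share in brain_share_of_bmr.items():
    kcal = RESTING_BUDGET_KCAL_PER_DAY * share
    print(f"{species}: ~{kcal:.0f} kcal/day spent running the brain")

# human: ~320 kcal/day; typical primate: ~208; typical mammal: ~144
```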
As we’ve already reviewed, the gut plays a really important role in processing and extracting calories from food. Something major must have changed to allow for this reduction in gut size—intuitively, this probably involved a change in diet. But what exactly was that change?
One dominant account—with which followers of the carnivore diet are no doubt familiar—is an increase in meat consumption. Carnivores tend to have smaller guts than herbivores8, and there’s general consensus that hominids started eating more meat around two million years ago, perhaps precipitating the shift from australopithecines to Homo erectus (more on that in the section below). This is the account that Aiello and Wheeler support in their 1995 paper. The authors do suggest that the advent of cooking played a role in human evolution too—just that this occurred considerably later (around 500K years ago), leading to the shift between Homo erectus and Homo heidelbergensis.
Wrangham agrees broadly with the principles of this account, but disagrees with respect to the specific timeline. He argues that hominids started cooking much earlier than 500K years ago, and that cooking thus drove the second of two major transitions in this period—the appearance of Homo erectus (pg. 114):
That phase of our evolution occurred in two steps: first, the appearance of the habilines; and second, the appearance of Homo erectus. Meat eating and cooking account respectively for these two transitions, and therefore for their accompanying increases in brain size.
Wrangham’s primary piece of evidence here is that—as discussed in the section above—eating raw meat is pretty hard and time-consuming with human (or ape) jaws. Chimpanzees do kill other animals and eat them, but they frequently only eat the soft parts of the animal (like the intestines, liver, or brain), sometimes leaving the muscle behind. Thus, some sort of system for processing meat would’ve helped greatly. As Wrangham notes, even chimpanzees make use of rudimentary methods for processing meat (pg. 118-119):
By adding tree leaves to their meat meals, they make chewing easier….The only obvious rule governing their choice is that the leaf must be tough: they take only mature tree leaves, not young tree leaves or the soft leaves of an herb…An informal experiment in which friends and I chewed raw goat meat suggested that the added leaves give traction. When we chewed thigh muscle together with a mature avocado leaf, the bolus of chewed meat was reduced faster than when we chewed with no added leaf.
Habilines likely used more advanced methods, such as smashing bones with stones to extract marrow or even tenderizing the meat with clubs. And of course, once hominids could use fire, they could lay meat over the flame and cook it—which confers all the digestive advantages we’ve already discussed. In Wrangham’s view, the fact that anthropologists continue to see increases in cranial capacity through Homo heidelbergensis likely reflects improvements in both hunting methods and cooking techniques. Wrangham reviews a number of developments here, from “earth ovens” (burying food alongside hot rocks) to heating food in natural containers (like bamboo). He writes (pg. 127):
Although the breakthrough of using fire at all would have been the biggest culinary leap, the subsequent discovery of better ways to prepare the food would have led to continual increases in digestive efficiency, leaving more energy for brain growth.
So, in sum: it’s easier to extract calories from cooked food—as we reviewed in the section above—and these calories help supply energy for important physiological functions. In the case of humans, there’s a good case to be made that cooking enabled a kind of evolutionary trade-off: our guts got smaller and our brains got bigger.
The next question is when and how this all started.
Claim 3: Cooking started ~2M years ago
The claim that cooking food makes it easier to digest seems pretty uncontroversial. In contrast, there’s still considerable debate about when cooking started. As Wrangham notes, some researchers argue that it’s as recent as ~40K years ago, while others maintain that we’ve been cooking for at least ~500K years. This kind of argument will be familiar to readers interested in the evolution of human language—some researchers think it’s quite recent, and others think that even early hominids used something like human language for communication.
With cooking—as for language—the evidence is far from definitive.
One methodological approach here is to look for evidence that humans at a given time period were able to use and control fire. Clearly, the use of fire does not entail cooking, but intuitively it does seem like a necessary prerequisite. As this 1986 article by Johan Goudsblom speculates, early hominids might first have benefitted from the “passive” use of fire, i.e., fires originating from lightning, volcanic eruptions, or other natural causes. These benefits likely came in the form of heat and light, along with the possibility of eating vegetation or meat incidentally “cooked” in the flames. Goudsblom speculates that gradually—over a very long time period—hominids must have transitioned to a more active use of fire: perhaps first by transporting burning matter from one location to another, and then eventually by actually making fire using friction or chemical means.9
Archaeologically, there appears to be good evidence of controlled fire use for at least several hundred thousand years. This includes: heat-cracked river cobblestones in the Dordogne suggesting evidence of boiling water (~40K years old); burnt shells and fish bones near hearths in South Africa (~60K-90K years old); ash layers and carbonized stems and plants in Zambia (~180K years old); hearths with ash deposits in Israel (~250K years old); and burnt seeds such as olive and barley found at Gesher Benot Ya’aqov (~800K years old). As Wrangham notes, the evidence gets increasingly shaky as we go beyond this last point (pg. 87):
Before then we find only provocative hints. Archaeological sites between a million and a million and a half years old include burnt bones (at Swartkrans in South Africa), lumps of clay heated to the high temperatures associated with campfires (Chesowanja, near Lake Baringo in Kenya), heated rocks in a hearthlike pattern (Gadeb in Ethiopia), or colored patches with appropriate plant phytoliths inside (Koobi Fora, Kenya). But the meaning of such evidence as indicating human control of fire is disputed.
In other words: someone who’s already predisposed to the belief that cooking is very old might be inclined to interpret these pieces of evidence as confirmation of that belief—but it’s also entirely possible to “explain it away” as incidental. A major challenge here is that evidence of fire use is just hard to come by, since many signs wouldn’t be archaeologically preserved. Wrangham points out that even modern hunter-gatherers like the Hadza (located in Tanzania) leave little to no trace when they make a fire, which might lead a more epistemologically conservative archaeologist to the erroneous conclusion that they simply don’t use fire.
Some anthropologists might be tempted to throw their hands up at this point. But Wrangham suggests we look to a different source of evidence: namely, changes in the fossil record—specifically, changes in the cranial capacity of early hominids. If we assume—and this is a nontrivial assumption—that some of these changes were driven by adaptation to cooked food, then we can draw inferences about when cooking began. Epistemologically, this kind of second-order inference is on shakier ground than inferences based on evidence of fire use, as we have to assume something (evolutionary adaptation to cooking) for which there is not necessarily strong consensus. Wrangham is careful to point this out, and in my view we might think of it as similar to an axiom in mathematics: if we start with some axiomatic premise, we can ask which conclusions follow. The premise itself also isn’t ungrounded: we’ve already discussed the caloric benefits of cooked food, and Wrangham covers additional evidence suggesting that evolutionary changes are common (sometimes on relatively short timescales) in response to changes in diet.
With that in mind, Wrangham highlights three potential transition points: Homo erectus (~1.8M years ago), Homo heidelbergensis (~800K years ago), and Homo sapiens (~200K years ago). As we’ve already discussed, Wrangham thinks H. sapiens is too recent. For one, there’s evidence of fire use dating back further than 200K years. The onset of H. heidelbergensis is more plausible, as it fits better with the pattern of fire use; it’s also roughly the time point that Aiello and Wheeler suggest as the onset of cooking. But Wrangham argues that the relevant anatomical changes heralding the onset of H. heidelbergensis (a small increase in cranial capacity and a flatter face) don’t fit with an adaptation to cooked food. Instead, Wrangham favors an account in which cooking began with H. erectus, which involved much larger anatomical changes. Some of the most relevant of these include a reduction in tooth size, an increase in body size, a substantial increase in cranial capacity, and a smaller pelvis and less flared rib cage—the last of which could be an indication of a smaller gut. Other changes are also suggestive, such as the disappearance of adaptations for climbing. There’s some evidence that H. erectus slept on the ground, which is dangerous and unusual for primates10; perhaps this was enabled by the use and control of fire.11
Putting it together: cooking depended on fire, and we see evidence of fire use stretching back to at least ~800K years ago, and possibly beyond that; we also see changes in the fossil record of early hominids consistent with the advent of cooking at a few key time points (~200K years ago, ~800K years ago, and ~1.8M years ago). Wrangham’s view is that the evidence is most consistent with cooking being ~1.8M years old.
An alternative account
The evidence for the first claim we’ve discussed seems very strong, and the second claim seems relatively well-supported as well. The third is weakest, as Wrangham himself admits. Thus, it’s worth considering what an alternative account might look like. If cooking didn’t herald the onset of H. erectus, what did?
Probably the most plausible alternative account is the one Aiello and Wheeler suggest: namely, that increased meat consumption provided the kind of caloric and nutritional benefits that led to H. erectus. As we’ve already discussed, however, it’s hard to chew and digest unprocessed raw meat. One possibility is that early hominids began using “cold-processing” techniques like grinding or pounding meat, which provided some of the same benefits that cooking does. This appears to be the argument made in a more recent (2016) Nature paper by Katherine Zink and Daniel Lieberman. The authors test a number of tenderizing techniques known to be available in the Lower Paleolithic (the time period during which H. erectus evolved), and ask about their impact on—for lack of a better word—the “chewability” of meat and roots. They find:
Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.
That is, these techniques made both meat and roots (like tubers) easier to chew, which the authors argue is consistent with an account in which cooking itself may not have originated until considerably later (like 800K years ago).
As I wrote earlier, I’m not an expert on this topic, and I also don’t know what Wrangham would say in response to this argument. But knowing what we know from the first claim, we can take a guess: making food easier to chew is only one benefit of cooking—cooked food also appears to be easier to digest. Cold-processing techniques like slicing and pounding also aid digestion, but the key question is whether they do so to the same degree as cooking. I don’t know how to answer this question, but it does seem like one’s view here will depend on the relative energetic pay-offs of cooking vs. cold-processing methods. It’s also worth noting that the Zink and Lieberman article does not provide evidence against the use of cooking by H. erectus: it merely suggests that other processing methods may have been sufficient to help with chewing. Finally, there are other sources of evidence that Wrangham draws on, and which I haven’t discussed here—like the interplay between cooking and social dynamics—that provide some additional support for Wrangham’s position.
Of course, the question of when exactly cooking began is impossible to answer definitively. But even if you don’t buy Wrangham’s account, it does seem plausible that humans have been heating their food for hundreds of thousands, and perhaps over a million, years.
The view from modernity
For most of human evolutionary history, caloric scarcity was the default state. Thus, any technology that unlocked additional calories—like cooking—was a hugely beneficial cultural adaptation. These days, however, caloric scarcity is much less of a problem in relatively rich, developed countries. In fact, most contemporary debates about food, nutrition, or calories hinge on an implicit (or sometimes explicit) assumption of caloric abundance. It’s cheaper than ever to consume calories, which has likely contributed to the rise in obesity; this, in turn, has motivated endless dieting fads and exercise regimes—along with drugs like Ozempic.
As the journalist Matt Yglesias wrote a couple years ago, any debate about the underlying causes of increases in obesity probably has to grapple with this issue of caloric abundance at some level. It might be important to investigate other theories—like the idea that there’s a contaminant in the water supply—but caloric abundance does seem like a reasonable null hypothesis:
If anything, making the origins out to be some huge puzzle lends itself to the false suggestion that there’s a very simple and straightforward solution. The truth — that we are experiencing some downside to living in a society with a great deal of material abundance — is harder to wrestle with, since people would be pretty unhappy about policy changes that reversed the 100+ year trend toward food becoming tastier, more available, and more convenient.
This also ties into a similar debate about “ultra-processed foods” (UPFs), which are defined (somewhat unhelpfully) as foods characterized by a relatively involved degree of production. There’s still lots of debate about exactly what we ought to include in this category—and what exactly the health risks might be—but prototypical concerns might include things like making food hyperpalatable (e.g., “you can’t put it down”) or the use of food additives. The question of whether UPFs specifically—or simply cheaper calories—are to blame for the rise in obesity is still unclear, and as Yglesias suggested in a more recent post, cheap and abundant calories are probably a good starting assumption:
So, yes, in practice, lots of the excessive snacking that people do consists of “ultra-processed” food. But I think this is a kind of shallow explanation — if nobody could ever eat “ultra-processed” food but everything else was the same, they would overeat something else. The basic facts of human biology, human society, and market incentives would still be there.
I do think the arguments presented in Catching Fire suggest an interesting addendum to this claim. Namely, caloric abundance can come about not only through economic means—we’re richer than we used to be, and food is cheaper—but also through more efficient processing techniques. On the timescale of cultural evolution, we can think of the “application of heat to foodstuff” (cooking) as just one step along the trajectory of increasingly effective methods for caloric refinement. Historically, those methods have included things like pounding, grinding, salting, and—as discussed here—cooking. If the first claim presented in this article is correct, those techniques transform the bioavailability of the calories contained within a piece of food. Contemporary techniques for processing and packaging calories might be even more efficient. Put another way: it’s not just about the raw ingredients—the way they’re processed externally influences how they’re processed in your body.
Footnotes
1. As Wrangham points out in the book, other forms of pre-processing can accomplish similar goals. For example, grinding or pounding meat makes it easier to digest; the same is true for adding lime to raw fish (as in ceviche).
2. The authors of that 2003 article use this evidence to suggest that cooking played an important role in shaping the diets of early hominids, i.e., in making the transition to a more meat-heavy diet:
Accordingly, these calculations imply that for meat to have become an important part of the diet, one of three conclusions is necessary. First, pre-cooking humans might have spent much longer chewing their food than any contemporary populations do. Second, unrecognized differences in mastication efficiency between chimpanzees and pre-cooking humans might have allowed humans to chew meat more efficiently than chimpanzees do. Or third, humans must have had some system for tenderizing meat. The chimpanzee model suggests that the most likely solution is the third. We therefore suggest that an important technique that enabled humans to tenderize meat was cooking.
3. Wrangham notes that this is inconsistent with arguments previously put forth by bodybuilders and raw-foodists, which themselves sounded generally reasonable: namely, that raw eggs should in principle be easier to digest since they’re already in liquid form.
4. Delicacies such as foie gras and Wagyu beef are prized for their tenderness.
5. This last effect is particularly relevant when considering the claim that we are biologically adapted to cooked food.
6. Separately, this is also the motivation for pet food brands such as BARF (biologically appropriate raw foods). The idea is that animals are not adapted for cooked food and that consuming it leads to all sorts of health problems.
7. There’s at least some evidence that species with bigger brains live in larger groups and develop more complex social relationships. Wrangham focuses on this “social brain hypothesis” in particular, emphasizing the importance of flexible social coalitions (pg. 107-108):
Coalitions are difficult to manage because individuals compete for the best allies, and an ally today may be a rival tomorrow. Individuals must constantly reassess one another’s moods and strategies, and alter their own behavior accordingly.
8. Ruminants like cows have famously complex digestive systems.
9. Goudsblom’s article also discusses some of the likely prerequisites for fire use itself, which he argues include: bipedalism (i.e., we needed our hands free to transport fire); some kind of capacity for planning and/or inhibitory control (i.e., gathering wood and feeding a fire does not yield immediate but rather delayed gratification); and a capacity for or experience with social coordination (i.e., to facilitate the kind of planning likely required for building and maintaining fires over long time periods). Goudsblom’s argument is ultimately that humans gained a kind of species monopoly over the control of fire, preventing other species from developing in a similar direction.
10. The exception to this is gorillas, which are—notably—very large.
11. As Wrangham notes, climbing trees may have been more important when subsisting on a raw diet, as many edible fruits required climbing. But a diet consisting of cooked roots and meat reduced the need to climb trees—relaxing a selection pressure for climbing adaptations.