There are multiple lines of evidence that
an animal-based diet
best supports
human brain development
in infants and young children.
In brief / contents
- Human fetuses and infants rely on ketones for brain building.
- Weaning onto meat increases brain growth.
- The brain is an energy-intensive organ that required an animal-based diet to evolve.
- Because humans wean early, and human brain growth is extended past weaning, the post-weaning diet must support fetal-like brain growth.
- A high-fat animal-based diet best supports brain growth.
Human fetuses and infants rely on ketones for brain building.
In a previous post,
we wrote about the known (but little-spoken-of) fact that
human infants are in mild ketosis all the time,
especially when breastfed.
In other words,
ketosis is a natural, healthy state for infants.
Infancy is a critical time for brain growth,
so we expect that ketosis is advantageous for a growing brain.
Otherwise, there would have been a selective advantage to reduced ketosis in infancy.
This species-critical, rapid brain growth continues well past weaning.
For that reason,
we suggest in our article that weaning onto a ketogenic diet would probably be preferable to weaning away from ketosis.
In response to that post,
a reader sent us a paper
called Survival of the fattest: fat babies were the key to evolution of the large human brain. [1]
The authors discuss the apparently unique human trait of having extremely fat babies,
and explain it in terms of the unique need for growth of extremely large brains.
A key point they make
is that a baby’s ample fat provides not merely a large energy supply
(much more than could be stored as glycogen or protein; by their calculations, more than 20 times more),
but that the ketone bodies derived from that fat are themselves important for human brain evolution.
They repeat the usual unwarranted assumption that
adult brains use mainly glucose as fuel by default,
and that ketone bodies are merely an alternative brain fuel.
Nonetheless,
when talking about fetuses, they are willing to say that the use of ketones is not merely an “alternative”:
In human fetuses at mid-gestation, ketones are not just an alternative fuel but appear to be an essential fuel because they supply as much as 30% of the energy requirement of the brain at that age (Adam et al., 1975).
Second, ketones are a key source of carbon for the brain to synthesize the cholesterol and fatty acids that it needs in the membranes of the billions of developing nerve connections.
[…]
Ketones are the preferred carbon source for brain lipid synthesis and they come from fatty acids recently consumed or stored in body fat. This means that, in infants, brain cholesterol and fatty acid synthesis are indirectly tied to mobilization and catabolism of fatty acids stored in body fat.
In other words, the claim is that ketones are the best source of certain brain-building materials, and specifically, that fetuses use them for that purpose.
Moreover, the thesis is that the extra body fat on human babies is there specifically for the purpose of supporting extra brain growth after birth, through continued use of ketones.
Weaning onto meat increases brain growth.
[ Please note that by convention weaning refers to the gradual process of transitioning from exclusive breastfeeding (starting with the first foods introduced, while breastfeeding is still ongoing), to the end of breastfeeding, not just the end itself. ]
We aren’t the only ones who have thought weaning onto meat would be a good idea.
A couple of studies have compared weaning onto meat with weaning onto cereal.
One showed a larger increase in head circumference [2], which is a good index of brain growth in infants [3] and young children [4].
Moreover, higher increases in head circumference in infants are correlated with higher intelligence, independently of head circumference at birth [5].
In other words, the amount of brain growth after birth is a better predictor of intelligence than the amount of brain growth in gestation.
That study also found that the meat-fed infants had better zinc status, and good iron status even without the iron supplementation given in the cereal arm [2].
Zinc and iron are abundant in the brain, and zinc deficiency is implicated in learning disorders and other brain development problems [6].
Iron deficiency is a common risk in infants in our culture, because of our dietary practices, which is why infant cereal is fortified with it [7].
Another study showed better growth in general in babies weaned onto primarily meat [8].
Weaning onto meat is easy.
Here’s how I did it.
It is considered likely that early humans fed their babies pre-chewed meat [9].
I did that, too, although that wasn’t my first weaning step.
Influenced by baby-led weaning, I waited until he was expressing clear interest in my food, and then simply shared it with him.
At the time this meant:
- Broth on a spoon, increasingly with small fragments of meat in it.
- Bones from steaks and chops, increasingly with meat and fat left on them.
- Homemade plain, unseasoned jerky, which he teethed on, or sucked until it disintegrated.
- Beef and chicken liver, which have a soft, silky texture, and are extremely nutrient-dense.
–Amber
The brain is an energy-intensive organ that required an animal-based diet to evolve.
In 1995, anthropologists Leslie C. Aiello and Peter Wheeler posed the following problem [10]:
- Brains require an enormous amount of energy.
- Humans have much larger brains than other primates.
- However, human basal metabolic rates are no higher than would be predicted from body mass.
Where do we get the extra energy required to fuel our brains, and how could this have evolved?
Aiello and Wheeler explain this by noting that
at the same time as our brains were expanding, our intestines (estimated as comparably energy-intensive) were shrinking,
by almost exactly the same amount,
thereby freeing up the extra metabolic energy needed for the brain.
Both adaptations, a large brain and a small gut, independently required our ancestors to adopt a “high-quality” diet, for different reasons.
Let’s mince no words; “high-quality” means meat [11].
Meat is more nutrient dense than plants, both in terms of protein and vitamins.
Plants are simply too fibrous, too low in protein and calories, and too seasonal to have been relied on for such an evolutionary change [11], [12].
It is widely accepted that meat became an important part of our diets during this change; this is the mainstream view in anthropology [13].
Although the need for protein and brain-building nutrients is often cited as a reason for needing meat in the evolutionary diet,
energy requirements are also important to consider.
It would have been difficult to meet caloric needs from plants, especially before cooking, because they were so fibrous [13].
Herbivores with special guts (such as ruminants like cows with their “four stomachs”) and primates with much larger intestines than we have, actually use bacteria in their guts to turn significant amounts of fiber into fat, see, e.g., [14].
This strategy is not available to such a small gut as ours [11], [15], which is why we had to find food that was energy-dense as is.
Fortunately, insofar as we were already using animal sources to get protein and nutrients, we also had access to an abundance of fat.
The animals we hunted were unlikely to have been as lean as modern game.
Evidence supports the hypothesis that human hunting was the most likely cause of the extinction of many megafauna (large animals that were much fatter than the leaner game we have left today) [16].
Humans, like carnivores, prefer to hunt larger animals whenever they are available [17].
It has been proposed that the disappearance of the fatter megafauna exerted a strong evolutionary pressure on humans, who were already fat-dependent, to become more skilled hunters of the small game we have today, to rely more on the fat from eating brains and marrow, and to learn to find the fattest animals among the herds [18].
Animal fat and animal protein provided the energy, protein, and nutrients necessary for large brains, especially given the constraint of small guts.
Because humans wean early, and human brain growth is extended past weaning, the post-weaning diet must support fetal-like brain growth.
Humans wean much earlier than other primates, and yet their brains require prolonged growth.
Our intelligence has been our primary selective advantage.
Therefore it is critical from an evolutionary standpoint that the diet infants were weaned onto was supportive of this brain growth.
In a (fascinating and well-written) paper on weaning and evolution, Kennedy puts it this way:
“[A]lthough this prolonged period of development i.e., ‘‘childhood’’ renders the child vulnerable to a variety of risks, it is vital to the optimization of human intelligence; by improving the child’s nutritional status (and, obviously, its survival), the capability of the adult brain is equally improved. Therefore, a child’s ability to optimize its intellectual potential would be enhanced by the consumption of foods with a higher protein and calorie content than its mother’s milk; what better foods to nourish that weanling child than meat, organ tissues (particularly brain and liver), and bone marrow, an explanation first proposed by Bogin (1997).”
…
“Increase in the size of the human brain is based on the retention of fetal rates of brain growth (Martin, 1983), a unique and energetically expensive pattern of growth characteristic of altricial [ born under-developed ] mammals (Portmann, 1941; Martin, 1984). This research now adds a second altricial trait—early weaning—to human development. The metabolically expensive brain produced by such growth rates cannot be sustained long on maternal lactation alone, necessitating an early shift to adult foods that are higher in protein and calories than human milk.”
The only food higher in protein and calories than breast milk is meat.
A high-fat animal-based diet best supports brain growth.
Taking these facts together:
- Even modern fetuses and breastfed infants are in ketosis, which uniquely supports brain growth.
- Infants who are weaned onto meat get essential nutrients for growing brains: nutrients that are deficient in today’s plant-centric diets. Moreover, experiments have found that their brains actually grow more than those of babies fed cereal.
- Human brains continue to grow at a fast rate even past weaning.
- It is likely that in order to evolve such large, capable brains, human babies were weaned onto primarily meat.
A meat-based, inherently ketogenic diet is not only likely to be our evolutionary heritage,
it is probably the best way to support the critical brain growth of the human child.
Acknowledgements
We would like to thank Matthew Dalby, a researcher at the University of Aberdeen, for helpful discussions about short-chain fatty acid production in the large intestines.
References
[1] Hypothesis paper
Cunnane SC, Crawford MA.
Comp Biochem Physiol A Mol Integr Physiol. 2003 Sep;136(1):17-26.
[2] Evidence type: experiment
Krebs NF, Westcott JE, Butler N, Robinson C, Bell M, Hambidge KM.
J Pediatr Gastroenterol Nutr. 2006 Feb;42(2):207-14.
(Emphasis ours)
[3] Evidence type: authority
Brandt I.
Klin Wochenschr. 1981 Sep 1;59(17):995-1007.
(Emphasis ours)
[4] Evidence type: experiment
Bartholomeusz HH, Courchesne E, Karns CM.
Neuropediatrics. 2002 Oct;33(5):239-41.
(Emphasis ours)
[5] Evidence type: experiment
Gale CR, O’Callaghan FJ, Godfrey KM, Law CM, Martyn CN.
Brain. 2004 Feb;127(Pt 2):321-9. Epub 2003 Nov 25.
“Head circumference is known to correlate closely with brain volume (Cooke et al., 1977; Wickett et al., 2000) and can therefore be used to measure brain growth, but a single measurement cannot provide a complete insight into neurological development. Different patterns of early brain growth may result in a similar head size. A child whose brain growth both pre‐ and postnatally followed the 50th centile might attain the same head size as a child whose brain growth was retarded in gestation but who later experienced a period of rapid growth. Different growth trajectories may reflect different experiences during sensitive periods of brain development and have different implications for later cognitive function.”
[6] Evidence type: review
“The total content of zinc in the adult human body averages almost 2 g. This is approximately half the total iron content and 10 to 15 times the total body copper. In the brain, zinc is with iron, the most concentrated metal. The highest levels of zinc are found in the hippocampus in synaptic vesicles, boutons, and mossy fibers. Zinc is also found in large concentrations in the choroid layer of the retina which is an extension of the brain. Zinc plays an important role in axonal and synaptic transmission and is necessary for nucleic acid metabolism and brain tubulin growth and phosphorylation. Lack of zinc has been implicated in impaired DNA, RNA, and protein synthesis during brain development. For these reasons, deficiency of zinc during pregnancy and lactation has been shown to be related to many congenital abnormalities of the nervous system in offspring. Furthermore, in children insufficient levels of zinc have been associated with lowered learning ability, apathy, lethargy, and mental retardation. Hyperactive children may be deficient in zinc and vitamin B-6 and have an excess of lead and copper. Alcoholism, schizophrenia, Wilson’s disease, and Pick’s disease are brain disorders dynamically related to zinc levels. Zinc has been employed with success to treat Wilson’s disease, achrodermatitis enteropathica, and specific types of schizophrenia.”
[7] Evidence type: authority
From the CDC: “Who is most at risk? Young children and pregnant women are at higher risk of iron deficiency because of rapid growth and higher iron needs. Adolescent girls and women of childbearing age are at risk due to menstruation. Among children, iron deficiency is seen most often between six months and three years of age due to rapid growth and inadequate intake of dietary iron. Infants and children at highest risk are the following groups: […]”
[8] Evidence type: experiment
Minghua Tang and Nancy F Krebs
Am J Clin Nutr October 2014 ajcn.088807
(Emphasis ours)
[9] From Wikipedia:
“Premastication is complementary to breastfeeding in the health practices of infants and young children, providing large amounts of carbohydrate and protein nutrients not always available through breast milk,[3] and micronutrients such as iron, zinc, and vitamin B12 which are essential nutrients present mainly in meat.[25] Compounds in the saliva, such as haptocorrin also helps increase B12 availability by protecting the vitamin against stomach acid.
“Meats such as beef were likely premasticated during human evolution as hunter-gatherers. This animal-derived bioinorganic iron source is shown to confer benefits to young children (two years onwards) by improving growth, motor, and cognitive functions.[26] In earlier times, premastication was an important practice that prevented infant iron deficiency.[27]
“Meats provide Heme iron that are more easily absorbed by human physiology and higher in bioavailability than non-heme irons sources,[28][29] and is a recommended source of iron for infants.[30]”
[10] Hypothesis paper
Leslie C. Aiello and Peter Wheeler
Current Anthropology, Vol. 36, No. 2 (Apr., 1995), pp. 199-221
[11] Evidence type: review
Milton K.
J Nutr. 2003 Nov;133(11 Suppl 2):3886S-3892S.
(The whole paper is worth reading, but these highlights serve our point.)
[12] Evidence type: review
Kennedy GE.
J Hum Evol. 2005 Feb;48(2):123-45. Epub 2005 Jan 18.
“Although some researchers have claimed that plant foods (e.g., roots and tubers) may have played an important role in human evolution (e.g., O’Connell et al., 1999; Wrangham et al., 1999; Conklin-Brittain et al., 2002), the low protein content of ‘‘starchy’’ plants, generally calculated as 2% of dry weight (see Kaplan et al., 2000: table 2), low calorie and fat content, yet high content of (largely) indigestible fiber (Schoeninger et al., 2001: 182) would render them far less than ideal weaning foods. Some plant species, moreover, would require cooking to improve their digestibility and, despite claims to the contrary (Wrangham et al., 1999), evidence of controlled fire has not yet been found at Plio-Pleistocene sites. Other plant foods, such as the nut of the baobab (Adansonia digitata), are high in protein, calories, and lipids and may have been exploited by hominoids in more open habitats (Schoeninger et al., 2001). However, such foods would be too seasonal or too rare on any particular landscape to have contributed significantly and consistently to the diet of early hominins. Moreover, while young baobab seeds are relatively soft and may be chewed, the hard, mature seeds require more processing. The Hadza pound these into flour (Schoeninger et al., 2001), which requires the use of both grinding stones and receptacles, equipment that may not have been known to early hominins. Meat, on the other hand, is relatively abundant and requires processing that was demonstrably within the technological capabilities of Plio-Pleistocene hominins. Meat, particularly organ tissues, as Bogin (1988, 1997) pointed out, would provide the ideal weaning food.”
[13] Plants can become more nutrient dense through cooking. That is the basis of Wrangham’s hypothesis: (From Wikipedia) “Wrangham’s latest work focuses on the role cooking has played in human evolution. He has argued that cooking food is obligatory for humans as a result of biological adaptations[9][10] and that cooking, in particular the consumption of cooked tubers, might explain the increase in hominid brain sizes, smaller teeth and jaws, and decrease in sexual dimorphism that occurred roughly 1.8 million years ago.[11] Most anthropologists disagree with Wrangham’s ideas, pointing out that there is no solid evidence to support Wrangham’s claims.[11][12] The mainstream explanation is that human ancestors, prior to the advent of cooking, turned to eating meats, which then caused the evolutionary shift to smaller guts and larger brains.[13]”
[14] Evidence type: review
Popovich DG1, Jenkins DJ, Kendall CW, Dierenfeld ES, Carroll RW, Tariq N, Vidgen E.
J Nutr. 1997 Oct;127(10):2000-5.
(Emphasis ours)
[15] The maximum amount of fat humans could get from fermenting fibre in the gut is unknown. The widely cited value of 10% of calories comes from:
E. N. Bergman
Physiological Reviews. Published 1 April 1990. Vol. 70, no. 2, 567-590.
“The value of 6-10% for humans (Table 3) was calculated on the basis of a typical British diet where 50-60 g of carbohydrate (15 g fiber and 35-50 g sugar and starch) are fermented per day (209). It is pointed out, however, that dietary fiber intakes in Africa or the Third World are up to seven times higher than in the United Kingdom (55). It is likely, therefore, that much of this increased fiber intake is fermented to VFA and even greater amounts of energy are made available by large intestinal fermentation.”
[16] Evidence type: review
“As the mathematical models now seem quite plausible and the patterns of survivors versus extinct species seem inexplicable by climate change and easily explicable by hunting (7,11), it is worth considering comparisons to other systems. Barnosky et al. note that on islands, humans cause extinctions through multiple synergistic effects, including predation and sitzkrieg, and “only rarely have island megafauna been demonstrated to go extinct because of environmental change without human involvement,” while acknowledging that the extrapolation from islands to continents is often disputed (7). The case for human contribution to extinction is now much better supported by chronology (both radiometric and based on trace fossils like fungal spores), mathematical simulations, paleoclimatology, paleontology, archaeology, and the traits of extinct species when compared with survivors than when Meltzer and Beck rejected it in the 1990s, although the blitzkrieg model which assumes Clovis-first can be thoroughly rejected by confirmation of pre-Clovis sites. Grayson and Meltzer (12) argue that the overkill hypothesis has become irrefutable, but the patterns by which organisms went extinct (7,11), the timing of megafauna population reductions and human arrival when compared with climate change (5), and the assumptions necessary to make paleoecologically informed mathematical models for the extinctions to make accurate predictions all provide opportunities to refute the overkill hypothesis, or at least make it appear unlikely. However, all of these indicate human involvement in megafauna extinctions as not only plausible, but likely.”
[17] Evidence type: review
William J. Ripple and Blaire Van Valkenburgh
BioScience (July/August 2010) 60 (7): 516-526.
“Humans are well-documented optimal foragers, and in general, large prey (ungulates) are highly ranked because of the greater return for a given foraging effort. A survey of the association between mammal body size and the current threat of human hunting showed that large-bodied mammals are hunted significantly more than small-bodied species (Lyons et al. 2004). Studies of Amazonian Indians (Alvard 1993) and Holocene Native American populations in California (Broughton 2002, Grayson 2001) show a clear preference for large prey that is not mitigated by declines in their abundance. After studying California archaeological sites spanning the last 3.5 thousand years, Grayson (2001) reported a change in relative abundance of large mammals consistent with optimal foraging theory: The human hunters switched from large mammal prey (highly ranked prey) to small mammal prey (lower-ranked prey) over this time period (figure 7). Grayson (2001) stated that there were no changes in climate that correlate with the nearly unilinear decline in the abundance of large mammals. Looking further back in time, Stiner and colleagues (1999) described a shift from slow-moving, easily caught prey (e.g., tortoises) to more agile, difficult-to-catch prey (e.g., birds) in Mediterranean Pleistocene archaeological sites, presumably as a result of declines in the availability of preferred prey.”
[18] Evidence type: review
Ben-Dor M1, Gopher A, Hershkovitz I, Barkai R.
PLoS One. 2011;6(12):e28689. doi: 10.1371/journal.pone.0028689. Epub 2011 Dec 9.
“The disappearance of elephants from the diet of H. erectus in the Levant by the end of the Acheulian had two effects that interacted with each other, further aggravating the potential of H. erectus to contend with the new dietary requirements: […]”
This book was written a few years before the 2003 study you cite regarding the survival of the fattest, and – if my memory serves me correctly – the author mentions this as well. Not sure which study she was referring to and my copy is in storage, so I cannot look it up. http://www.amazon.com/Our-Babies-Ourselves-Biology-Culture/dp/0385483627
Not sure if that URL link works from here, so I am posting the title and author in case others are interested in reading it.
"Our Babies, Ourselves" by Meredith Small
In "Nisa: The Life and Words of a !Kung Woman," Marjorie Shostak writes about the child-rearing practices of the Kalahari hunter-gatherers. The women breastfed their children for an average of 4 years, with solid foods first being introduced around 1 year of age. They lived in an extremely harsh and marginal environment, so it is possible that longer breastfeeding times would have been nutritionally advantageous to their babies. If humans had access to higher-fat and more abundant game during earlier periods of our evolution, it may not have been necessary to continue breastfeeding for that long. However, 2-4 years of breastfeeding is a surprisingly common practice in many non-Western cultures today, as discussed in "Breast-Feeding: Bio-Cultural Perspectives" by Katherine Dettwyler.
…which may – again – be a nutritional advantage for babies of cultures eating a starch-based diet.
37 years after weaning I'm still fascinated by meat n boobies.
LOL…You really are a crack-up Ash. 😜
"Beef and chicken liver, which has a soft, silky texture, and is extremely nutrient-dense."
Uh oh, that will rile the paleo crowd who have declared liver toxic for babies, as one hapless Australian chef recently discovered. I can't comment on the theoretical veracity of that claim as I haven't looked at it, but do have an empirical observation. One of our friends, a wapf adept, has used liver as a weaning food, and I have seen no adverse consequences at all. On the contrary.
What is sold as baby food in jars in our supermarket is beyond a joke. Some of it proudly presented as vegetarian or 99% fat free. I recently came across a case where a dog with suspected colitis was prescribed a canned, meat-free, 2% fat dog food by the vet and apparently sold in the veterinary clinic at over $5 a can (main ingredient: rice). Needless to say, it didn't work. It would be funny if it weren't so sad.
Paleos have declared liver toxic for babies? Where have they said this? From what I've read, it's the Australian government, publishers, and ignorant citizens that have a problem with it. I'd like to know if they've banned any of Sally Fallon's books that have similar recipes in them.
Michael Frederik,
You might enjoy this. Nora Gedgaudas has since written an excellent response to the fears of liver causing infant deaths. It's a long but good read. http://www.primalbody-primalmind.com/vitamin-a-under-attack-down-under/
I know a couple of kids who were raised on a raw vegan diet and they seem perfectly fine. Humans are really adaptable.
Jimmy,
Shazzie is a well-known raw vegan author who has met several other raw vegan families. She herself has noticed the ill-health, developmental delays, and failure to thrive of babies and children born and raised on raw vegan diets.
She chooses to be raw vegan for herself but she also chose a different diet for her own child based on her observations. She's quite rare and brave for having acknowledged and exposed the lies and cover-ups. I encourage you to read what she's written about it here: http://www.shazzie.com/life/articles/raw_vegan_children.shtml