Ketosis Without Starvation: the human advantage

I recently had the honour to speak at Low Carb Breckenridge 2018.
The video will be released publicly in the coming weeks, and when it does I will link to it here.
In the meantime, I’m posting my slides, notes and references.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-1.png
On a high carb diet, you might need to fast to attain an enlightened brain state.
On a ketogenic diet, as a human, that doesn’t appear to be necessary.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-2.png
The only disclosure I have is that I have some generous supporters on Patreon for my writing.
Thank you!
The supported content is free, so these are donations.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-3-1.png
The foundation of our biochemical understanding of ketosis came from experiments in fasted humans and other animals.
For example, the groundbreaking work of George Cahill.
I recommend his publication Fuel Metabolism in Starvation ([Cah2006]) which reviews many of his findings.
We continue to learn about mechanisms for how ketosis may increase health in a variety of ways.
However, these origins carry with them an implicit cautionary note, since starvation is generally not recommended, for obvious reasons.
It’s not sustainable indefinitely. It’s stressful to the body. And it can do real harm, sometimes with lasting detrimental consequences.
Even fasting for short periods is surrounded by controversy among experts at this very conference,
because of its potential to do damage to lean mass, and all the potential problems of protein and calorie malnutrition.
If ketosis is like fasting, we had better use it carefully, judiciously, and sparingly.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-4-1.png
Many researchers conceptualise metabolism as operating in two complementary phases.
The act of eating or not eating sets off a cascade of hormonal and molecular signals that result in one phase or the other,
sometimes called the fed and fasting states.
In this paper [Mat2018] they are called the glucose and ketone phases.
Important things happen in both phases.
The fed state is credited with generating and synthesising things like tissue, mitochondria, and neurons,
while the fasted state is credited with
clearing broken structures for renewal and repair, and
providing the stimuli to direct the synthesis phase.
Ketosis is normally indicative of the fasting state.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-5-1.png
Many believe that staying in either phase for prolonged periods leads to disease.
And so you will hear people talk about metabolic switching, metabolic flexibility, insulin pulsatility, and so on.
Ketosis is normally an indication and a signal of the fasting state, so the reasoning goes that chronic, long-term ketosis must be unhealthy.
[Graphic from a blog post on the role of bitter in enhancing the starvation signal. https://rosemarycottageclinic.wordpress.com/2017/07/28/how-can-bitter-foods-be-good-for-us-when-they-taste-so-bad-tackling-the-paradox/]

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-6-1.png
Further, it’s been shown in longevity research that many animals use signals of fed and fasting state to
determine whether to reproduce, because it’s a time of plenty,
or to slow aging and shut down reproductive ability until more favorable conditions arise.
So again, the comparison leads us to fear that ketosis may have benefits, but that it comes with a severe cost.
[Graphic from: Insulin Signaling in the Central Nervous System. Daniel Porte, Denis G. Baskin, Michael W. Schwartz. Diabetes May 2005, 54 (5)]

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-7-1.png
But hold on.
It turns out that starvation is not the only condition where ketosis naturally arises.
Fetuses use ketones in the womb [Sha1985], [Ada1975], [Cun2016].
The placenta is full of BOHB [Mun2016].
Some mammals, humans included, have “ketosis of suckling”. Breastfed infants are in mild ketosis [Per1966], [Kra1974], [Bou1986].
In fact humans of all ages easily attain ketosis without protein or calorie deprivation, so long as they aren’t eating carbohydrates.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-8-1.png
This graph shows how quickly the concentration of BOHB goes up in humans when they stop eating.
It’s inversely related to age.
One of the stunning things about it is the orders of magnitude involved.
Look, for example, at the 6-8 year old children.
If the 4-hour mark is about 0.1 – 0.2 mmol/L,
then in a day, it has increased by a factor of 20 or 40.
Newborns, who typically aren’t yet eating cereal, of course, don’t start that low.
Also notice that children don’t even need to miss a day of food to get above the 0.5 mmol/L level of ketosis,
which has been considered the threshold of nutritional ketosis by Phinney and others.
In his presentation here, he said that benefits likely begin even below that level.
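To make the arithmetic explicit, here is a back-of-envelope sketch. The 4-hour values are read off the graph as quoted above; the 24-hour value is my assumption, back-solved from the stated factors rather than read from the data.

```python
# Fold change in blood BOHB for the 6-8 year olds; illustrative numbers.

bohb_4h = (0.1, 0.2)   # mmol/L at the 4-hour mark (from the graph)
bohb_24h = 4.0         # mmol/L after about a day of fasting (assumed)

for start in bohb_4h:
    print(f"{start} -> {bohb_24h} mmol/L: {bohb_24h / start:.0f}x increase")
# 0.1 -> 40x and 0.2 -> 20x: the "factor of 20 or 40" mentioned above
```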

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-9-1.png
But they don’t have to abstain from eating for ketosis to happen.
For example, we have results in epileptic children.
The previous standard had been a tightly protein restricted ketogenic diet.
We now know that most children don’t need that for seizure control.
Eating a modified Atkins diet,
which mostly just means they stay in the induction phase instead of adding back carbs,
they are typically in ketosis,
even though they eat ad libitum [Kos2013].
And these are growing children and adolescents.
Unlike with the protein restricted versions of ketogenic diets for epilepsy,
which in some cases have impacted growth,
when protein isn’t restricted, neither is growth [Nat2014].

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-10-1.png
Even adults have this ability.
To know whether adults are able to stay in ketosis when protein needs are exceeded,
we have to know what our protein needs are.
It depends who you ask.
[Please see also How much protein is enough?]

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-11-1.png
I don’t know of a study with the express purpose to find the upper bound of protein for ketosis,
but we can look at studies that recorded it.
Notice that
the figures in this chart use current weight, not ideal weight,
and many of them are studies in overweight people,
so the g/kg estimates look lower than they would if calculated against ideal weight.
I’d love to see this question approached systematically, but the survey does at least suggest that
protein intakes above the minimum needed for positive nitrogen balance
still support ketosis.
[Graphic from: http://sci-fit.net/2017/carbs-protein-ketosis-research/ ]
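To see why the weight base matters, here is a small worked example; the subject and the numbers are invented for illustration, not taken from the chart.

```python
# The same absolute protein intake expressed per current vs. ideal weight.

protein_g = 120.0    # hypothetical daily protein intake from a study arm
current_kg = 100.0   # hypothetical overweight subject's current weight
ideal_kg = 70.0      # the same subject's ideal weight

print(f"per current weight: {protein_g / current_kg:.2f} g/kg")  # 1.20 g/kg
print(f"per ideal weight:   {protein_g / ideal_kg:.2f} g/kg")    # 1.71 g/kg
# Dividing by the larger current weight makes the identical intake look
# about 30% lower, which is the caveat about this chart.
```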

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-12-1.png
When you compare adult humans with other species instead of with children,
it’s even more impressive.
Dogs are in many ways similar to humans;
our digestive anatomy and physiology are very similar.
Dogs can reach ketosis from fasting, but it takes longer, and never attains the same level [Cra1941].
With adequate protein in the diet, it doesn’t happen to any significant degree at all [Rom1981], [Kro1973], [San2015].
I have spoken with staff at KetoPet Sanctuary, who treat dogs with cancer using ketosis.
They tell me that it is challenging to keep dogs in ketosis.
They have to use a combination of protein restriction, calorie restriction, and MCT oils.
It takes constant monitoring and adjustment.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-13-1.png
Rodents are often used in experimental conditions, and I do think they are very useful models,
but it takes more protein or calorie restriction to achieve an appropriate degree of ketosis than it would with humans.
The margin between adequate protein and too much protein for ketosis is vanishingly small [Stephen Phinney Q&A Low Carb Cruise 2017],
and the levels they achieve are again much less spectacular [Benjamin Bikman, personal communication].
Similarly, almost any level of dietary carbohydrates is enough to shut down ketosis [Richard David Feinman, personal communication].
Some researchers believe this has to do with their relatively small brains,
since ketosis has been thought of as a way to spare glucose for the brain.
But ketosis isn’t the only solution for that.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-14-1.png
Obligate carnivores are always on very low carb diets,
so you might think they are always in ketosis,
but that’s not at all the case.
In fact they are specialised at gluconeogenesis,
that is, getting all their energy needs met by converting protein into glucose.
Protein needs tend to be high.
Cats have much higher protein needs than omnivores
and surprisingly, they don’t adapt well to reduced protein or fasting [Cen2002].
They don’t seem to have good mechanisms to compensate for the various amino acid and vitamin deficiencies that develop,
so they suffer from ammonia toxicity, methylation problems, and oxidative stress.
They do produce ketones when fasted, but they don’t seem to use them in a productive way,
and they actually accumulate fatty acids in the liver when fasted,
the opposite of what humans do.
Because they are still producing glucose,
they become like human type two diabetics.
Dolphins are particularly interesting because they have really large brains,
and they eat a diet that would be expected to be ketogenic if fed to humans.
However, they don’t seem to generate ketones at all, not even when fasting.
Instead, they ramp up gluconeogenesis [Rid2013].
They keep their bodies and their brains going by increased glucose.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-15-1.png
When faced with this observation that humans use ketosis even when they don’t have to for glucose production,
one obviously wonders how this happens from a mechanistic standpoint.
I have never seen the question raised in the literature, let alone answered.
If I were to take a guess, I’d say it probably happens somewhere in this process.
CPT1A is a kind of gatekeeper, transporting fatty acids into the mitochondria for oxidation.
This is normally a necessary step in the creation of ketone bodies.
The coenzyme malonyl-CoA inhibits CPT1A [Fos2004].
The functional reason it does that is because malonyl-CoA is a direct result of glucose oxidation
and is on the path to de novo lipogenesis.
It would be inefficient to be both generating fat and oxidizing it.
So this is a convenient signal to slow entry of fat into the mitochondria.
However, its action is not strictly linear.
It uses hysteresis.
Hysteresis is a way of preventing thrashing back and forth between two states at the threshold of their switch.
For example, if you set your thermostat to 20°C,
you would not want the heater to be turned on when the temperature drops to 19.999
and turned off again at 20.
This would result in constant switching.
Instead, a thermostat waits until the temperature drops a little lower
before activating the heater, and heats it a little more than required before deactivating it.
Hysteresis is implemented in CPT1A by its becoming insensitive to malonyl-CoA when malonyl-CoA levels are low [Ont1980], [Bre1981], [Gra1988], [Gre2009], [Akk2009].
That means that once CPT1A becomes very active in transporting fatty acids,
it takes time before the presence of malonyl-CoA will inhibit CPT1A at full strength again.
That means that fluctuations in glucose oxidation,
or small, transient increases in glucose oxidation
don’t disturb the burning of fatty acids or the production of ketones.
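The thermostat analogy is easy to make concrete. Here is a minimal sketch of threshold hysteresis; the setpoint, band, and temperature series are all illustrative, and the mapping to CPT1A (temperature for malonyl-CoA, heater for fatty acid transport) is only an analogy, not a model of the enzyme kinetics.

```python
# Threshold hysteresis, thermostat style. By analogy: temperature ~
# malonyl-CoA level, heater ON ~ CPT1A actively transporting fatty acids
# (both switch on when the signal is low).

def thermostat(temp, heater_on, setpoint=20.0, band=0.5):
    """Return the new heater state for the current temperature.

    The heater turns ON only below (setpoint - band) and OFF only above
    (setpoint + band); inside the band it keeps its previous state, so
    small fluctuations around the setpoint cause no switching.
    """
    if temp < setpoint - band:
        return True
    if temp > setpoint + band:
        return False
    return heater_on  # inside the dead band: no change

state = False
for temp in [21.0, 20.2, 19.8, 19.3, 19.9, 20.3, 20.8]:
    state = thermostat(temp, state)
    print(f"{temp:4.1f} C -> heater {'ON' if state else 'OFF'}")
# Note that 19.9 and 20.3 leave the heater ON once it has switched on,
# just as a small, transient rise in malonyl-CoA from a burst of glucose
# oxidation need not shut down fatty acid transport once it is active.
```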
It could be the case that humans develop more insensitivity to malonyl-CoA under ketosis than other species do,
allowing them to metabolise more protein without disturbing ketosis.
Among humans, this is the case in populations such as some Inuit with the Arctic variant of CPT1A.
That mutation slows down CPT1A activity immensely.
This was permitted by their diet which was very high in
polyunsaturated fats from sea mammals.
Polyunsaturated fats upregulate fatty acid oxidation by a large proportion compared to saturated fats [Cun2002], [Fra2003], [Fue2004],
so this mutation would not necessarily have been disruptive of ketosis in that population when eating their natural diet [Lem2012].
But a second effect of the same gene further decreases the sensitivity of CPT1A to inhibition by malonyl-CoA.
That means they are less likely to be knocked out of ketosis by high protein intake.
I will go into this in much greater detail in my upcoming talk at AHS18.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-16-1.png
The second question that comes to mind is what does this difference imply about our evolutionary environment?
I would suggest that for humans to have developed the ability to stay in ketosis even with more than sufficient protein intake,
we must at least have spent frequent long periods in a condition of very low carbohydrate and high fat access, either exogenous or endogenous,
and more than adequate protein as a dietary norm.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-17-1.png
Finally, why?
Why do we stay in ketosis even when we have enough protein to
feed the brain glucose without compromising lean mass?
Or to put it another way:
Other animals continue to burn through lean mass with or without ketosis
until they have enough protein to fuel everything with glucose.
I suspect it has something to do with our brains.
I’ll suggest a few hypotheses along these lines.

The next few slides summarise topics I’ve spoken and written about before. Please see
Optimal Weaning from an Evolutionary Perspective
for more details and links about brain growth and our acquired reliance on meat during evolution.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-18-1.png
Our brains are big.
Primates are already big brained for mammals,
and from that starting point our brains tripled in size over the course of a couple million years.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-19-1.png
Brains take a lot of energy to run.
To accommodate that, we made a trade.
Herbivores get most of their energy from fibre by fermenting it in the gut.
But this isn’t very efficient, because intestines also take a lot of energy.
So we switched to a strategy of eating fat directly,
giving up colon size for brain size.
To get enough fat directly, we had to eat meat.
[Graphic from: Milton, Katharine. “Nutritional Characteristics of Wild Primate Foods: Do the Diets of Our Closest Living Relatives Have Lessons for Us?” Nutrition 15, no. 6 (June 1999): 488–98. https://doi.org/10.1016/S0899-9007(99)00078-7.
version enhanced with colour by http://roarofwolverine.com/archives/219 ]

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-20-1.png
Energy is one reason we might want to stay in ketosis.
Human brains use an extraordinary amount of energy: at least 20% of the body’s total in adults.
Some 40 g/day of that has to come from glucose,
because the brain houses some of the few types of cells that are glucose-bound.
But the rest can be met by ketones.
Our brains use ketones preferentially when they are available.
Though in the modern context, that’s not very often.
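As a rough budget, with an assumed round intake of 2000 kcal/day (my assumption for illustration; the 20% share and the 40 g/day obligate glucose are the figures quoted above):

```python
# Rough adult brain fuel budget under ketosis.

daily_kcal = 2000.0                    # assumed total intake
brain_kcal = daily_kcal * 0.20         # "at least 20%" -> ~400 kcal/day

obligate_glucose_g = 40.0              # must come from glucose
glucose_kcal = obligate_glucose_g * 4  # ~160 kcal/day at 4 kcal/g

ketone_kcal = brain_kcal - glucose_kcal
print(f"brain: {brain_kcal:.0f} kcal/day; obligate glucose: "
      f"{glucose_kcal:.0f} kcal/day; coverable by ketones: "
      f"{ketone_kcal:.0f} kcal/day ({ketone_kcal / brain_kcal:.0%})")
# -> roughly 60% of the adult brain's budget can be met by ketones
```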

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-21-1.png
As if adult brains weren’t large and expensive enough to run,
consider how much bigger the brain of a child is relative to the body.
[Graphic from: http://vertpaleo.org/Society-News/Blog/Old-Bones-SVP-s-Blog/December-2013/Growing-up-(and-out,-and-sidways,-and-around).aspx ]

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-22-1.png
This may explain why human babies are so fat.
These graphs are from a paper exploring different hypotheses about baby fat [Kuz1998],
one of them being to supply the brain energy in the form of ketones.
The one on the left shows % body fat at birth in different species.
Newborn humans come in at 15% fat.
That actually gets higher in the first several months of life,
peaking at about 25%.
The only other primate in that graph is the baboon infant at 4%.
The one on the right is what percent of oxygen metabolised by the whole body is going to the brain:
Humans at birth 60%, human adults 20%,… the adult chimpanzee comes in at about 9%.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-23-1.png
Another consideration is building materials,
since our brains are made mostly of fat and cholesterol and we know that ketones are used to synthesize those in situ.
The diagram here [Cot2013] shows pathways of how ketones can be generated, oxidized, or used to make fat and cholesterol.
Fetuses and newborns use ketone bodies extensively,
as I mentioned previously.
But the point here is that it’s not just because they’re using them for fuel.
It’s also a source of structural components.
In light of that,
it seems like a reasonable hypothesis that ketogenic capacity in humans is so pronounced in childhood because the brain is developing,
and ketones are for some reason the preferred material.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-25-1.png
Other species tend to wean at the time when brain growth stops.
That means that for them ketogenesis stops at the same time brain growth stops.
In humans brain growth doesn’t stop at weaning [Ken2005], [Mar1982], [Dob1973], [Dek1978].
Even after the brain reaches full size in adolescence, it continues to change structurally well into adulthood.
However, quantitatively, this structural cost is very small compared to energy considerations [Kuz1998],
And so that hypothesis seems relatively weak on its own.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb-245.png
Another set of ideas comes from the metabolic effects we see in the lab and clinic.
Some of the strongest, most consistent effects we’ve seen therapeutically from ketogenic diets take place in the brain.
These are just a few metabolic changes relative to a high carb diet.
Each can have profound effects on the workings of the brain.
I do want to draw attention to the last one, about availability of arachidonic acid and DHA.
These are important for the brain, as they are major components of its phospholipids,
and they are subject to a lot of turnover.
Each of these effects has been proposed as a solution to the mystery of why a ketogenic diet treats epilepsy so effectively [Bou2007], [Nyl2009], [Mas2012], [deL2014].

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-26-1.png
But it’s not just epilepsy that ketosis is good for.
Epilepsy is just the condition with the most research, and the widest acknowledgment.
Other conditions for which at least some evidence supports improvement via a ketogenic diet
include neurological disabilities in cognition and motor control [Sta2012];
the benefit here may have to do with the proper maintenance of brain structures such as myelination
(recall the two phases: tear down damage, rebuild).
Survival after brain damage, whether the hypoxia of stroke or blows to the head, is improved in animal models [Sta2012].
There is even animal evidence that brain damage due to nerve gas is largely mitigated by being in a state of ketosis during the insult [Lan2011].
Again, this suggests a structural support and resilience provided by a ketogenic metabolism.
Resilience comes in part from not being as susceptible to damage in the first place,
and that could be from reduced oxidative stress when using ketones for fuel.
Ketogenic diets as a treatment for cancer are controversial, but some of the best evidence in support of it comes from glioblastomas.
See e.g. [Zuc2010], [Sch2012].
This could be due mostly to the lowered blood glucose slowing the rate of tumour development.
And to venture into an area less well studied, but of critical importance given the epidemic that would be more apparent were it less taboo,
there is preliminary evidence in the form of case studies that ketogenic diets may be promising treatments for many psychiatric illnesses too, for example, [Kra2009], [Phe2012].
Given that anticonvulsants are also used to treat bipolar, and the solid results of ketogenic diets on epilepsy, this may not be surprising.
Additionally, the enhanced availability of AA and DHA may play a crucial role,
because these fatty acids are critical for the brain, and dysregulation in their flux has been associated with bipolar disorder and schizophrenia.
See e.g. [McN2008] and [Pee1996].

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-27-1.png
I would almost like to call a ketogenic diet a brain-growth mimicking diet.
The question of how and why humans are so ketosis prone may lead to interesting new insights about us as a species.
We seem to avoid giving up ketosis as long as possible,
only halting it when we take in so much glucose exogenously that we have to store it.
It seems likely that it facilitated the evolution of our brains,
that organ that makes us so different from other animals that we sometimes forget we are animals.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-28-1.png
Returning to the importance of metabolic switching between glucose and ketone mode,
there seems to be a false dichotomy.
There is a stage that doesn’t usually come up in discussions of fed and fasted, and that’s the “postabsorptive” phase.
The absorptive phase on a high carb diet lasts about 4 hours.
That’s how long it takes to clear away the exogenous glucose.
Only after that can you start the postabsorptive phase,
marked by using glycogen as your source of blood sugar.
Other than overnight, SAD dieters typically don’t go more than 4 hours without eating, and so they never get very far into it.
But if you are on a protein and calorie sufficient very low carb diet, then even after eating, your glycogen stores don’t get that full in the first place.
I don’t know how long it takes to get from the meal to maximum glycogen storage,
but essentially, we should expect to reach a SAD dieter’s postabsorptive state almost immediately after a meal, and to be easily into the ketogenic zone every day.
You can accentuate this by demanding more energy between meals (exercise)
or eating less frequently, for example only once or twice a day.
Interestingly, this often naturally happens to ketogenic dieters.
( Graphic from [Cah2006] )

https://ketotic.mytimpani.co.uk/wp-content/uploads/2018/04/lcb18-29-1.png
On a high carb diet, you might need to fast to attain an enlightened brain state.
On a ketogenic diet, as a human, that doesn’t appear to be necessary.

References

In the interest of time, I did not do my usual practice of end-to-end citations.
I will probably return to fix that later!

[Ada1975] Adam, P. A., N. Räihä, E. L. Rahiala, and M. Kekomäki. “Oxidation of Glucose and D-B-OH-Butyrate by the Early Human Fetal Brain.” Acta Paediatrica Scandinavica 64, no. 1 (January 1975): 17–24.
[Akk2009] Akkaoui, Marie, Isabelle Cohen, Catherine Esnous, Véronique Lenoir, Martin Sournac, Jean Girard, and Carina Prip-Buus. “Modulation of the Hepatic Malonyl-CoA–carnitine Palmitoyltransferase 1A Partnership Creates a Metabolic Switch Allowing Oxidation of de Novo Fatty Acids1.” Biochemical Journal 420, no. 3 (June 15, 2009): 429–38. https://doi.org/10.1042/BJ20081932.
[Bou2007] Bough Kristopher J., and Rho Jong M. “Anticonvulsant Mechanisms of the Ketogenic Diet.” Epilepsia 48, no. 1 (January 4, 2007): 43–58. https://doi.org/10.1111/j.1528-1167.2007.00915.x.
[Bou1986] Bougneres PF, C Lemmel, P Ferré, and D M Bier. Ketone body transport in the human neonate and infant. J Clin Invest. 1986 Jan; 77(1): 42–48.
[Bre1981] Bremer, J. “The Effect of Fasting on the Activity of Liver Carnitine Palmitoyltransferase and Its Inhibition by Malonyl-CoA.” Biochimica Et Biophysica Acta 665, no. 3 (September 24, 1981): 628–31.
[Cah2006] Cahill, George F. “Fuel Metabolism in Starvation.” Annual Review of Nutrition 26, no. 1 (August 2006): 1–22. https://doi.org/10.1146/annurev.nutr.26.061505.111258.
[Cen2002] Center, S. Feline Hepatic Lipidosis. June 2002. Australian Veterinary Practitioner 32(2)
[Cot2013] Cotter, David G., Rebecca C. Schugar, and Peter A. Crawford. “Ketone Body Metabolism and Cardiovascular Disease.” American Journal of Physiology-Heart and Circulatory Physiology 304, no. 8 (April 15, 2013): H1060–76. https://doi.org/10.1152/ajpheart.00646.2012.
[Cra1941] Crandall, Lathan. A comparison of ketosis in man and dog. March 1, 1941 The Journal of Biological Chemistry. 138, 123-128.
[Cun2002] Cunnane, S. C., K. Musa, M. A. Ryan, S. Whiting, and D. D. Fraser. “Potential Role of Polyunsaturates in Seizure Protection Achieved with the Ketogenic Diet.” Prostaglandins, Leukotrienes, and Essential Fatty Acids 67, no. 2–3 (September 2002): 131–35.
[Cun2016] Cunnane SC, Courchesne-Loyer A, Vandenberghe C, et al. Can Ketones Help Rescue Brain Fuel Supply in Later Life? Implications for Cognitive Health during Aging and the Treatment of Alzheimer’s Disease. Frontiers in Molecular Neuroscience. 2016;9:53. doi:10.3389/fnmol.2016.00053.
[Dek1978] Dekaban AS. Changes in brain weights during the span of human life: relation of brain weights to body heights and body weights. Ann Neurol. 1978 Oct;4(4):345-56.
[deL2014] Lima, Patricia Azevedo de, Leticia Pereira de Brito Sampaio, and Nágila Raquel Teixeira Damasceno. “Neurobiochemical Mechanisms of a Ketogenic Diet in Refractory Epilepsy.” Clinics 69, no. 10 (October 2014): 699–705. https://doi.org/10.6061/clinics/2014(10)09.
[Dob1973] John Dobbing and Jean Sands. Quantitative growth and development of human brain. Arch Dis Child. 1973 Oct; 48(10): 757–767.
[Fos2004] Foster, Daniel W. “The Role of the Carnitine System in Human Metabolism.” Annals of the New York Academy of Sciences 1033, no. 1 (November 2004): 1–16. https://doi.org/10.1196/annals.1320.001.
[Fra2003] Fraser, D. D., S. Whiting, R. D. Andrew, E. A. Macdonald, K. Musa-Veloso, and S. C. Cunnane. “Elevated Polyunsaturated Fatty Acids in Blood Serum Obtained from Children on the Ketogenic Diet.” Neurology 60, no. 6 (March 25, 2003): 1026–29.
[Fue2004] Fuehrlein, Brian S., Michael S. Rutenberg, Jared N. Silver, Matthew W. Warren, Douglas W. Theriaque, Glen E. Duncan, Peter W. Stacpoole, and Mark L. Brantly. “Differential Metabolic Effects of Saturated versus Polyunsaturated Fats in Ketogenic Diets.” The Journal of Clinical Endocrinology and Metabolism 89, no. 4 (April 2004): 1641–45. https://doi.org/10.1210/jc.2003-031796.
[Gra1988] Grantham, B. D., and V. A. Zammit. “Role of Carnitine Palmitoyltransferase I in the Regulation of Hepatic Ketogenesis during the Onset and Reversal of Chronic Diabetes.” Biochemical Journal 249, no. 2 (January 15, 1988): 409–14. https://doi.org/10.1042/bj2490409.
[Gre2009] Greenberg, Cheryl R., Louise A. Dilling, G. Robert Thompson, Lorne E. Seargeant, James C. Haworth, Susan Phillips, Alicia Chan, et al. “The Paradox of the Carnitine Palmitoyltransferase Type Ia P479L Variant in Canadian Aboriginal Populations.” Molecular Genetics and Metabolism 96, no. 4 (April 2009): 201–7. https://doi.org/10.1016/j.ymgme.2008.12.018.
[Ken2005] Kennedy GE. From the ape’s dilemma to the weanling’s dilemma: early weaning and its evolutionary context. J Hum Evol. 2005 Feb;48(2):123-45. Epub 2005 Jan 18.
[Kos2013] Kossoff, Eric H., Mackenzie C. Cervenka, Bobbie J. Henry, Courtney A. Haney, and Zahava Turner. “A Decade of the Modified Atkins Diet (2003–2013): Results, Insights, and Future Directions.” Epilepsy & Behavior 29, no. 3 (December 2013): 437–42. https://doi.org/10.1016/j.yebeh.2013.09.032.
[Kra2009] Kraft, Bryan D., and Eric C. Westman. “Schizophrenia, Gluten, and Low-Carbohydrate, Ketogenic Diets: A Case Report and Review of the Literature.” Nutrition & Metabolism 6 (February 26, 2009): 10. https://doi.org/10.1186/1743-7075-6-10.
[Kra1974] Kraus H, Schlenker S, Schwedesky D. Developmental changes of cerebral ketone body utilization in human infants. Hoppe Seylers Z Physiol Chem. 1974 Feb;355(2):164-70.
[Kro1973] Kronfeld DS. Diet and the performance of racing sled dogs. J Am Vet Med Assoc. 1973 Mar 15;162(6):470-3.
[Kuz1998] Kuzawa, Christopher W. “Adipose Tissue in Human Infancy and Childhood: An Evolutionary Perspective.” American Journal of Physical Anthropology 107, no. S27 (January 1, 1998): 177–209. https://doi.org/10.1002/(SICI)1096-8644(1998)107:27+<177::AID-AJPA7>3.0.CO;2-B.
[Lan2011] Jeffrey L. Langston, Todd M. Myers Diet composition modifies the toxicity of repeated soman exposure in rats. Neurotoxicology. 2011 Jun;32(3):342-9. doi: 10.1016/j.neuro.2011.03.001. Epub 2011 Mar 17.
[Lem2012] Lemas, Dominick J., Howard W. Wiener, Diane M. O’Brien, Scarlett Hopkins, Kimber L. Stanhope, Peter J. Havel, David B. Allison, Jose R. Fernandez, Hemant K. Tiwari, and Bert B. Boyer. “Genetic Polymorphisms in Carnitine Palmitoyltransferase 1A Gene Are Associated with Variation in Body Composition and Fasting Lipid Traits in Yup’ik Eskimos.” Journal of Lipid Research 53, no. 1 (January 1, 2012): 175–84. https://doi.org/10.1194/jlr.P018952.
[Mar1982] Martin, Robert D. Human brain evolution in an ecological context. Fifty-second James Arthur lecture on the evolution of the human brain 1982
[Mas2012] Masino, Susan A., and Jong M. Rho. “Mechanisms of Ketogenic Diet Action.” In Jasper’s Basic Mechanisms of the Epilepsies, edited by Jeffrey L. Noebels, Massimo Avoli, Michael A. Rogawski, Richard W. Olsen, and Antonio V. Delgado-Escueta, 4th ed. Bethesda (MD): National Center for Biotechnology Information (US), 2012. http://www.ncbi.nlm.nih.gov/books/NBK98219/.
[Mat2018] Mattson, Mark P., Keelin Moehl, Nathaniel Ghena, Maggie Schmaedick, and Aiwu Cheng. “Intermittent Metabolic Switching, Neuroplasticity and Brain Health.” Nature Reviews Neuroscience 19, no. 2 (January 11, 2018): 63–80. https://doi.org/10.1038/nrn.2017.156.
[McN2008] McNamara, Robert K., Ronald Jandacek, Therese Rider, Patrick Tso, Kevin E. Stanford, Chang-Gyu Hahn, and Neil M. Richtand. “Deficits in Docosahexaenoic Acid and Associated Elevations in the Metabolism of Arachidonic Acid and Saturated Fatty Acids in the Postmortem Orbitofrontal Cortex of Patients with Bipolar Disorder.” Psychiatry Research 160, no. 3 (September 30, 2008): 285–99. https://doi.org/10.1016/j.psychres.2007.08.021.
[Mun2016] Muneta, Tetsua, Eri Kawaguchi, Yasushi Nagai, Momoyo Matsumoto, Koji Ebe, Hiroko Watanabe, Hiroshi Bando. “Ketone Body Elevation in Placenta, Umbilical Cord, Newborn and Mother in Normal Delivery.” Glycative Stress Research 2016; 3 (3): 133-140
[Nat2014] Nation, Judy, Maureen Humphrey, Mark MacKay, and Avihu Boneh. “Linear Growth of Children on a Ketogenic Diet: Does the Protein-to-Energy Ratio Matter?” Journal of Child Neurology 29, no. 11 (November 2014): 1496–1501. https://doi.org/10.1177/0883073813508222.
[Nyl2009] Nylen, Kirk, Sergei Likhodii, and W. McIntyre Burnham. “The Ketogenic Diet: Proposed Mechanisms of Action.” Neurotherapeutics 6, no. 2 (April 2009): 402–5. https://doi.org/10.1016/j.nurt.2009.01.021.
[Ont1980] Ontko, J. A., and M. L. Johns. “Evaluation of Malonyl-CoA in the Regulation of Long-Chain Fatty Acid Oxidation in the Liver. Evidence for an Unidentified Regulatory Component of the System.” Biochemical Journal 192, no. 3 (December 15, 1980): 959–62. https://doi.org/10.1042/bj1920959.
[Pee1996] Peet, M., J. D. Laugharne, J. Mellor, and C. N. Ramchand. “Essential Fatty Acid Deficiency in Erythrocyte Membranes from Chronic Schizophrenic Patients, and the Clinical Effects of Dietary Supplementation.” Prostaglandins, Leukotrienes, and Essential Fatty Acids 55, no. 1–2 (August 1996): 71–75.
[Per1966] Persson B, Gentz J. The pattern of blood lipids, glycerol and ketone bodies during neonatal period, infancy and childhood. Acta Paediatr Scand 1966 Jul;55(4):353-62
[Phe2012] James R. Phelps, Susan V. Siemers & Rif S. El-Mallakh The ketogenic diet for type II bipolar disorder. Neurocase: The Neural Basis of Cognition DOI: 10.1080/13554794.2012.690421
[Rid2013] Ridgway, S. H. (2013). A Mini Review of Dolphin Carbohydrate Metabolism and Suggestions for Future Research Using Exhaled Air. Frontiers in Endocrinology, 4, 152.
[Rom1981] Romsos DR , Palmer HJ , Muiruri KL , Bennink MR Influence of a low carbohydrate diet on performance of pregnant and lactating dogs. The Journal of Nutrition [01 Apr 1981, 111(4):678-689]
[San2015] Seizures in Dogs and Cats. Sean Sanders. John Wiley & Sons, Feb 9, 2015
[Sch2012] Scheck, Adrienne C., Mohammed G. Abdelwahab, Kathryn E. Fenton, and Phillip Stafford. “The Ketogenic Diet for the Treatment of Glioma: Insights from Genetic Profiling.” Epilepsy Research 100, no. 3 (July 2012): 327–37. https://doi.org/10.1016/j.eplepsyres.2011.09.022.
[Sha1985] Shambaugh GE 3rd. Ketone body metabolism in the mother and fetus. Fed Proc. 1985 Apr;44(7):2347-51.
[Sta2012] Stafstrom, Carl E., and Jong M. Rho. “The Ketogenic Diet as a Treatment Paradigm for Diverse Neurological Disorders.” Frontiers in Pharmacology 3 (April 9, 2012). https://doi.org/10.3389/fphar.2012.00059.
[Zuc2010] Zuccoli, Giulio, Norina Marcello, Anna Pisanello, Franco Servadei, Salvatore Vaccaro, Purna Mukherjee, and Thomas Seyfried. “Metabolic Management of Glioblastoma Multiforme Using Standard Therapy Together with a Restricted Ketogenic Diet: Case Report.” Nutrition & Metabolism 7, no. 1 (2010): 33. https://doi.org/10.1186/1743-7075-7-33.

Does a ketogenic diet confer the benefits of butyrate without the fibre?

Tenuous arguments from fibre apologists

According to many plant-eating enthusiasts, we must eat fibre to be healthy for the following reasons:

  • Fibre is the only way to get butyrate.
  • Butyrate prevents colon cancer.
  • Butyrate in the colon treats colitis.
  • Butyrate is the preferred fuel of the colonocyte, therefore it is essential.
  • Without butyrate your colon cells will die off.

Note:
These are not the only arguments people make for eating fibre.
These are only the reasons related to butyrate.

Of the above statements, only one of them seems well-justified to me, but it also seems irrelevant.
Let’s start from the end.

Without butyrate your colon cells will die off.

This idea (a quote from Wikipedia) seems to be an exaggerated interpretation of a study by Donohoe et al.
The authors are studying germ-free mice,
who don’t, of course, have bacteria synthesising butyrate.
They describe what looks to them like impaired colon cell energetics in the mice
and ultimately autophagy upregulation, meaning the cells are eating themselves.
They reverse these effects with butyrate.
I’ve already written about some of the curious paradoxes inherent in the study.
To summarise, other studies consistently find germ-free mice to be healthier than wild mice by many a measure,
including appearing to be more energetic, and living longer.
There seems to have been a conflation of cell energy with mitochondrial energy, because changes in mitochondrial density were never looked for.
So, I’m not convinced the butyrate made things better.
Likewise, the reported evidence of autophagy (increased autophagosomes attributed to upregulation of AMPK),
insofar as it indicates autophagy, could equally be a desirable result,
given the role of autophagy in maintaining healthy tissues.
See, e.g. [Miz2011].
Certainly unrestrained autophagy, with no homeostatic mechanism, would result in total loss of tissue,
but that doesn’t seem to happen with the germ-free mice.
Germ-free rodents have freakishly large caecums, and somewhat reduced small intestines,
but so far as I can tell, no colon abnormalities worth mentioning.
For an extensive review of the data already available in 1971 on germ-free animals,
including the structure and function of various organs, see
The gnotobiotic animal as a tool in the study of host microbial relationships.
In any case, if the colons of germ-free mice are at any disadvantage,
there are clearly more differences it might be attributable to than mere lack of butyrate.
Are there other reasons to worry about colon cells that don’t get any?

Butyrate is the preferred fuel of the colonocyte, therefore it is essential.

If you haven’t read my thoughts on the term “preferred”,
the point is that what a cell will consume first isn’t necessarily the fuel that is the healthiest,
though it certainly can be.
Other reasons could be to get rid of it,
or to access the metabolites.
I’m not really suggesting that butyrate is toxic to colon cells.
(Though as soon as that thought occurred to me I looked for evidence that it can be,
which, of course, there is [Pen2007].
Apparently it can accumulate
due to maldigestion or bacterial overgrowth
and cause serious epithelial damage. But I digress.)
All I’m saying is that habitual heavy use doesn’t imply something is needed.
The same argument has been made about glucose in the brain,
and we all know that the brain actually needs only a very small amount of glucose,
if β-hydroxybutyrate is in good supply.
It’s still possible that other fuels are as good or better than butyrate for the colonocyte.

Butyrate in the colon treats colitis.

Normally, colonocytes do metabolise butyrate, mostly into CO2 and ketone bodies,
but this is impaired in ulcerative colitis [Roe1980], [Roe1993], [Ahm2000], such that ketogenesis is inversely proportional to the severity of the disease [Roe1980].
This impairment may explain the mixed results in treatments involving butyrate.
Some researchers have tried to treat colitis by adding more butyrate for substrate, by enema.
Perhaps unsurprisingly, that has not met with much success. Or has it?
I read a somewhat confusing review [Mal2015]
in which several citations don’t appear to line up with the claims they are attached to,
including citing the same paper that I’ve cited above (Roe1980) as showing “that restoration of butyrate levels by intracolonic infusion treats UC”, which I can find no mention of in the paper,
and citing a single paper twice ([Ham2010]), once to say that enemas had very limited effect (which I think is correct) and once, later, to say it was a “well demonstrated” “cure”.
These are probably just simple citation errors, either on their part or mine.
There have been some successes using enemas,
but the results are mixed [Ham2008].
Insofar as there are successes, it is worth noting that the butyrate was taken in by rectal cells, not colon cells, and so the effect was post-absorptive.
In other words, it must have come systemically.
In fact, when the butyrate is applied directly to impaired cells it seems to worsen the situation.
These points are noted in the review, and motivate their own contribution.
The researchers used intraperitoneal injections of butyrate to apparently almost completely restore colonocyte integrity in rodent models of colitis.
At face value, this would suggest that it is not the butyrate that helped,
but a metabolite of butyrate, i.e. ketone bodies, since peritoneal injections normally pass through the liver [Tur2011].
If it’s systemic ketone bodies we want,
we know how to do that!
Also, this method is rarely used in humans, so it may not be easy to make any practical use of.
In any case, none of this would suggest that eating plant fibre will help colitis in any way,
given that the issue appears to depend on inability to use the butyrate.
Ulcerative colitis (UC) and Crohn’s disease (CD) constitute the inflammatory bowel diseases (IBD).
There is not clear evidence that fibre intake helps with IBD,
and in fact, “low residue” or “low fibre” diets are usually recommended (see below).
In case you were wondering, “residue” means anything that survives digestion,
and comes all the way through the intestines.
That includes fibre, but also
microorganisms, and secretions and cells shed from the alimentary tract.
While there are studies that support the benefit of fibre in IBD,
there are others showing harm.
The evidence is mixed enough to be called weak and inconclusive [Kap2016].
Anecdotes such as the “Crohn’s Carnivore” suggest a different solution might hold for some:

“Eight years ago I decided to eat nothing but meat for a year. Now I have a perfectly normal colon. If those two events are indeed correlated, and someone could figure out exactly how, a whole lot of people would be able to find relief from a terrible disease.”

That experience runs both with and possibly against current dietary guidelines for IBD.
In a 2011 review [Bro2011], the authors show that most guidelines advise low fibre intake, especially during flares.
Some also advise low fat intake, and in particular, to eat lean meat.
I’m not sure whether the Crohn’s Carnivore was eating lean or fatty meat during his year of healing.
At first blush, the low fat advisory looks like just another “extra-mile” kind of recommendation, in which guideline writers are throwing in other ideas about healthy diet for good measure.
However, they state that it comes from the reported reactions of some patients.
They also cite patient surveys which list meat as a provoking food in 25% of respondents.
(The most common response was vegetables, at 40%).
One wonders if there are conflations.
Later, the authors specifically say that there is little to support or refute a low fat recommendation.
Another anecdote, this time elevated to “case study” level, because physicians penned it, comes from the Evolutionary Medicine Working Group, in Budapest, Hungary [Tot2016].
They report complete resolution of symptoms in a child with Crohn’s and cessation of medications from an essentially meat-only diet.
The exception was that the patient was allowed some honey, but it was low enough that ketosis was maintained.
This was a 2:1 fat:protein diet, so definitely not low fat.
The child had previously tried low fat, low fibre, and several medications without improvement.
It is interesting to note that even one dose of “paleo approved” fibre caused a flare-up.

“Given the patient’s severe condition upon the first visit the paleolithic ketogenic diet was started in the strictest form thus containing no vegetables and fruits at all. Such a diet may first sound restrictive but our previous experience indicate that a full fat-meat diet is needed in the most severe cases of Crohn’s disease. In addition, our experience shows that even a single occasion of deviation from diet rules may result in lasting relapse. This was the case in the present patient too where breaking the strict rules (eating the “paleo cakes”) resulted in a thickening of the bowel wall. Based on our experience this is due to the components of the popular paleolithic diet including coconut oil, oil seeds and sugar alcohols which may trigger inflammation.”

In other words, a fibre-free ketogenic diet appears to help IBD more than a diet including fibre, even a ketogenic diet including fibre.

Butyrate prevents colon cancer.

The idea that butyrate might be protective against colon cancer seems to have started in the 1980s (see, e.g., [Sen2006]).
This area of research is extensive, and I am by no means an expert.
If you haven’t guessed, the claim that butyrate has a protective effect against colon cancer
is the one statement I think is entirely defensible.
It’s not known exactly how butyrate exerts its protective effects,
but some mechanisms held to be important are also induced by β-hydroxybutyrate.
For example, butyrate’s histone deacetylase (HDAC) inhibition is considered an important mechanism [Hin2002], [Blo2011].
β-Hydroxybutyrate is also an HDAC inhibitor [Shi2013].
Gpr109a receptor activation is a recently identified mechanism [Sin2014].
Gpr109a has many aliases, including hydroxycarboxylic acid receptor 2 (HCA2) or niacin receptor 1 (NIACR1), and HM74a/PUMA-G.
Gpr109a is activated by β-hydroxybutyrate [Tag2005], [Rah2014], [Gam2012].
It is sometimes simply called the β-hydroxybutyrate receptor.
In fact, the argument behind the relevance of the Gpr109a discovery is just as strong an argument for a ketogenic diet as for eating fibre!
(This sentence is incorrect. See Edit.)
That is, the researchers demonstrated that butyrate could substitute for niacin in activating these receptors,
and that just as niacin activation of Gpr109a in fat cells is protective of cardiovascular disease, it may also be in diseases of the colon,
and this argues for eating fibre to substitute for pharmacologic doses of niacin.
From a press release:

“We think mega-doses of niacin may be useful in the treatment and/or prevention of ulcerative colitis, Crohn’s disease, and colorectal cancer as well as familial adenomatous polyposis, or FAP, a genetic condition that causes polyps to develop throughout the gastrointestinal tract”

“Research teams at GlaxoSmithKline and the University of Heidelberg, Germany showed in 2003 that Gpr109a receptors on the surface of fat cells mediate the protective cardiovascular effect of niacin, including increasing good cholesterol, or HDL, while decreasing levels of disease-producing LDL. Their search for other activators identified butyrate, which led Ganapathy to find that not only is the Gpr109a receptor expressed on the surface of colon cells, but that with sufficient fiber intake, butyrate levels in the colon can activate it.”

[Edit] 2018-01-04:
A critic pointed out that the cell receptors for SCFAs are facing the lumen, and therefore argued that beta-hydroxybutyrate from the portal side would be irrelevant.
Indeed, the researchers using niacin also assume that it does not act systemically, but rather reaches the lumen because of the super-high doses.
So the statement I made above, about the argument for beta-hydroxybutyrate being equal to that for niacin, is not correct.
The argument still stands that the beta-hydroxybutyrate metabolites activating targets inside could be where the majority of the benefits of butyrate come from.
That is where the HDAC inhibition occurs and where the immune cell receptors are.
At least one research group agrees with my speculation that the interior metabolites may be important for the effect [Siv2017]
“As the cell-surface receptors for SCFAs are located on the lumen-facing apical membrane of colonic epithelial cells (see below), the luminal concentrations of these agonists are physiologically relevant. SCFAs are low-affinity agonists for these receptors, and the normal luminal concentrations of these bacterial metabolites are in the millimolar levels, sufficient to activate these receptors from the luminal side. However, some of the molecular targets for these metabolites are either inside the cells (e.g., HDACs) or on the surface of the immune cells located in the lamina propria. Therefore, concentrations of these metabolites inside the colonic epithelial cells and in the lamina propria are relevant to impact these molecular targets. The intracellular target HDAC is inhibited by butyrate and propionate at low micromolar concentrations. There are effective transport systems for SCFAs in the apical membrane of colonic epithelial cells (e.g., proton-coupled and sodium-coupled monocarboxylate transporters) [47], thus making it very likely for these SCFAs to reach intracellular levels sufficient to inhibit HDACs. Even though the luminal concentrations of SCFAs are in the millimolar range, it is unlikely that they reach lamina propria at significant levels to activate the cell-surface receptors present on the mucosal immune cells. These metabolites are present only at micromolar levels in the portal blood [57], indicating that they undergo robust metabolism inside the colonic epithelial cells. This raises the question as to the physiological relevance of these bacterial metabolites to the activation of the cell-surface SCFA receptors in immune cells located in the lamina propria. With regard to this issue, it is important to note that colonic epithelial cells are highly ketogenic; they use acetate and butyrate to generate the ketone body β-hydroxybutyrate [58]. This ketone body is released from the cells into portal blood. As β-hydroxybutyrate is 3–4 times more potent than butyrate in activating its receptor GPR109A, it can be speculated that the colon-derived ketone body is most likely involved in the activation of the SCFA receptor in mucosal immune cells.”
Moreover, see the preliminary systemic evidence below.

Interestingly, as in the case of colitis, colorectal cancer appears to involve a dysfunction in the ability to use butyrate.
Specifically, there are detrimental changes in membrane transport that reduce its entry into the cell [Gon2016].
Therefore, it’s unclear that once the disease process has begun, increased fibre intake will be of any use.
Beta-hydroxybutyrate in the bloodstream, however, might.
There is at least some preliminary evidence that butyrate in the bloodstream has similar effects on intestinal tissue as butyrate coming from the colon itself [Kor1990], [Rol1997], [Bar2004],
as does infusion of glutamine and acetoacetate, another ketone body [Rom1990].
Ketogenic diets do increase blood acetoacetate.
If bloodstream infusion of butyrate is as effective as absorption of butyrate in the intestines in protecting colon cells from degradation, then it seems reasonable to hypothesise that β-hydroxybutyrate in the bloodstream would also have this effect.

These common mechanisms suggest that much or even all of the benefits obtainable by butyrate are equally achievable simply through ketogenic diets, making additional butyrate in the context of a ketogenic diet potentially superfluous.

Fibre is the only way to get butyrate.

Even though it seems likely that a fibre-free ketogenic diet is not only sufficient for colon health,
but better for treating colon disease,
we might feel cautious about going without the butyrate from fibre,
given the dire pronouncements from nutritional scientists.
Is there any other way to get butyrate?
The most significant food source, butter, doesn’t give much.
Only about 3-4% of butter is butyric acid.
According to [Sen2006] we produce >200 mmol per day.
That would take about a pound of butter!
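Here is the arithmetic behind that estimate, as a quick check; the molar mass follows from butyric acid’s formula (C4H8O2), and 3.5% is taken as the midpoint of the 3-4% figure above.

```python
# Sanity check of the "about a pound of butter" estimate.

mmol_per_day = 200.0     # ">200 mmol per day" per [Sen2006]
molar_mass = 88.11       # g/mol for butyric acid (C4H8O2)
butter_fraction = 0.035  # midpoint of the 3-4% butyric acid figure

grams_butyrate = mmol_per_day / 1000 * molar_mass  # ~17.6 g/day
grams_butter = grams_butyrate / butter_fraction    # ~503 g/day
print(f"{grams_butyrate:.1f} g butyrate requires {grams_butter:.0f} g "
      f"of butter, ~{grams_butter / 453.6:.1f} lb")  # ~1.1 lb
```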
Stepping back, it should be obvious that carnivores such as felines and canines
provide an important source of data relevant to this question.
Carnivores have colons, and they are not normally in ketosis unless food is scarce.
Either their colons don’t need butyrate,
or they are getting sufficient butyrate from some other source.
As it happens, there are microbes that ferment amino acids into short chain fatty acids (SCFAs), including butyrate.
Carnivores are known to get “animal fibre” from their prey.
That is, amino acids from incompletely digested animal parts reach their colons and are fermented.
In particular, in cheetahs, casein, collagen, and glucosamine have been shown to result in butyrate production comparable to fructo-oligosaccharides [Dep2012].
Beyond poorly digested animal sourced fibre,
many amino acids are fermented into SCFAs, including butyrate [Ras1988],
and these amino acids are abundant in human intestines and colons and are fermented there [Vit2014], [Dai2015], [Nei2015], [Wie2017].
I was unable to determine how much butyrate this would account for.
I did find research comparing the SCFA levels produced in dogs under conditions of high fibre vs. meat alone,
showing that they produced almost as much VFA (volatile fatty acids, another term for SCFAs) in their colons eating meat alone [Ban1979].
In any case, we certainly do generate butyrate in the absence of dietary fibre.

In sum

Although many in the medical community consider butyrate an essential fuel for colon cells, there may be a parallel to glucose and brain cells, in that some or all of this functionality could be replaceable with β-hydroxybutyrate. This idea is supported by these observations:

  • Carnivores and even germ-free mice have intact, working colons without contributions from fibre-derived butyrate, so it stands to reason that humans may not need it either.
  • Although not discussed in this post, some recent societies thrived on animal-based diets with little and infrequent plant intake.
  • β-hydroxybutyrate triggers many of the same mechanisms that butyrate does, the very mechanisms thought to explain its role in preventing colon cancer and the intestinal degradation seen in diseased colons or the colons of those receiving reduced fibre diets to promote bowel rest.
  • β-hydroxybutyrate may even be the pathway through which butyrate exerts its beneficial effects, given that it is a direct metabolite of butyrate, and that systemic butyrate appears to be as effective as, or even more effective than, direct application of butyrate to the cells in treating colitis.
  • Even without eating fibre, our intestinal microbes produce butyrate from amino acids. If systemic ketone bodies supplant or even just reduce the need for butyrate, amino acid derived butyrate may supply this need, even if the quantities turn out to be less than we would get from fibre.

End-to-end citations

[Ahm2000] Evidence type: non-human animal experiment

Ahmad MS, Krishnan S, Ramakrishna BS, Mathan M, Pulimood AB, Murthy SN.
Gut. 2000 Apr;46(4):493-9.

“Abstract
“BACKGROUND/AIMS:
“Impaired colonocyte metabolism of butyrate has been implicated in the aetiopathogenesis of ulcerative colitis. Colonocyte butyrate metabolism was investigated in experimental colitis in mice.
“METHODS:
“Colitis was induced in Swiss outbred white mice by oral administration of 4% dextran sulphate sodium (DSS). Colonocytes isolated from colitic and normal control mice were incubated with [(14)C]butyrate or glucose, and production of (14)CO(2), as well as of intermediate metabolites (acetoacetate, beta-hydroxybutyrate and lactate), was measured. The effect of different substrate concentrations on oxidation was also examined.
“RESULTS:
“Butyrate oxidation (micromol/h per mg protein; mean (SEM)) was significantly reduced in DSS colitis, values on day 7 of DSS administration being 0.177 (0.007) compared with 0.406 (0.035) for control animals (p<0.001). Glucose oxidation (micromol/h per mg protein; mean (SEM)) on day 7 of DSS administration was significantly higher than in controls (0.06 (0.006) v 0.027 (0.004), p<0.001). Production of beta-hydroxybutyrate was decreased and production of lactate increased in DSS colitis compared with controls. Increasing butyrate concentration from 10 to 80 mM enhanced oxidation in DSS colitis (0.036 (0.002) to 0.285 (0.040), p<0.001), although it continued to remain lower than in controls. Surface and crypt epithelial cells showed similar ratios of butyrate to glucose oxidation. When 1 mM DSS was added to normal colonocytes in vitro, it did not alter butyrate oxidation. The initial histological lesion of DSS administration was very patchy and involved crypt cells. Abnormal butyrate oxidation became apparent only after six days of DSS administration, at which time histological abnormalities were more widespread.
“CONCLUSIONS:
“Colonocyte metabolism of butyrate, but not of glucose, is impaired in DSS colitis, and may be important in pathophysiology. Histological abnormalities preceded measurable defects in butyrate oxidation.”

[Ban1979] Evidence type: non-human animal experiment

Banta, C. A., Clemens, E. T., Krinsky, M. M., and Sheffy, B. E., 1979,
J. Nutr. 109:1592-1600.

“Two commercial type diets, one a cereal based dry food, the other a fortified all meat canned food, were fed to male and female adult beagle dogs to evaluate effects of diet on rate of digesta passage and organic acid concentration along the gastrointestinal tract. […] Concentrations of VFA were highest in the cecum and colon and were not significantly affected by diet.”

https://ketotic.mytimpani.co.uk/wp-content/uploads/2017/11/dog-vfa.jpg
“Symbols on the abscissa denote sections of tract as follows: cranial stomach (S1); caudal stomach (S2); proximal (SI1), middle (SI2) and distal (SI3) thirds of the small intestine; cecum (Ce); and proximal (C1) and distal (C2) halves of the colon (n = 3).”
[…]
“It was surprising to see high concentrations of VFA produced in the lower gut of dogs fed the meat diet. It was logical to assume that liver and muscle glycogen could serve as the fermentable substrate for lactate production in the stomach, but most of this should have been digested and absorbed by the small intestine. Another possible source of fermentable substance which could survive passage through the small intestine is the protein-polysaccharides of the connective tissue ground substance found in abundance in the meat by-products and whole ground chicken. The ground substance is made up of chondroitin sulfates and hyaluronic acid. The polysaccharide portion of these substances is composed of long chains of disaccharide units consisting of glucosamine or galactosamine and glucuronic acid. The linkages of these polysaccharides are not such that they can be cleaved by the endogenous digestive enzymes found in the gut but they could be split by microbial enzymes.”

[Bar2004] Evidence type: non-human animal experiment

Bartholome AL1, Albin DM, Baker DH, Holst JJ, Tappenden KA.
JPEN J Parenter Enteral Nutr. 2004 Jul-Aug;28(4):210-22; discussion 222-3.

“BACKGROUND:
“Supplementation of total parenteral nutrition (TPN) with a mixture of short-chain fatty acids (SCFA) enhances intestinal adaptation in the adult rodent model. However, the ability and timing of SCFA to augment adaptation in the neonatal intestine is unknown. Furthermore, the specific SCFA inducing the intestinotrophic effects and underlying regulatory mechanism(s) are unclear. Therefore, we examined the effect of SCFA supplemented TPN on structural aspects of intestinal adaptation and hypothesized that butyrate is the SCFA responsible for these effects.
“METHODS:
“Piglets (n = 120) were randomized to (1) control TPN or TPN supplemented with (2) 60 mmol/L SCFA (36 mmol/L acetate, 15 mmol/L propionate and 9 mmol/L butyrate), (3) 9 mmol/L butyrate, or (4) 60 mmol/L butyrate. Within each group, piglets were further randomized to examine acute (4, 12, or 24 hours) and chronic (3 or 7 days) adaptations. Indices of intestinal adaptation, including crypt-villus architecture, proliferation and apoptosis, and concentration of the intestinotrophic peptide, glucagon-like peptide-2 (GLP-2), were measured.
“RESULTS:
“Villus height was increased (p < .029) within 4 hours by supplemented TPN treatments. Supplemented TPN treatments increased (p < .037) proliferating cell nuclear antigen expression along the entire intestine. Indicative of an antiapoptotic profile, jejunal Bax:Bcl-w abundance was decreased (p = .033) by both butyrate-supplemented TPN treatments, and ileal abundance was decreased (p = .0002) by all supplemented TPN treatments, regardless of time. Supplemented TPN treatments increased (p = .016) plasma GLP-2 concentration at all time points.
“CONCLUSIONS:
“Butyrate is the SCFA responsible for augmenting structural aspects of intestinal adaptations by increasing proliferation and decreasing apoptosis within 4 hours postresection. The intestinotrophic mechanism(s) underlying butyrate’s effects may involve GLP-2. Ultimately, butyrate administration may enable an infant with short-bowel syndrome to successfully transition to enteral feedings by maximizing their absorptive area.”

[Bro2011] Evidence type: review

Brown AC, Rampertab SD, Mullin GE.
Expert Rev Gastroenterol Hepatol. 2011 Jun;5(3):411-25. doi: 10.1586/egh.11.29.

“In terms of existing guidelines for dietary modifications, three suggested limiting dairy if lactose intolerant, two suggested limiting excess fat, one indicated decreasing excess carbohydrates, and five suggested avoiding high-fiber foods, especially during flares. The question of whether or not to use probiotics continues to be debated.”
[…]
“Reducing high-fiber foods during symptoms appears to have generated the most support in the dietary guidelines. It may be important to communicate to IBD patients that high-fiber foods are not recommended, especially for those with CD, during flares or in the presence of active disease states, fistulas or strictures. There appears to be a tendency among the dietary guidelines to restrict foods such as raw fruits, raw vegetables, beans, bran, popcorn, seeds, nuts, corn hulls, whole grains, brown rice and wild rice. Although not mentioned, raw salads would also fall into this category.”
[…]
“Some patients with IBD react to excess dietary fat and perhaps this is where the recommendation is derived. Few research studies are available to support or refute such a recommendation. The topic needs further investigation because patients with malabsorption may be at risk of not obtaining their necessary essential fatty acids. Perhaps saturated fats should be limited, with more of an emphasis on more healthy fat intakes.”

[Dai2015] Evidence type: review

Zhaolai Dai, Zhenlong Wu, Suqin Hang, Weiyun Zhu, Guoyao Wu
MHR: Basic science of reproductive medicine, Volume 21, Issue 5, 1 May 2015, Pages 389–409

“Recent studies with the human colonic bacteria have shown that protein- and AA-fermenting bacteria are abundant and diverse in the colon. The abundance of the AA-fermenting bacteria in the large intestine is very high and their number can reach up to 10¹¹ per gram dry feces (Smith and Macfarlane, 1998). Using the traditional plate counting technique, the authors have also reported that the dominant bacterial species for the utilization of single AA or pairs of AA are very different. For instance, Clostridium bifermentans is the predominant bacteria for the utilization of lysine or proline, and pairs of AA (e.g. phenylalanine/leucine, isoleucine/tryptophan and alanine/glycine), whereas Peptostreptococcus spp. bacteria are predominant for the utilization of glutamate or tryptophan. Many species of bacteria utilize the same AA as substrates for growth (Smith and Macfarlane, 1998). Overall, bacteria belonging to the Clostridium spp. dominate in AA fermentation in the human large intestine, but other bacterial species, such as Fusobacterium spp., Bacteroides spp., Veillonella spp., Megasphaera elsdenii and Selenomonas ruminantium, may also be important for AA metabolism in the large intestine (Smith and Macfarlane, 1998; Dai et al., 2011).”

[Dep2012] Evidence type: non-human animal experiment

Depauw, S., G. Bosch, M. Hesta, K. Whitehouse-Tedd, W. H. Hendriks, J. Kaandorp, and G. P. J. Janssens. 2012.
J. Anim. Sci. 90:2540-2548. doi:10.2527/jas.2011-4377

“End-product profile per unit of OM differed among substrates (Table 3). The greatest total SCFA production was recorded for FOS (P < 0.05), followed by collagen, casein, and glucosamine (P < 0.05). The FOS and collagen showed comparable acetate production. Collagen not only had a high production of total SCFA but also resulted in a greater acetate to propionate ratio relative to all other substrates (8.41:1 for collagen and 1.67:1–2.97:1 for other substrates). Chicken cartilage and glucosamine-chondroitin produced similar total SCFA production, which was moderate compared with FOS (P < 0.05). Total SCFA production from incubated rabbit bone and skin was low (P < 0.05), whereas total SCFA production from rabbit hair was negligible and comparable with the negative control cellulose. Butyrate production was greatest for casein and glucosamine (P < 0.05). Incubation with casein resulted in the greatest total BCFA production (P < 0.05), which was more than double compared with all other substrates that had similar total BCFA production. Considerable variation in BCFA ratios was observed among substrates. In all animal substrates, isovalerate was the main BCFA, whereas fermentation of FOS, glucosamine, and glucosamine-chondroitin led to valerate as the main BCFA. The greatest amount of ammonia production was observed for casein, collagen, and rabbit bone (P < 0.05), whereas the least ammonia production was detected for FOS, cellulose, and rabbit hair (P < 0.05).”

[Gam2012] Evidence type: non-human animal and human cell in vitro experiments

Gambhir D, Ananth S, Veeranan-Karmegam R, et al.
Investigative Ophthalmology & Visual Science. 2012;53(4):2208-2217. doi:10.1167/iovs.11-8447.

“GPR109A is the G-protein–coupled receptor responsible for mediating the antilipolytic actions of niacin (nicotinic acid), a B-complex vitamin and also a drug used widely to lower blood lipid levels.1 β-hydroxybutyrate (β-HB) is the physiologic ligand for this receptor.2 GPR109A expression was initially thought to be limited to adipocytes, the cell type in which its antilipolytic functions are most warranted, and immune cells.3–5 Recent reports, however, have described expression of the receptor in a number of other cell types, including hepatocytes6 and epithelial cells of the small intestine and colon.7,8 In addition, we demonstrated GPR109A expression in the retinal pigment epithelium (RPE), localized specifically to the basolateral membrane.9 Although GPR109A is most noted functionally for its antilipolytic effects in adipocytes, recent studies suggest that activation of the receptor also is associated with novel immunomodulatory responses.10–12 We have characterized expression of GPR109A in RPE; however, the functional significance of receptor expression in this cell type remains unknown.”

[Gon2016] Evidence type: review

Pedro Gonçalves and Fátima Martel
Porto Biomedical Journal Volume 1, Issue 3, July–August 2016, Pages 83-91

“The most important molecular mechanisms involved in the anticarcinogenic effect of BT are dependent on its intracellular concentration (because HDAC expression is overregulated,41,42 while BT membrane receptors (GPR109A and GPR43) are silenced or downregulated in CRC34,38). So, knowledge on the mechanisms involved in its membrane transport is relevant to both its physiological and pharmacological benefits. Also, changes in transporter expression or function will have an obvious impact on the effect of BT, and therefore, knowledge on the regulation of its membrane transport seems particularly important.
[…]
“[D]ifferences in MCT1, SMCT1 and BCRP expression between normal colonocytes and tumoral cells contribute to the different effects of BT in these cells (‘the BT paradox’). More specifically, BT is transported into normal colonic epithelial cells by both MCT1 and SMCT1, but its intracellular concentration is kept low because it is efficiently metabolized and effluxed from these cells by BCRP-mediated transport. In contrast, colonic epithelial tumoral cells show a decrease in SMCT1 protein expression, and BT is taken up by these cells through MCT1. In these cells, BT accumulates intracellularly because it is inefficiently metabolized (due to the fact that glucose becomes the primary energy source of these cells) and because there is a reduction in BCRP expression.”

[Hin2002] Evidence type: human cell in vitro experiment

Brian F. Hinnebusch, Shufen Meng, James T. Wu, Sonia Y. Archer, and Richard A. Hodin
J. Nutr. May 1, 2002 vol. 132 no. 5 1012-1017

“The short-chain fatty acid (SCFA) butyrate is produced via anaerobic bacterial fermentation within the colon and is thought to be protective in regard to colon carcinogenesis. Although butyrate (C4) is considered the most potent of the SCFA, a variety of other SCFA also exist in the colonic lumen. Butyrate is thought to exert its cellular effects through the induction of histone hyperacetylation. We sought to determine the effects of a variety of the SCFA on colon carcinoma cell growth, differentiation and apoptosis. HT-29 or HCT-116 (wild-type and p21-deleted) cells were treated with physiologically relevant concentrations of various SCFA, and histone acetylation state was assayed by acid-urea-triton-X gel electrophoresis and immunoblotting. Growth and apoptotic effects were studied by flow cytometry, and differentiation effects were assessed using transient transfections and Northern blotting. Propionate (C3) and valerate (C5) caused growth arrest and differentiation in human colon carcinoma cells. The magnitude of their effects was associated with a lesser degree of histone hyperacetylation compared with butyrate. Acetate (C2) and caproate (C6), in contrast, did not cause histone hyperacetylation and also had no appreciable effects on cell growth or differentiation. SCFA-induced transactivation of the differentiation marker gene, intestinal alkaline phosphatase (IAP), was blocked by histone deacetylase (HDAC), further supporting the critical link between SCFA and histones. Butyrate also significantly increased apoptosis, whereas the other SCFA studied did not. The growth arrest induced by the SCFA was characterized by an increase in the expression of the p21 cell-cycle inhibitor and down-regulation of cyclin B1 (CB1). In p21-deleted HCT-116 colon cancer cells, the SCFA did not alter the rate of proliferation. These data suggest that the antiproliferative, apoptotic and differentiating properties of the various SCFA are linked to the degree of induced histone hyperacetylation. Furthermore, SCFA-mediated growth arrest in colon carcinoma cells requires the p21 gene.”
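
The pattern in this abstract is easy to lose in the prose: the effects track neither chain length alone nor concentration, but the degree of histone hyperacetylation each SCFA induces. The following small summary table in Python restates only what the abstract above reports:

```python
# Restatement of the findings quoted above:
# (carbons, histone hyperacetylation, cellular effect).
scfa_effects = {
    "acetate":    (2, "none",               "no appreciable effect"),
    "propionate": (3, "less than butyrate", "growth arrest, differentiation"),
    "butyrate":   (4, "strongest",          "growth arrest, differentiation, apoptosis"),
    "valerate":   (5, "less than butyrate", "growth arrest, differentiation"),
    "caproate":   (6, "none",               "no appreciable effect"),
}

for name, (carbons, acetylation, effect) in scfa_effects.items():
    print(f"C{carbons} {name}: hyperacetylation {acetylation}; {effect}")
```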

[Blo2011] Evidence type: in vitro experiments

Blouin, J.-M., Penot, G., Collinet, M., Nacfer, M., Forest, C., Laurent-Puig, P., Coumoul, X., Barouki, R., Benelli, C. and Bortoli, S. (2011)
Int. J. Cancer, 128: 2591–2601. doi:10.1002/ijc.25599

“Butyrate, a short-chain fatty acid produced by the colonic bacterial fermentation is able to induce cell growth inhibition and differentiation in colon cancer cells at least partially through its capacity to inhibit histone deacetylases. Since butyrate is expected to impact cellular metabolic pathways in colon cancer cells, we hypothesize that it could exert its antiproliferative properties by altering cellular metabolism. We show that although Caco2 colon cancer cells oxidized both butyrate and glucose into CO2, they displayed a higher oxidation rate with butyrate as substrate than with glucose. Furthermore, butyrate pretreatment led to an increased cell capacity to oxidize butyrate and a decreased capacity to oxidize glucose, suggesting that colon cancer cells, which are initially highly glycolytic, can switch to a butyrate utilizing phenotype, and preferentially oxidize butyrate instead of glucose as energy source to produce acetyl coA. Butyrate pretreated cells displayed a modulation of glutamine metabolism characterized by an increased incorporation of carbons derived from glutamine into lipids and a reduced lactate production. The butyrate-stimulated glutamine utilization is linked to pyruvate dehydrogenase complex since dichloroacetate reverses this effect. Furthermore, butyrate positively regulates gene expression of pyruvate dehydrogenase kinases and this effect involves a hyperacetylation of histones at PDK4 gene promoter level. Our data suggest that butyrate exerts two distinct effects to ensure the regulation of glutamine metabolism: it provides acetyl coA needed for fatty acid synthesis, and it also plays a role in the control of the expression of genes involved in glucose utilization leading to the inactivation of PDC.”

[Jas1985] Evidence type: armchair

Jass JR.
Med Hypotheses. 1985;18(2):113-118.

“Abstract
“Butyric acid has two contrasting functional roles. As a product of fermentation within the human colon, it serves as the most important energy source for normal colorectal epithelium. It also promotes the differentiation of cultured malignant cells. A switch from aerobic to anaerobic metabolism accompanies neoplastic transformation in the colorectum. The separate functional roles for n-butyrate may reflect the different metabolic activities of normal and neoplastic tissues. Relatively low intracolonic levels of n-butyrate are associated with a low fibre diet. Deficiency of n-butyrate, coupled to the increased energy requirements of neoplastic tissues, may promote the switch to anaerobic metabolism. The presence of naturally occurring differentiating agents, such as n-butyrate, may modify the patterns of growth and differentiation of gastrointestinal tumours.”

[Ham2008] Evidence type: review

Hamer HM, Jonkers D, Venema K, Vanhoutvin S, Troost FJ and Brummer R-J (2008)
Alimentary Pharmacology & Therapeutics, 27: 104–119. doi:10.1111/j.1365-2036.2007.03562.x

“Although some controlled studies with enemas containing butyrate or SCFA mixtures in UC patients did not find beneficial effects121 or only trends towards clinical improvement,46, 118, 119 various other studies revealed a significant improvement in clinical and inflammatory parameters.45, 115, 120, 124, 126 Studies in patients with diversion colitis reported inconsistent results with regard to improvement in clinical symptoms and inflammatory parameters in response to administration of mixtures of SCFAs vs. placebo.96, 114 Two other human intervention studies determined mucosal cell proliferation in patients after Hartmann’s procedure and found trophic effects of SCFA mixtures in the mucosa of the closed rectal and sigmoid segment.73, 116”
“The effects of butyrate containing enemas on radiation proctitis113, 117, 122, 125 and pouchitis123 have been studied in small groups and besides one report125 that showed that butyrate was an effective treatment of radiation proctitis, other studies did not report clear-cut beneficial effects of SCFA irrigation in these two patient groups.”

[Ham2010] Evidence type: human experiment

Hamer HM, Jonkers DM, Vanhoutvin SA, Troost FJ, Rijkers G, de Bruïne A, Bast A, Venema K, Brummer RJ.
Clin Nutr. 2010 Dec;29(6):738-44. doi: 10.1016/j.clnu.2010.04.002. Epub 2010 May 15.

“Abstract
“BACKGROUND & AIMS:
“Butyrate, produced by colonic fermentation of dietary fibers is often hypothesized to beneficially affect colonic health. This study aims to assess the effects of butyrate on inflammation and oxidative stress in subjects with chronically mildly elevated parameters of inflammation and oxidative stress.
“METHODS:
“Thirty-five patients with ulcerative colitis in clinical remission daily administered 60 ml rectal enemas containing 100mM sodium butyrate (n=17) or saline (n=18) during 20 days (NCT00696098). Before and after the intervention feces, blood and colonic mucosal biopsies were obtained. Parameters of antioxidant defense and oxidative damage, myeloperoxidase, several cytokines, fecal calprotectin and CRP were determined.
“RESULTS:
“Butyrate enemas induced minor effects on colonic inflammation and oxidative stress. Only a significant increase of the colonic IL-10/IL-12 ratio was found within butyrate-treated patients (p=0.02), and colonic concentrations of CCL5 were increased after butyrate compared to placebo treatment (p=0.03). Although in general butyrate did not affect colonic glutathione levels, the effects of butyrate enemas on total colonic glutathione appeared to be dependent on the level of inflammation.
“CONCLUSION:
“Although UC patients in remission were characterized by low-grade oxidative stress and inflammation, rectal butyrate enemas showed only minor effects on inflammatory and oxidative stress parameters.”
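
For a sense of scale, the absolute dose in this protocol is small and easy to compute from the quoted enema parameters. A minimal sketch; the molar mass of sodium butyrate is a standard value I am supplying, not a figure from the paper:

```python
# Dose per enema: 60 ml of 100 mM sodium butyrate (from the Methods above).
volume_l = 0.060                  # 60 ml
concentration_mol_per_l = 0.100   # 100 mM
mw_sodium_butyrate = 110.09       # g/mol for C4H7NaO2 (standard value, not from the paper)

mmol = volume_l * concentration_mol_per_l * 1000   # 6 mmol
grams = mmol / 1000 * mw_sodium_butyrate           # ~0.66 g
print(f"{mmol:.0f} mmol (~{grams:.2f} g) sodium butyrate per daily enema")
```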

[Kap2016] Evidence type: review

Gilaad G. Kaplan, MD, MPH, FRCPC
Clinical Gastroenterology and Hepatology, Volume 14, Issue 8, 1137–1139

“After reviewing the study from Brotherton et al and prior literature, information for patients with IBD on the effects of fiber on the risk of flaring is unclear. The current article adds to this discussion but does not definitively answer the question. Overall, the data suggest that in the absence of a known fibrostenotic stricture with obstructive symptoms, a high fiber diet is likely safe in patients with IBD and may impart a weak benefit. Yet, answering these clinically relevant questions with more confidence and detail is within our grasp. The advent of e-cohorts offers the potential to transform research in the future by allowing investigators to design cost-efficient Web-based clinical studies, particularly for interventional environmental clinical trials.”

[Kor1990] Evidence type: non-human animal experiment

Koruda MJ, Rolandelli RH, Bliss DZ, Hastings J, Rombeau JL, Settle RG.
Am J Clin Nutr. 1990 Apr;51(4):685-9.

“Abstract
“When enteral nutrition is excluded from animals maintained solely with total parenteral nutrition (TPN), atrophy of the intestinal mucosa is observed. Short-chain fatty acids (SCFAs) are produced in the colon by the fermentation of dietary carbohydrates and fiber polysaccharides and have been shown to stimulate mucosal-cell mitotic activity in the intestine. This study compared the effects of an intravenous and an intracecal infusion of SCFAs on the small-bowel mucosa. Rats received standard TPN, TPN with SCFAs (sodium acetate, propionate, and butyrate), TPN with an intracecal infusion of SCFAs, or rat food. After 7 d jejunal and ileal mucosal weights, DNA, RNA, and protein were determined. Standard TPN produced significant atrophy of the jejunal and ileal mucosa. Both the intracecal and intravenous infusion of SCFAs significantly reduced the mucosal atrophy associated with TPN. The intravenous and intracolonic infusion of SCFAs were equally effective in inhibiting small-bowel mucosal atrophy.”

[Mal2015] Evidence type: non-human animal experiment

Joshua J. Malago and Catherine L. Sangu
Zhejiang Univ Sci B. 2015 Mar; 16(3): 224–234. doi: 10.1631/jzus.B1400191

“Earlier studies that linked the development of UC and butyrate levels in the colon, observed that deficiency of butyrate leads to disease development and that restoration of butyrate levels by intracolonic infusion treats UC (Roediger, 1980). Since then, butyrate enemas have popularly been used as medicaments stemming from their potential to impart beneficial attributes to the colon. This potential involves an increase in mechanical strength of injured colonic mucosa to hasten the healing process (Bloemen et al., 2010; Mathew et al., 2010), suppression of IL-8 production by intestinal epithelial cells to protect against the inflammatory process (Malago et al., 2005), and clinical remission of UC by protecting against inflammatory and oxidative stress parameters of the disease (Hamer et al., 2010b). Much as butyrate tends to impart a protective effect, several authors have indicated failures or limited success of butyrate to relieve IBD patients (Harig et al., 1989; Sanderson, 1997; Hamer et al., 2010b).”

“Topical administration of butyrate to cure colitis has been fairly well demonstrated (Scheppach et al., 1992; Hamer et al., 2010a; 2010b). This is done mainly through intrarectal administration of enemas that contain butyrate. The procedure is one of the earliest approaches to treat UC even in patients who had been unresponsive to or intolerant of standard therapy (Scheppach et al., 1992). The intrarectally administered butyrate needs to be absorbed before it works. Normally butyrate absorption mainly occurs in proximal colon whose function is impaired during UC. This hinders absorption of topically administered butyrate and may not benefit UC patients. However, butyrate absorption in the colon can be increased by manipulating electrolyte composition in the rectal lumen (Holtug et al., 1995) since rectal butyrate absorption remains normal during UC (Hove et al., 1995). Thus, topical butyrate, given intrarectally in form of SB, plays a double role; firstly by employing sodium ions, it accelerates rectal absorption of SB and secondly, the absorbed butyrate imparts healing to the colonocytes. The end result is epithelial proliferation to restore the damaged epithelium, especially the lost colonic epithelial continuity.”

“We have demonstrated the potential of intraperitoneally administered butyrate to prevent the severity of AA-induced UC lesions. To the best of our knowledge, this finding has not been reported before. However, the systemic effect of butyrate to other body systems and organs has been reported. For instance, intraperitoneal injection of butyrate at 50–200 mg/kg body weight decreases gentamicin-induced nephrotoxicity in rats by enhancing renal antioxidant enzyme activity and expression of prohibitin protein (Sun et al., 2013). When given at 1200 mg/kg, intraperitoneal butyrate ameliorates an aging-associated deficit in object recognition memory in rats (Reolon et al., 2011). Silingardi et al. (2010) further demonstrated that chronic intraperitoneal administration of butyrate to long-term monocularly deprived adult rats causes a complete recovery of visual acuity. A more recent study has also reported that intraperitoneal injections of butyrate for 28 d to adult C57BL/6 mice prevent repressed contextual fear memory caused by isoflurane (Zhong et al., 2014). All these facts and our own study affirm that butyrate has a potential to impart protective roles to various body organs and systems through systemic administration.”

[Mil2017] Evidence type: non-human animal experiment

Miles JP, Zou J, Kumar MV, Pellizzon M, Ulman E, Ricci M, Gewirtz AT, Chassaing B.
Inflamm Bowel Dis. 2017;23(7):1133-1143.

“Abstract
“BACKGROUND:
“Lack of dietary fiber has been suggested to increase the risk of developing various chronic inflammatory diseases, whereas supplementation of diets with fiber might offer an array of health-promoting benefits. Consistent with this theme, we recently reported that in mice, compositionally defined diets that are made with purified ingredients and lack fermentable fiber promote low-grade inflammation and metabolic syndrome, both of which could be ameliorated by supplementation of such diets with the fermentable fiber inulin.
“METHODS:
“Herein, we examined if, relative to a grain-based mouse diet (chow), compositionally defined diet consumption would impact development of intestinal inflammation induced by dextran sulfate sodium (DSS) and moreover, whether DSS-induced colitis might also be attenuated by diets supplemented with inulin.
“RESULTS:
“Analogous to their promotion of low-grade inflammation, compositionally defined diet of high- and low-fat content with cellulose increased the severity of DSS-induced colitis relative to chow. However, in contrast to the case of low-grade inflammation, addition of inulin, but not the insoluble fiber cellulose, further exacerbated the severity of colitis and its associated clinical manifestations (weight loss and bleeding) in both low- and high-fat diets.
“CONCLUSIONS:
“While inulin, and perhaps other fermentable fibers, can ameliorate low-grade inflammation and associated metabolic disease, it also has the potential to exacerbate disease severity in response to inducers of acute colitis.”

[Miz2011] Evidence type: review

Mizushima N, Komatsu M.
Cell. 2011 Nov 11;147(4):728-41. doi: 10.1016/j.cell.2011.10.026.

“Autophagy is the major intracellular degradation system by which cytoplasmic materials are delivered to and degraded in the lysosome. However, the purpose of autophagy is not the simple elimination of materials, but instead, autophagy serves as a dynamic recycling system that produces new building blocks and energy for cellular renovation and homeostasis. Here we provide a multidisciplinary review of our current understanding of autophagy’s role in metabolic adaptation, intracellular quality control, and renovation during development and differentiation. We also explore how recent mouse models in combination with advances in human genetics are providing key insights into how the impairment or activation of autophagy contributes to pathogenesis of diverse diseases, from neurodegenerative diseases such as Parkinson disease to inflammatory disorders such as Crohn disease.”

[Nei2015] Evidence type: review

Neis EPJG, Dejong CHC, Rensen SS.
Nutrients. 2015;7(4):2930-2946. doi:10.3390/nu7042930.

“Although protein breakdown followed by amino acid absorption in the small intestine is a rather efficient process, substantial amounts of amino acids seem to escape assimilation in the small intestine in humans [38]. These amino acids can subsequently be used by the microbiota in the colon, or transported from the lumen into the portal blood stream. In addition, the host itself produces substrates such as glycoproteins (e.g., mucins) which contribute to the available amino acids within the colon [39]. ”
[…]
“Regarding the large intestine, it appears that amino acids are not significantly absorbed by the colonic mucosa, but rather are intensively metabolized by the large intestinal microbiota [23]. This higher rate of bacterial protein fermentation has been related to high pH and low carbohydrate availability in the large intestine [22]. The preferred amino acid substrates of colonic bacteria include lysine, arginine, glycine, and the BCAA leucine, valine, and isoleucine [32], resulting in the generation of a complex mixture of metabolic end products including among others ammonia, SCFA (acetate, propionate, and butyrate), and branched-chain fatty acids (BCFA; valerate, isobutyrate, and isovalerate). ”

[Pen2007] Evidence type: non-human animal experiment

Luying Peng, Zhenjuan He, Wei Chen, Ian R Holzman and Jing Lin
Pediatric Research (2007) 61, 37–41; doi:10.1203/01.pdr.0000250014.92242.f3

“In premature infants, the maturation of the intestinal barrier function does not develop properly in the absence of enteral nutrients (6). Intestinal barrier function is significantly less developed in full-term newborn piglets receiving total parental nutrition compared with those receiving enteral nutrition (7). Production of SCFA in the bowel may be crucial for gastrointestinal adaptation and maturation in the early stage of postnatal life (8). However, overproduction and/or accumulation of SCFA in the bowel due to maldigestion and bacterial overgrowth may be toxic to mucosal cells and cause intestinal mucosal injury (9,10). Overproduction and/or accumulation of SCFA in the bowel and inability to clear the intraluminal SCFA because of poor gastrointestinal motility in premature infants have been hypothesized to play a role in the pathogenesis of neonatal NEC (11).”

[Rah2014] Evidence type: non-human animal experiment

Rahman M, Muhammad S, Khan MA, Chen H, Ridder DA, Müller-Fielitz H, Pokorná B, Vollbrandt T, Stölting I, Nadrowitz R, Okun JG, Offermanns S, Schwaninger M.
Nat Commun. 2014 May 21;5:3944. doi: 10.1038/ncomms4944.

“Abstract
“The ketone body β-hydroxybutyrate (BHB) is an endogenous factor protecting against stroke and neurodegenerative diseases, but its mode of action is unclear. Here we show in a stroke model that the hydroxy-carboxylic acid receptor 2 (HCA2, GPR109A) is required for the neuroprotective effect of BHB and a ketogenic diet, as this effect is lost in Hca2(-/-) mice. We further demonstrate that nicotinic acid, a clinically used HCA2 agonist, reduces infarct size via a HCA2-mediated mechanism, and that noninflammatory Ly-6C(Lo) monocytes and/or macrophages infiltrating the ischemic brain also express HCA2. Using cell ablation and chimeric mice, we demonstrate that HCA2 on monocytes and/or macrophages is required for the protective effect of nicotinic acid. The activation of HCA2 induces a neuroprotective phenotype of monocytes and/or macrophages that depends on PGD2 production by COX1 and the haematopoietic PGD2 synthase. Our data suggest that HCA2 activation by dietary or pharmacological means instructs Ly-6C(Lo) monocytes and/or macrophages to deliver a neuroprotective signal to the brain.”

[Ras1988] Evidence type: in vitro experiment

Rasmussen HS, Holtug K, Mortensen PB.
Scand J Gastroenterol. 1988 Mar;23(2):178-82.

“Short-chain fatty acids (SCFA) originate mainly in the colon through bacterial fermentation of polysaccharides. To test the hypothesis that SCFA may originate from polypeptides as well, the production of these acids from albumin and specific amino acids was examined in a faecal incubation system. Albumin was converted to all C2-C5-fatty acids, whereas amino acids generally were converted to specific SCFA, most often through the combination of a deamination and decarboxylation of the amino acids, although more complex processes also took place. This study indicates that a part of the intestinal SCFA may originate from polypeptides, which apparently are the major source of those SCFA (isobutyrate, valerate, and isovalerate) only found in small amounts in the healthy colon. Moreover, gastrointestinal disease resulting in increased proteinous material in the colon (exudation, mucosal desquamation, bleeding, and so forth) may hypothetically influence SCFA production.”
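
To illustrate the deamination-plus-decarboxylation pattern the authors describe, the branched-chain amino acids map onto exactly the branched-chain fatty acids flagged above as markers of protein fermentation. The mapping below is the standard textbook one, offered as illustration; it is not data from this paper:

```python
# Classic amino acid -> branched-chain fatty acid conversions via
# deamination and decarboxylation (textbook mapping, not data from this study).
bcaa_to_bcfa = {
    "valine": "isobutyrate",           # C5 amino acid -> C4 branched SCFA
    "leucine": "isovalerate",          # C6 amino acid -> C5 branched SCFA
    "isoleucine": "2-methylbutyrate",  # C6 amino acid -> C5 branched SCFA
}

for amino_acid, bcfa in bcaa_to_bcfa.items():
    print(f"{amino_acid} -> {bcfa}")
```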

[Roe1980] (1, 2) Evidence type: human experiment

Roediger WE.
Lancet. 1980;2(8197):712-715.

“The view that UC might be due to a metabolic defect in the epithelial cells[5,6] has received little general recognition. The present study was undertaken to assess the metabolic performance of the mucosa in UC and especially to explore whether a metabolic abnormality could be detected. To facilitate this approach a method of preparing suspensions of colonocytes was devised.[7] Colonocytes have been used to determine the utilisation of respiratory fuels by the non-diseased ascending and descending colon in man.[8] The results showed that short-chain fatty acid (SCFA), especially n-butyrate of bacterial origin, was the predominant contributor to cellular oxidation and that a large proportion of the carbon atoms of colonocyte respiration was derived from SCFAs. Mucosa of the distal colon depended metabolically mostly on n-butyrate, whereas the proximal colonic mucosa depended mostly on glucose and glutamine for respiratory fuel.[8] These same respiratory fuels were chosen for the investigation of colonocytes prepared from the mucosa of patients with ulcerative colitis.

“Generation of 14CO2 from radioactively labelled butyrate was observed for at least 40 min. Production of 14CO2 was linear whenever this could be tested for 60 min. Generation of 14CO2 was significantly less in quiescent and acute-colitis cells than in controls (p = <0.001) (table II). Some of the oxidised butyrate appeared as ketone bodies (acetoacetate and β-hydroxybutyrate, table III). The diminished production of ketone bodies mirrors the decreased oxidation of butyrate to CO2. Ketogenesis was significantly lower in the quiescent-colitis group than the control group and lower still in the acute-colitis group.

“The metabolism of colonocytes from patients with UC seemed to differ in three respects from the metabolism of colonocytes prepared from non-ulcerated and apparently normal mucosa. In UC: 1. Butyrate oxidation to CO2 and ketone bodies was significantly impaired, and the impairment correlated with the acute or chronic involvement of the mucosa. 2. Glucose oxidation was increased. 3. Glutamine oxidation was increased.”

[Roe1993] Evidence type: non-human animal experiment

Roediger WE, Duncan A, Kapaniris O, Millard S.
Clin Sci (Lond). 1993 Nov;85(5):623-7.

“Abstract
“Isolated colonic epithelial cells of the rat were incubated for 40 min with [6-14C]glucose and n-[1-14C]butyrate in the presence of 0.1-2.0 mmol/l NaHS, a concentration range found in the human colon. Metabolic products, 14CO2, acetoacetate, beta-hydroxybutyrate and lactate, were measured and injury to cells was judged by diminished production of metabolites. 2. Oxidation of n-butyrate to CO2 and acetoacetate was reduced at 0.1 and 0.5 mmol/l NaHS, whereas glucose oxidation remained unimpaired. At 1.0-2.0 mmol/l NaHS, n-butyrate and glucose oxidation were dose-dependently reduced at the same rate. 3. To bypass short-chain acyl-CoA dehydrogenase activity necessary for butyrate oxidation, ketogenesis from crotonate was measured in the presence of 1.0 mmol/l NaHS. Suppression by sulphide of ketogenesis from crotonate (-10.5 +/- 6.1%) compared with control conditions was not significant, whereas suppression of ketogenesis from n-butyrate (-36.00 +/- 5.14%) was significant (P = < 0.01). Inhibition of FAD-linked oxidation was more affected by NaHS than was NAD-linked oxidation. 4. L-Methionine (5.0 mmol/l) significantly redressed the impaired beta-oxidation induced by NaHS. Methionine equally improved CO2 and ketone body production, suggesting a global reversal of the action of sulphide. 5. Sulphide-induced oxidative changes closely mirror the impairment of beta-oxidation observed in colonocytes of patients with ulcerative colitis. A hypothesis for the disease process of ulcerative colitis is that sulphides may form persulphides with butyryl-CoA, which would inhibit cellular short-chain acyl-CoA dehydrogenase and beta-oxidation to induce an energy-deficiency state in colonocytes and mucosal inflammation.”

[Rol1997] Evidence type: non-human animal experiment

Rolandelli RH, Buckmire MA, Bernstein KA.
Dis Colon Rectum. 1997 Jan;40(1):67-70.

“PURPOSE:
“Intracolonic infusions of short chain fatty acids promote healing of colonic anastomoses. Because the intravenous route may have wider clinical application, we studied the effect of intravenous n-butyrate on the mechanical strength of colonic anastomoses in the rat.
“METHODS:
“After placement of an indwelling intravenous catheter, the descending colon was transected and an anastomosis was performed. Rats were then randomized to receive total parenteral nutrition (TPN group; n = 15) or total parenteral nutrition plus 130 mM/l of n-butyrate (TPN+BUT group; n = 13). On the fifth postoperative day, bursting pressure and bowel wall tension of the anastomoses were measured in situ. Anastomotic tissues were analyzed for hydroxyproline.
“RESULTS:
“The TPN+BUT group had a significantly higher bursting pressure (107.5 +/- 30.3 vs. 83 +/- 41.0 mmHg; P = 0.04) and bowel wall tension (20.7 +/- 7.6 vs. 14.1 +/- 9.9 Newton; P = 0.03). Tissue hydroxyproline was not different between the two groups (TPN, 45.8 +/- 9.2, and TPN+BUT, 47.9 +/- 2.9 microg/mg tissue nitrogen).
“CONCLUSIONS:
“We conclude that intravenous butyrate improves mechanical strength of a colonic anastomosis without a detectable change in total collagen content.”
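
The relative improvements implied by the quoted means are worth spelling out; here is a minimal arithmetic check in Python, using only the values in the Results above:

```python
# Anastomotic strength on postoperative day 5 (means from the Results above).
bursting_tpn, bursting_butyrate = 83.0, 107.5   # mmHg
tension_tpn, tension_butyrate = 14.1, 20.7      # Newton

print(f"Bursting pressure: +{bursting_butyrate / bursting_tpn - 1:.0%} with IV butyrate")  # ~+30%
print(f"Bowel wall tension: +{tension_butyrate / tension_tpn - 1:.0%} with IV butyrate")   # ~+47%
# Hydroxyproline (collagen) did not differ, so the gain was not from extra collagen.
```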

[Rom1990] Evidence type: review

Rombeau J.L., Kripke S.A., Settle R.G. (1990)
In: Kritchevsky D., Bonfield C., Anderson J.W. (eds) Dietary Fiber. Springer, Boston, MA

“As mentioned previously hepatic metabolism of butyrate and acetate results in the production of glutamine and the ketone bodies acetoacetate and β-hydroxybutyrate, which are the preferred oxidative fuels of enterocytes (Windmueller and Spaeth, 1978). The enteral or parenteral provision of glutamine and acetoacetate has been shown to be trophic to both small and large intestinal mucosa (Fox et al., 1987; Kripke et al., 1988a).”

[Sen2006] (1, 2) Evidence type: review

Sengupta S, Muir JG, Gibson PR.
J Gastroenterol Hepatol. 2006 Jan;21(1 Pt 2):209-18.

“Abstract
“Butyrate, the four-carbon fatty acid, is formed in the human colon by bacterial fermentation of carbohydrates (including dietary fiber), and putatively suppresses colorectal cancer (CRC). Butyrate has diverse and apparently paradoxical effects on cellular proliferation, apoptosis and differentiation that may be either pro-neoplastic or anti-neoplastic, depending upon factors such as the level of exposure, availability of other metabolic substrate and the intracellular milieu. In humans, the relationship between luminal butyrate exposure and CRC has been examined only indirectly in case-control studies, by measuring fecal butyrate concentrations, although this may not accurately reflect effective butyrate exposure during carcinogenesis. Perhaps not surprisingly, results of these investigations have been mutually contradictory. The direct effect of butyrate on tumorigenesis has been assessed in a number of in vivo animal models, which have also yielded conflicting results. In part, this may be explained by methodological differences in the amount and route of butyrate administration, which are likely to significantly influence delivery of butyrate to the distal colon. Nonetheless, there appears to be some evidence that delivery of an adequate amount of butyrate to the appropriate site protects against early tumorigenic events. Future study of the relationship between butyrate and CRC in humans needs to focus on risk stratification and the development of feasible strategies for butyrate delivery.”

[Shi2013] Evidence type: non-human animal experiment

Shimazu T, Hirschey MD, Newman J, et al.
Science (New York, NY). 2013;339(6116):211-214. doi:10.1126/science.1227166.

“Concentrations of acetyl–coenzyme A and nicotinamide adenine dinucleotide (NAD+) affect histone acetylation and thereby couple cellular metabolic status and transcriptional regulation. We report that the ketone body d-β-hydroxybutyrate (βOHB) is an endogenous and specific inhibitor of class I histone deacetylases (HDACs). Administration of exogenous βOHB, or fasting or calorie restriction, two conditions associated with increased βOHB abundance, all increased global histone acetylation in mouse tissues. Inhibition of HDAC by βOHB was correlated with global changes in transcription, including that of the genes encoding oxidative stress resistance factors FOXO3A and MT2. Treatment of cells with βOHB increased histone acetylation at the Foxo3a and Mt2 promoters, and both genes were activated by selective depletion of HDAC1 and HDAC2. Consistent with increased FOXO3A and MT2 activity, treatment of mice with βOHB conferred substantial protection against oxidative stress.”

[Sin2014] Evidence type: non-human animal experiment

Singh N, Gurav A, Sivaprakasam S, et al.
Immunity. 2014;40(1):128-139.

“The most widely studied function of butyrate is its ability to inhibit histone deacetylases. However, cell surface receptors have been identified for butyrate; these receptors, GPR43 and GPR109A (also known as hydroxycarboxylic acid receptor 2 or HCA2), are G protein coupled and are expressed in colonic epithelium, adipose tissue, and immune cells (Blad et al., 2012, Ganapathy et al., 2013). GPR43-deficient mice undergo severe colonic inflammation and colitis in DSS-induced colitis model and the GPR43 agonist acetate protects germ-free mice from DSS-induced colitis (Maslowski et al., 2009). Although GPR43 is activated by all three SCFAs, GPR109A (encoded by Niacr1) is activated only by butyrate (Blad et al., 2012, Taggart et al., 2005). GPR109A is also activated by niacin (vitamin B3) (Blad et al., 2012, Ganapathy et al., 2013). In colonic lumen, butyrate is generated at high concentrations (10–20 mM) by gut microbiota and serves as an endogenous agonist for GPR109A (Thangaraju et al., 2009). We have shown that Gpr109a expression in colon is induced by gut microbiota and is downregulated in colon cancer (Cresci et al., 2010, Thangaraju et al., 2009). Gpr109a in immune cells plays a nonredundant function in niacin-mediated suppression of inflammation and atherosclerosis (Lukasova et al., 2011). Gut microbiota also produce niacin. Niacin deficiency in humans results in pellagra, characterized by intestinal inflammation, diarrhea, dermatitis, and dementia (Hegyi et al., 2004). It is of great clinical relevance that lower abundance of GPR109A ligands niacin and butyrate in gut is associated with colonic inflammation.”
[…]
“Activation of Gpr109a Suppresses Colonic Inflammation and Carcinogenesis in the Absence of Gut Microbiota or Dietary Fiber
“We then examined the relevance of niacin, a pharmacologic agonist for GPR109A, to colonic inflammation. For this, we first depleted gut microbiota with antibiotics, which reduces the production of butyrate, the endogenous agonist for GPR109A. Antibiotic treatment resulted in >300-fold reduction in aerobic and anaerobic bacterial counts in the stool (data not shown). Antibiotic treatment increased DSS-induced weight loss, diarrhea, and bleeding in WT mice (Figures 7B and S6A). Consistent with increased inflammation, we found that antibiotic treatment increased the number of polyps (8.2 ± 2.2 polyps/mouse with antibiotics; 1.6 ± 1.5 polyps/mouse without antibiotics) in WT mice (Figures 7C and 7D). We then tested whether administration of niacin protects antibiotic-treated mice against colonic inflammation and carcinogenesis. Niacin was added to drinking water along with antibiotic cocktail. Niacin ameliorated AOM+DSS-induced weight loss, diarrhea, and bleeding and reduced colon cancer development in antibiotic-treated WT mice (Figures 7B–7D and S6A). Consistent with a role of niacin in IL-18 induction, the protective effect of niacin in DSS-induced weight loss and diarrhea in antibiotic-treated Il18−/− mice was significantly blunted (Figure S6B). Niacin did not alter the development of weight loss, diarrhea, rectal bleeding, and colon cancer in antibiotic-treated Niacr1−/− mice, suggesting an essential role of Gpr109a in niacin-mediated promotion of colonic health (Figures 7B–7D and S6A). Antibiotic treatment reduced colonic inflammation and number of polyps in Niacr1−/− mice. This may be due to the presence of altered colitogenic gut microbiota in Niacr1−/− animals. ”
[…]
“Although it has been known for decades that the commensal metabolite butyrate suppresses inflammation and carcinogenesis in colon, the exact identity of molecular target(s) of butyrate in this process remained elusive. The present studies identify Gpr109a as an important mediator of butyrate effects in colon and also as a critical molecular link between colonic bacteria and dietary fiber and the host. These findings have important implications for prevention as well as treatment of inflammatory bowel disease and colon cancer and suggest that under conditions of reduced dietary fiber intake and/or decreased butyrate production in colon, pharmacological doses of niacin might be effective to maintain GPR109A signaling and consequently protect colon against inflammation and carcinogenesis.”
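
The polyp counts quoted above imply a large effect of microbiota depletion; a one-line arithmetic check (means only, variance ignored):

```python
# Polyps per wild-type mouse (means from the quote above).
with_antibiotics, without_antibiotics = 8.2, 1.6
print(f"Antibiotic-treated WT mice: {with_antibiotics / without_antibiotics:.1f}x as many polyps")  # ~5.1x
```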

[Siv2017] Evidence type: review

Cell-Surface and Nuclear Receptors in the Colon as Targets for Bacterial Metabolites and Its Relevance to Colon Health. <http://europepmc.org/articles/pmc5579649>
Sathish Sivaprakasam, Yangzom D. Bhutia, Sabarish Ramachandran, and Vadivel Ganapathy
Nutrients. 2017 August; 9(8): 856.

“As the cell-surface receptors for SCFAs are located on the lumen-facing apical membrane of colonic epithelial cells (see below), the luminal concentrations of these agonists are physiologically relevant. SCFAs are low-affinity agonists for these receptors, and the normal luminal concentrations of these bacterial metabolites are in the millimolar levels, sufficient to activate these receptors from the luminal side. However, some of the molecular targets for these metabolites are either inside the cells (e.g., HDACs) or on the surface of the immune cells located in the lamina propria. Therefore, concentrations of these metabolites inside the colonic epithelial cells and in the lamina propria are relevant to impact these molecular targets. The intracellular target HDAC is inhibited by butyrate and propionate at low micromolar concentrations. There are effective transport systems for SCFAs in the apical membrane of colonic epithelial cells (e.g., proton-coupled and sodium-coupled monocarboxylate transporters) [47], thus making it very likely for these SCFAs to reach intracellular levels sufficient to inhibit HDACs. Even though the luminal concentrations of SCFAs are in the millimolar range, it is unlikely that they reach lamina propria at significant levels to activate the cell-surface receptors present on the mucosal immune cells. These metabolites are present only at micromolar levels in the portal blood [57], indicating that they undergo robust metabolism inside the colonic epithelial cells. This raises the question as to the physiological relevance of these bacterial metabolites to the activation of the cell-surface SCFA receptors in immune cells located in the lamina propria. With regard to this issue, it is important to note that colonic epithelial cells are highly ketogenic; they use acetate and butyrate to generate the ketone body β-hydroxybutyrate [58]. This ketone body is released from the cells into portal blood. As β-hydroxybutyrate is 3–4 times more potent than butyrate in activating its receptor GPR109A, it can be speculated that the colon-derived ketone body is most likely involved in the activation of the SCFA receptor in mucosal immune cells.”

[Tag2005] Evidence type: in vitro non-human animal experiment

Taggart AK, Kero J, Gan X, Cai TQ, Cheng K, Ippolito M, Ren N, Kaplan R, Wu K, Wu TJ, Jin L, Liaw C, Chen R, Richman J, Connolly D, Offermanns S, Wright SD, Waters MG.
J Biol Chem. 2005 Jul 22;280(29):26649-52. Epub 2005 Jun 1.

“Here we show that the fatty acid-derived ketone body (d)-β-hydroxybutyrate ((d)-β-OHB) specifically activates PUMA-G/HM74a at concentrations observed in serum during fasting. Like nicotinic acid, (d)-β-OHB inhibits mouse adipocyte lipolysis in a PUMA-G-dependent manner and is thus the first endogenous ligand described for this orphan receptor. These findings suggest a homeostatic mechanism for surviving starvation in which (d)-β-OHB negatively regulates its own production, thereby preventing ketoacidosis and promoting efficient use of fat stores.”

[Tot2016] Evidence type: human case study

Tóth C, Dabóczi A, Howard M, Miller NJ, Clemens Z.
Int J Case Rep Images 2016;7(10):570–578.

“Given the ineffectiveness of standard therapies the parents of the child were seeking for alternative options. When we first met the patient he reported bilateral pain and swelling of the knee, frequent episodes of fever and night sweats as well as fatigue. He looked pale. We offered the paleolithic ketogenic diet along with close monitoring of the patient. The patient started the diet on 4 January 2015. The diet is consisting of animal fat, meat, offal and eggs with an approximate 2:1 fat : protein ratio. Red and fat meats instead of poultry as well as regular intake of organ meats from pork and cattle were encouraged. Grains, milk, dairy, refined sugars, vegetable oils, oilseeds, nightshades and artificial sweeteners were excluded. Small amount of honey was allowed for sweetening. The patient was not taking any supplements. Regular home monitoring of urinary ketones indicated sustained ketosis. Regular laboratory follow-up was used to monitor the course of the disease as well as for giving feedback how to fine tune the diet. The patient was under our close control and gave frequent feedbacks and so we could assess the level of dietary compliance. The patient maintained a high level dietary adherence on the long-term, yet on his birthday, he made a mistake: he has eaten two pieces of commercially available “paleo” cake which contained coconut oil, flour from oilseeds as well as sugar alcohol. Clinical consequences are discussed later. From July 2015 onwards he also consumed small amounts of vegetables and fruits. Given the persistence of certain alterations in laboratory values (mild anemia) on 10 November 2015, despite 10 months on the paleolithic ketogenic diet, we suggested to tighten the diet again. From this time on he did neither consume vegetables and fruits nor vegetable oil containing spices such as cumin and cinnamon.
“Discontinuing medication
“Within two weeks after diet onset the patient discontinued azathioprine, the only medicine he was taking at this time. Currently, he is without medicines for 15 months.
“Symptoms
“The frequent night sweats of the patient disappeared within three weeks after diet onset and thus his sleep improved significantly. The knee pains of the patient began to lessen at 4th week on the diet and completely disappeared by the third month. From this time onwards he regularly went to school by bike (20 km daily). He reported restored energy and increased physical and mental fitness. Although during the eight months before diet onset his weight was declining, following diet onset he began to gain weight. At diet onset his weight was 41 kg and was 152 cm tall (BMI = 17.7). At 12 months after diet onset, his height was 160 cm and he weighed 50 kg (BMI: 19.5). The change in his height and weight is depicted in Figure 5. At the time of writing the article he is on the diet for 15 months and is free of symptoms as well as side effects.”
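
The anthropometrics in the case report are internally consistent; a quick BMI check against the quoted weights and heights:

```python
# BMI = weight (kg) / height (m)^2, checked against the case report's figures.
def bmi(weight_kg: float, height_cm: float) -> float:
    return weight_kg / (height_cm / 100) ** 2

print(f"At diet onset:   {bmi(41, 152):.1f}")  # paper reports 17.7
print(f"12 months later: {bmi(50, 160):.1f}")  # paper reports 19.5
```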

[Tur2011] Evidence type: review

Patricia V Turner, Thea Brabb, Cynthia Pekow, and Mary Ann Vasbinder
J Am Assoc Lab Anim Sci. 2011 Sep; 50(5): 600–613.

“Intraperitoneal administration.
“Injection of substances into the peritoneal cavity is a common technique in laboratory rodents but rarely is used in larger mammals and humans. Intraperitoneal injection is used for small species for which intravenous access is challenging and it can be used to administer large volumes of fluid safely (Table 1) or as a repository site for surgical implantation of a preloaded osmotic minipump. Absorption of material delivered intraperitoneally is typically much slower than for intravenous injection. Although intraperitoneal delivery is considered a parenteral route of administration, the pharmacokinetics of substances administered intraperitoneally are more similar to those seen after oral administration, because the primary route of absorption is into the mesenteric vessels, which drain into the portal vein and pass through the liver.74 Therefore substances administered intraperitoneally may undergo hepatic metabolism before reaching the systemic circulation. In addition, a small amount of intraperitoneal injectate may pass directly across the diaphragm through small lacunae and into the thoracic lymph.”

[Vit2014] Evidence type: metagenomic analysis

Marius Vital, Adina Chuang Howe, and James M. Tiedje
mBio. 2014 Mar-Apr; 5(2): e00889-14. Published online 2014 April 22. doi: 10.1128/mBio.00889-14

“Diet is a major external force shaping gut communities (33). Good reviews of studies investigating the influence of diet on butyrate-producing bacteria exist (11 and 34) and suggest that plant-derived polysaccharides such as starch and xylan, as well as cross-feeding mechanisms with lactate-producing bacteria, are the main factors governing their growth. Our metagenomic analysis supports the acetyl-CoA pathway as the main pathway for butyrate production in healthy individuals (Fig. 4), implying that a sufficient polysaccharide supply is probably sustaining a well-functioning butyrate-producing community, at least in these North American subjects. However, the detection of additional amino acid-fed pathways, especially the lysine pathway, indicates that proteins could also play an important role in butyrate synthesis and suggests some flexibility of the microbiota to adapt to various nutritional conditions maintaining butyrate synthesis. Whether the prevalence of amino acid-fed pathway is associated with a protein-rich diet still needs to be assessed. It should be noted that those pathways are not restricted to single substrates, as displayed in Fig. 1, i.e., glutarate and lysine, but additional amino acids, such as aspartate, can be converted to butyrate via those routes as well (26). Furthermore, the acetyl-CoA pathway also can be supplied with substrates derived from proteins either by cross-feeding with the lysine pathway (as discussed above) or by direct fermentation of amino acids to acetyl-CoA (35). However, whereas diet-derived proteins are probably important for butyrate synthesis in the ileum, where epithelial cells use butyrate as a main energy source as well (36), it still needs to be assessed whether enough proteins reach the human colon to serve as a major nutrient source for microorganisms. Another possible colonic protein source could originate with lysed bacterial cells. Enormous viral loads have been detected in this environment, suggesting fast cell/nutrient turnover, which might explain the presence of corresponding pathways in both fecal isolates and metagenomic data (Fig. 1, 4, and 5). Detailed investigations of butyrate-producing communities in the colon of carnivorous animals will add additional key information on the role of proteins in butyrate production in that environment. It should be noted that diet provides only a part of the energy/carbon sources for microbial growth in the colon, since host-derived mucus glycans serve as an important nutrient source as well. Several butyrate-producing organisms do specifically colonize mucus (37), and for some, growth on mucus-derived substrates was shown (38). ”

[Wie2017] Evidence type: review

van der Wielen N, Moughan PJ, Mensink M.
J Nutr. 2017 Aug;147(8):1493-1498. doi: 10.3945/jn.117.248187. Epub 2017 Jun 14.

“Protein digestion and fermentation in the large intestine. Intact proteins that escape the small intestine or produced in the large intestine (mucus, cells, microbial proteins) are digested further in the large intestine by bacterial enzymes and the surviving pancreatic proteases and peptidases (35, 36). This protein degradation has been reported to be highest in the distal large intestine and is most likely related to the pH in the different regions (37). The digested proteins can be used by the microbiota, which produce several metabolites such as SCFAs, ammonia, and amines. These metabolites may be linked to several health outcomes (38).”
[…]
“The large intestine is important for whole-body protein and nitrogen metabolism, in particular via bacterial metabolism. Both small and large intestinal microbiota are capable of synthesizing AAs, and absorption of microbial AAs has been demonstrated to take place in the intestine.”

Ketogenic Diets, Vitamin C, and Metabolic Syndrome

This is an excerpt from today’s guest post on breaknutrition.com:
The Recommended Daily Allowances (RDA) for different nutrients were developed on Western diets, and therefore, high-carb diets. Given that a ketogenic metabolism uses different metabolic pathways and induces cascades of drastically different metabolic and physiological effects, it would be astonishing if any of the RDAs are entirely applicable as is.
One micronutrient that particularly warrants reassessment is vitamin C, because vitamin C is biochemically closely related to glucose. Most animals synthesize it themselves out of glucose. It shares cellular uptake transporters with glucose. Some argue that because we don’t make vitamin C, we need to ensure a large exogenous supply. I will argue the opposite: so long as we are eating a low-carb diet, we actually need less. On our way, we’ll briefly re-examine the relationship between vitamin C deficiency and insulin resistance.

End-to-End Citations:

[1]
Evidence type: review
Louis Rosenfeld
Clinical Chemistry Vol. 43, Issue 4 April 1997
“In 1911, Casimir Funk isolated a concentrate from rice polishings that cured polyneuritis in pigeons. He named the concentrate “vitamine” because it appeared to be vital to life and because it was probably an amine. Although the concentrate and other “accessory food substances” were not amines, the name stuck, but the final “e” was dropped.”
[2]
Evidence type: review
Drouin, Guy, Jean-Rémi Godin, and Benoît Pagé.
Current Genomics 12.5 (2011): 371–378. PMC. Web. 19 Dec. 2016.
“Vitamin C (ascorbic acid) plays important roles as an anti-oxidant and in collagen synthesis. These important roles, and the relatively large amounts of vitamin C required daily, likely explain why most vertebrate species are able to synthesize this compound. Surprisingly, many species, such as teleost fishes, anthropoid primates, guinea pigs, as well as some bat and Passeriformes bird species, have lost the capacity to synthesize it. Here, we review the genetic bases behind the repeated losses in the ability to synthesize vitamin C as well as their implications. In all cases so far studied, the inability to synthesize vitamin C is due to mutations in the L-gulono-γ-lactone oxidase (GLO) gene which codes for the enzyme responsible for catalyzing the last step of vitamin C biosynthesis. The bias for mutations in this particular gene is likely due to the fact that losing it only affects vitamin C production. Whereas the GLO gene mutations in fish, anthropoid primates and guinea pigs are irreversible, some of the GLO pseudogenes found in bat species have been shown to be reactivated during evolution. The same phenomenon is thought to have occurred in some Passeriformes bird species. Interestingly, these GLO gene losses and reactivations are unrelated to the diet of the species involved. This suggests that losing the ability to make vitamin C is a neutral trait.”
[3]
Evidence type: observation
Bánhegyi Gábor,Csala Miklós,Braun László,Garzó Tamás and Mandl József
FEBS Letters, 381, doi: 10.1016/0014-5793(96)00077-4 (1996)

“Ascorbic acid and glutathione are involved in the antioxidant defense of the cell. Their connections and interactions have been described from several aspects: they can substitute each other [1], dehydroascorbate can be reduced at the expense of GSH [2] and glutathione depletion results in the stimulation of ascorbate synthesis [3]. In ascorbate-synthesising animals, the formation of ascorbate from gulonolactone catalysed by microsomal gulonolactone oxidase is accompanied by the stoichiometric consumption of O2 and production of the oxidant hydrogen peroxide [4]. Metabolism of hydrogen peroxide by glutathione peroxidase requires reduced glutathione. Therefore, we supposed that synthesis of ascorbate should decrease the intracellular glutathione level. To prove our hypothesis, experiments were undertaken to investigate the effect of ascorbate synthesis stimulated by the addition of gulonolactone on the oxidation of GSH in isolated mouse hepatocytes and liver microsomal membranes.”

“In this paper, a new connection between ascorbate and GSH metabolism is described. Our data show that the synthesis of ascorbate leads to consumption of GSH, the other main intracellular antioxidant (Fig. 1). We suppose that the formation of hydrogen peroxide is underlying the increased GSH consumption. First, oxidation of GSH caused by increased ascorbate synthesis was prevented by the addition of catalase in microsomal membranes (Table 1). Second, inhibition of glutathione peroxidase by mercaptosuccinate moderated the gulonolactone-dependent glutathione consumption in microsomes (Table 2). Third, the inhibition of catalase by aminotriazole deepened the ascorbate synthesis-dependent GSH depletion in isolated hepatocytes (Table 3). This interaction may be one of the causes why primates and some other species have lost their ascorbate-synthesising ability. This event occurred in the ancestors of primates about 70 million years ago, owing to mutation(s) in the gulonolactone oxidase gene [14]. Despite the well-known benefits [15] of ascorbate, the mutation(s) had to be advantageous, as this metabolic error did not remain an enzymopathy affecting only a minority of the population, but spread widely amongst the species (and individuals) of primates and became exclusive [16]. There is no explanation for this unexpected outcome. Based on these analytical data, the following conceptual evolutionary hypothesis can be outlined: in the tropical jungle of the Cretaceous Period, when exogenous ascorbate was abundant [17,18], the loss of gulonolactone oxidase activity could have proved to be advantageous. It saved the reduced GSH, the main defence system against oxidants, while the access to ascorbate was not hindered. Later, the evolutionary gains of these periods allowed the conservation of the genetic disorder manifested in the loss of ascorbate synthesis.”
[4]
Evidence type: experiment

Abstract

Reactive oxygen species (ROS)-induced mitochondrial abnormalities may have important consequences in the pathogenesis of degenerative diseases and cancer. Vitamin C is an important antioxidant known to quench ROS, but its mitochondrial transport and functions are poorly understood. We found that the oxidized form of vitamin C, dehydroascorbic acid (DHA), enters mitochondria via facilitative glucose transporter 1 (Glut1) and accumulates mitochondrially as ascorbic acid (mtAA). The stereo-selective mitochondrial uptake of D-glucose, with its ability to inhibit mitochondrial DHA uptake, indicated the presence of mitochondrial Glut. Computational analysis of N-termini of human Glut isoforms indicated that Glut1 had the highest probability of mitochondrial localization, which was experimentally verified via mitochondrial expression of Glut1-EGFP. In vitro mitochondrial import of Glut1, immunoblot analysis of mitochondrial proteins, and cellular immunolocalization studies indicated that Glut1 localizes to mitochondria. Loading mitochondria with AA quenched mitochondrial ROS and inhibited oxidative mitochondrial DNA damage. mtAA inhibited oxidative stress resulting from rotenone-induced disruption of the mitochondrial respiratory chain and prevented mitochondrial membrane depolarization in response to a protonophore, CCCP. Our results show that analogous to the cellular uptake, vitamin C enters mitochondria in its oxidized form via Glut1 and protects mitochondria from oxidative injury. Since mitochondria contribute significantly to intracellular ROS, protection of the mitochondrial genome and membrane may have pharmacological implications against a variety of ROS-mediated disorders.
[5]
Evidence type: non-human animal experiment

“Effect of starvation and subsequent feeding. The effect of starvation was then investigated, and it appeared that a 24 hr. period of starvation was enough to decrease the synthesis of ascorbic acid (Table 2). Since Caputto et al. (1958) had shown that the maximum effect of vitamin-E deficiency on the synthesis of ascorbic acid was reached as shortly as 3-4 days after deprivation, the possibility was considered that the effect of starvation was actually due to lack of vitamin E. This was discounted by giving starved animals enough vitamin E to prevent formation of peroxides; there was no effect on the synthesis of ascorbic acid. The effect of starving was quickly reversed by feeding the rats again for 24 hr.”

“Effect of omission of carbohydrates from the diet and of administration of precursors: The effect of starvation could be attributed either to the stress or to the lack of some dietary components. A strong impairment of the synthesis of ascorbic acid was observed in rats given a carbohydrate-free diet for 24 hr., whereas values significantly higher but still below normal ones were obtained by giving this same diet for 6 days (Table 3). Rats on this ration had a lower content of ascorbic acid in the liver, but showed an enhanced excretion of ascorbic acid in the urine. Since carbohydrates are precursors of ascorbic acid in the rat, this observation led to the hypothesis of an adaptive response of the enzyme system to lack of substrates, and evidence was sought by giving glucuronolactone to rats. Administration of glucuronolactone did not affect the rate of synthesis in normal rats, but caused a moderate but significant enhancement in starved animals. However, a similar enhancement followed the administration of an equal amount of glucose. All rats receiving glucuronolactone had a higher liver content and an enhanced urinary excretion of ascorbic acid.”
[6]
Evidence type: non-human animal experiment
Braun L1, Garzó T, Mandl J, Bánhegyi G.
FEBS Lett. 1994 Sep 19;352(1):4-6.

“The role of the hepatic glycogen content in ascorbic acid synthesis was investigated in isolated mouse hepatocytes. The cells were prepared from fed or 48 h-starved mice and the ascorbic acid content was measured in the suspension (cells+medium). After 48 h starvation hepatocytes did not contain measurable amounts of glycogen. The initial concentration of ascorbic acid was lower in the suspension of glycogen-depleted hepatocytes compared to the fed controls (Fig. 1) and only a moderate synthesis could be observed under both nutritional conditions. The effects of dibutyryl cAMP and glucagon on ascorbate synthesis were examined. Glucagon or dibutyryl cyclic AMP caused a stimulation of ascorbic acid synthesis in hepatocytes from fed mice, while in hepatocytes from 48 h-starved animals ascorbic acid production was not increased significantly by the two agents (Fig. 1). The addition of glucose and gluconeogenic precursors to the incubation medium did not result in a significant increase in ascorbic acid production (Fig. 1). In another series of experiments glucose and ascorbic acid production of the cells was measured simultaneously. The rate of glucose production (in the absence of gluconeogenic precursors mainly via glycogenolysis) and ascorbic acid synthesis showed a close correlation (r = 0.9091) (Fig. 2). As ascorbic acid synthesis and glycogenolysis seemed to be connected, we examined the effect on ascorbic acid synthesis of various agents known to increase glycogenolysis. The α1 agonist phenylephrine, the protein phosphatase inhibitor okadaic acid and vasopressin all increased the rate of ascorbic acid production in isolated hepatocytes prepared from fed mice similarly to glucagon (Table 1).
“Glycogenolysis was stimulated by the in vivo addition of glucagon. Glucagon elevated the blood glucose level of mice by 50%; at the same time a more than fifteenfold increase of plasma ascorbic acid concentration could be observed (Table 2). The concentration of ascorbic acid in the liver was also increased, indicating a stimulated hepatic synthesis (Table 2).
“Discussion
Glycogen content is considered to be a sensitive marker showing the actual metabolic state of the liver. Observations described in this paper suggest that ascorbic acid synthesis in murine liver is tightly connected with the glycogen pool; the source of ascorbic acid is glycogen. The following results gained in isolated hepatocytes support this assumption: first, in hepatocytes isolated from glycogen-depleted animals the ascorbic acid level as well as the rate of synthesis is lower than that in hepatocytes from control fed mice (Fig. 1); second, different glycogen-mobilizing agents acting via different mechanisms enhance ascorbic acid production in hepatocytes from fed but not from fasted animals (Fig. 1, Table 1); third, addition of glucose to hepatocytes prepared from glycogen-depleted mice failed to increase the formation of ascorbic acid (Fig. 1). The results gained under in vitro conditions in isolated hepatocytes were confirmed by in vivo experiments: a single i.p. injection of glucagon elevated both the plasma and liver ascorbic acid levels within 15 min (Table 2).”

“The finding that the source of ascorbate production is glycogenolysis is in accordance with the fact that liver and kidney – the main sites of glycogen storage – are responsible for the ascorbic acid supply in most animal species [2]. The increased hepatic ascorbic acid production after glucagon administration can be explained as a compensatory mechanism of the missing intake of ascorbate, i.e. adaptation of ascorbic acid supply from external to internal sources. Considering the fifteenfold elevation of plasma ascorbate levels, in the light of recent findings concerning the effect of ascorbate on insulin secretion [18] and on the calcium channels in pancreatic beta cells [19] it might be also regarded as a possible intercellular messenger.”
[7]
Evidence type: experiment
Drew KL, Tøien Ø, Rivera PM, Smith MA, Perry G, Rice ME.
Comp Biochem Physiol C Toxicol Pharmacol. 2002 Dec;133(4):483-92.
“During hibernation plasma ascorbate concentrations ([Asc]p) were found to increase 3–5 fold in two species of ground squirrels, AGS and 13-lined ground squirrels (TLS; S. tridecemlineatus), and cerebral spinal fluid (CSF) ascorbate concentration ([Asc]CSF) doubled in AGS (CSF was not sampled in TLS) (Drew et al., 1999). During arousal, however, when oxygen consumption peaks and the generation of reactive oxygen species is thought to be maximal, plasma ascorbate concentrations progressively decrease to levels typical for euthermic animals (Fig. 3).”
[8]
Evidence type: observation
George Mann and Pamela Newton
Ann N Y Acad Sci. 1975 Sep 30;258:243-52.
“We have formulated two hypotheses. The first proposes that the transport of ascorbate across cell membranes may be impaired by glucose. The second proposes that the transport of ascorbate in certain tissues is facilitated by insulin. If either hypothesis is valid, those species requiring exogenous ascorbate would be in double jeopardy if they were also hyperglycemic. Carbohydrate intolerance resulting from either a lack of or a resistance to insulin is common in Western man. Gore et al. have shown with electron microscopy that the vascular lesion of scurvy involves collagenous structures in the basement membranes, and this is also the site of the lesion in diabetic microangiopathy. These hypotheses, which propose that the intracellular availability of dehydroascorbate (DHA), the transportable form of vitamin C, would be impaired in certain tissues by either hyperglycemia or lack of insulin, suggest that diabetic microangiopathy, the main complication of human diabetes, may be a consequence of local ascorbate deficiency. The laboratory investigations described here deal with the first and somewhat simpler of these hypotheses: Glucose will impair the transport of dehydroascorbate into cells. The data collected show that D-glucose does inhibit the transport of dehydroascorbate into human red blood cells, a noninsulin-dependent tissue. Trials with other sugars show a hierarchy of sugars that inhibit transport, suggesting that DHA and D-glucose share a carrier mechanism.”
[9]
Evidence type: review

“Hyperglycemia-induced ascorbic acid deficiency
Vitamin C is a derivative of glucose and Mann [138] proposed that the structural similarity between these two molecules may account for many of the complications of diabetes. Glucose has been shown to inhibit vitamin C transport in several mesenchymal cell types, including endothelial cells [139], mononuclear leukocytes [140], neutrophils [141,142], fibroblasts [143,144], and erythrocytes [145]. Facilitative glucose transporters (GLUTs) bind dehydroascorbic acid and are thought to be the primary transporters of vitamin C in mammalian cells [146]. After transport, dehydroascorbic acid is quickly reduced to ascorbic acid. Glucose competitively inhibits the uptake of dehydroascorbic acid but does not affect ascorbic acid transport. Ascorbic acid is transported by a family of membrane-bound proteins that are Na+-dependent and whose function is not directly inhibited by elevated extracellular concentrations of glucose [146,147]. This latter system is prevalent in bulk-transporting epithelia (e.g. kidney and small intestine) and have been recently isolated in both human [148] and rat [149] biological systems. Many cell types, of course, [150,151] express both transport systems.
High blood glucose concentrations mimic the conditions of vitamin C deficiency. Acute hyperglycemia, for example, impairs endothelium-dependent vasodilation in healthy humans [152], an effect which can be reversed by acute administration of vitamin C [153]. Ascorbic acid plays an important role in extracellular matrix regulation and has a stimulatory effect on sulfate incorporation in mesangial cell and matrix proteoglycans; high glucose concentrations have been shown to impair this effect [154]. Endothelial surface proteoglycans help prevent thrombus formation and also inhibit smooth-muscle growth [1]. High glucose concentrations also have been shown to inhibit the stimulatory effect of ascorbic acid on collagen and proteoglycan synthesis in cultured fibroblasts [114]. Moreover, a high concentration of glucose can induce the expression of intercellular adhesion molecule-1 (ICAM-1) in human umbilical vein endothelial cells [155]. Endothelial cells express these and other membrane-bound proteins to enable leukocyte adhesion and transmigration across the endothelium during an inflammatory response. Atherosclerosis is one such inflammatory response.

Experimental and clinical studies suggest that latent scurvy is characterized by IGT [16,24] and diabetes mellitus is a disease complex characterized by impaired glucose and vitamin C metabolism [27,28]. Diabetic patients are prone to hyperglycemia, prolonged wound healing, infection, increased synthesis of cholesterol, decreased liver glycogen, and notably, diffuse vascular disease. All of these findings are consistent with latent scurvy [16]. Diabetic platelets have been shown to have low intracellular ascorbic acid concentrations and display hypercoagulability [156]. Long-term vitamin C administration has beneficial effects on glucose and lipid metabolism in aged NIDDM patients [157]. It has also been suggested that vitamin C consumption above the RDA may provide important health benefits for individuals with IDDM [158]. This latter recommendation is supported by recent evidence. For example, mesenchymal cells from patients with IDDM have an impaired uptake of dehydroascorbic acid that persists in culture [159] and ascorbic acid has been shown to prevent the inhibition of DNA synthesis induced by high glucose concentrations in cultured endothelial cells [160]. Diabetic patients have been observed to have a lowered ascorbic acid/dehydroascorbic acid plasma ratio, indicating a decreased vitamin C status [161]. Therefore, diabetic patients may benefit from vitamin C supplementation to alleviate multiple physiologic and metabolic impairments in a variety of cell types.”
[10]
Evidence type: review
Padayatty SJ, Katz A, Wang Y, Eck P, Kwon O, Lee JH, Chen S, Corpe C, Dutta A, Dutta SK, Levine M.
J Am Coll Nutr. 2003 Feb;22(1):18-35.

“Problems in Demonstrating Antioxidant Benefit of Vitamin C in Clinical Studies

“Despite epidemiological and some experimental studies, it has not been possible to show conclusively that higher than anti-scorbutic intake of vitamin C has antioxidant clinical benefit. This is despite the fact that vitamin C is a powerful antioxidant in vitro. It is of course possible that the lack of antioxidant effect of vitamin C in clinical studies is real. It seems more likely that vitamin C has antioxidant or other benefits. Detection of these benefits has remained elusive due to the vicissitudes of experimental design. Vitamin C may be a weak antioxidant in vivo, or its antioxidant actions may have no physiological role, or its role may be small. The oxidative hypothesis is unproven, and oxidative damage may have a smaller role than anticipated in some diseases. Further, antioxidant actions of vitamin C may occur at relatively low plasma vitamin C concentrations. Thus additional clinical benefits that occur at higher vitamin C concentrations may be difficult to demonstrate. Although all these are possible explanations, it seems unlikely that these are the real reasons for the lack of detectable effects of vitamin C in clinical studies. Many factors may contribute to the failure so far to demonstrate clear antioxidant benefits of vitamin C in clinical studies. The antioxidant actions of vitamin C may be specific to certain reactions or occur only at specific locations. In either case, beneficial effects can be shown only in disorders where such reactions or sites are the focus of disease process. There may be many different antioxidants that are active at the same time. In the face of such redundancy, only multiple antioxidant deficiencies will have detectable clinical effects. Antioxidant deficiency may have to be of long duration for accumulated damage to be noticeable. Antioxidant effects may be of importance only in those with oxidant stress. Thus, normal subjects or those with mild disease may have no need for high antioxidant concentrations. In a way analogous to the effect of acetaminophen on fever, antioxidants may have no effect in the absence of marked oxidant stress. A further problem is presented by the sigmoidal dose concentration curve for vitamin C. Small changes in oral intake of vitamin C produce large changes in plasma vitamin C concentrations. This makes it difficult to conduct controlled studies such that the plasma vitamin C concentrations of the control and study groups differ sufficiently to have physiological meaning.”
[11]
Evidence type: review
Padayatty SJ, Levine M
Oral Dis. 2016 Sep;22(6):463-93. doi: 10.1111/odi.12446. Epub 2016 Apr 14.

(Emphasis mine)
“Collagen hydroxylation

“Common symptoms of scurvy include wound dehiscence, poor wound healing and loosening of teeth, all pointing to defects in connective tissue (Crandon et al, 1940; Lind, 1953; Hirschmann and Raugi, 1999). Collagen provides connective tissue with structural strength. Vitamin C catalyzes enzymatic (Peterkofsky, 1991) posttranslational modification of procollagen to produce and secrete adequate amounts of structurally normal collagen by collagen producing cells (Kivirikko and Myllyla, 1985; Prockop and Kivirikko, 1995). Precollagen, synthesized in the endoplasmic reticulum, consists of amino acid repeats rich in proline. Specific prolyl and lysyl residues are hydroxylated, proline is converted to either 3-hydroxyproline or 4-hydroxyproline, and lysine is converted to hydroxylysine. The reactions catalyzed by prolyl 3-hydroxylase, prolyl 4-hydroxylase, and lysyl hydroxylase (Peterkofsky, 1991; Prockop and Kivirikko, 1995; Pekkala et al, 2003) require vitamin C as a cofactor. Hydroxylation aids in the formation of the stable triple helical structure of collagen, which is transported to the Golgi apparatus and eventually secreted by secretory granules. In the absence of hydroxylation, secretion of procollagen decreases (Peterkofsky, 1991) and it probably undergoes faster degradation. However, some hydroxylation can occur even in the absence of vitamin C (Parsons et al, 2006). Secreted procollagen is enzymatically cleaved to form tropocollagen that spontaneously forms collagen fibrils in the extracellular space. These fibrils form intermolecular collagen cross-links, giving collagen its structural strength. Independent of its effects on hydroxylation, ascorbate may stimulate collagen synthesis (Geesin et al, 1988; Sullivan et al, 1994). Collagen synthesis may be decreased in scorbutic animals (Peterkofsky, 1991; Kipp et al, 1996; Tsuchiya and Bates, 2003). Reduced collagen cross-links may be a marker of vitamin C deficiency in the guinea pig (Tsuchiya and Bates, 2003) but this may not be specific to vitamin C deficiency. Although many features of human scurvy appear to be due to weakening of connective tissue, it has not been shown that these lesions are due to defective collagen synthesis.”
[12]
Evidence type: non-human animal experiment
J Mårtensson, J Han, O W Griffith, and A Meister
Proc Natl Acad Sci U S A. 1993 Jan 1; 90(1): 317–321.
“Guinea pigs given an ascorbate-deficient diet gained weight through day 14, but gained at a slower rate than the control animals, and then lost weight (Table 1, group A). The animals given GSH ester (group B) gained more weight than those of group A, and the weight gain during days 10-14 was ≈70% of the control group. Animals in group A became obviously sick after about day 17. They could not walk and moved very little, apparently immobilized by fractures of the hind legs and by swelling of the joints of the extremities, which were tender and had periosteal hematomas. Radiography showed major fractures of the femur in two animals. Animals in group A died or were sacrificed on day 21 or 22. Animals in group B (GSH ester) did not have fractures or hematomas; 75% of these animals were indistinguishable by general appearance from controls. Histological study showed significant loss of osteoid material from long bones in group A, whereas most animals in group B had no decrease of osteoid material (Fig. 1) or only a moderate decrease. In a separate experiment, several animals comparable to those of group B were kept for 40 days and showed no significant signs of scurvy (tender swollen joints, fractures); they exhibited some weight loss.”
[13]
Evidence type: in vitro experiment
Li X, Qu ZC, May JM.
Antioxid Redox Signal. 2001 Dec;3(6):1089-97.

“Abstract

“Liver is the site of ascorbic acid synthesis in most mammals. As human liver cannot synthesize ascorbate de novo, it may differ from liver of other species in the capacity or mechanism for ascorbate recycling from its oxidized forms. Therefore, we compared the ability of cultured liver-derived cells from humans (HepG2 cells) and rats (H4IIE cells) to take up and reduce dehydroascorbic acid (DHA) to ascorbate. Neither cell type contained appreciable amounts of ascorbate in culture, but both rapidly took up and reduced DHA to ascorbate. Intracellular ascorbate accumulated to concentrations of 10-20 mM following loading with DHA. The capacity of HepG2 cells to take up and reduce DHA to ascorbate was more than twice that of H4IIE cells. In both cell types, DHA reduction lowered glutathione (GSH) concentrations and was inhibited by prior depletion of GSH with diethyl maleate, buthionine sulfoximine, and phenylarsine oxide. NADPH-dependent DHA reduction due to thioredoxin reductase occurred in overnight-dialyzed extracts of both cell types. These results show that cells derived from rat liver synthesize little ascorbate in culture, that cultured human-derived liver cells have a greater capacity for DHA reduction than do rat-derived liver cells, but that both cell types rely largely on GSH- or NADPH-dependent mechanisms for ascorbate recycling from DHA.”
[14]
Evidence type: non-human animal experiment
Jarrett SG, Milder JB, Liang LP, Patel M.
J Neurochem. 2008 Aug;106(3):1044-51. doi: 10.1111/j.1471-4159.2008.05460.x. Epub 2008 May 5.

“Abstract

“The ketogenic diet (KD) is a high-fat, low carbohydrate diet that is used as a therapy for intractable epilepsy. However, the mechanism(s) by which the KD achieves neuroprotection and/or seizure control are not yet known. We sought to determine whether the KD improves mitochondrial redox status. Adolescent Sprague-Dawley rats (P28) were fed a KD or control diet for 3 weeks and ketosis was confirmed by plasma levels of beta-hydroxybutyrate (BHB). KD-fed rats showed a twofold increase in hippocampal mitochondrial GSH and GSH/GSSG ratios compared with control diet-fed rats. To determine whether elevated mitochondrial GSH was associated with increased de novo synthesis, the enzymatic activity of glutamate cysteine ligase (GCL) (the rate-limiting enzyme in GSH biosynthesis) and protein levels of the catalytic (GCLC) and modulatory (GCLM) subunits of GCL were analyzed. Increased GCL activity was observed in KD-fed rats, as well as up-regulated protein levels of GCL subunits. Reduced CoA (CoASH), an indicator of mitochondrial redox status, and lipoic acid, a thiol antioxidant, were also significantly increased in the hippocampus of KD-fed rats compared with controls. As GSH is a major mitochondrial antioxidant that protects mitochondrial DNA (mtDNA) against oxidative damage, we measured mitochondrial H2O2 production and H2O2-induced mtDNA damage. Isolated hippocampal mitochondria from KD-fed rats showed functional consequences consistent with the improvement of mitochondrial redox status i.e. decreased H2O2 production and mtDNA damage. Together, the results demonstrate that the KD up-regulates GSH biosynthesis, enhances mitochondrial antioxidant status, and protects mtDNA from oxidant-induced damage.”
[15]
Evidence type: non-human animal experiment
Milder JB, Liang LP, Patel M.
Neurobiol Dis. 2010 Oct;40(1):238-44. doi: 10.1016/j.nbd.2010.05.030. Epub 2010 May 31.

“Abstract

“The mechanisms underlying the efficacy of the ketogenic diet (KD) remain unknown. Recently, we showed that the KD increased glutathione (GSH) biosynthesis. Since the NF E2-related factor 2 (Nrf2) transcription factor is a primary responder to cellular stress and can upregulate GSH biosynthesis, we asked whether the KD activates the Nrf2 pathway. Here we report that rats consuming a KD show acute production of H(2)O(2) from hippocampal mitochondria, which decreases below control levels by 3 weeks, suggestive of an adaptive response. 4-Hydroxy-2-nonenal (4-HNE), an electrophilic lipid peroxidation end product known to activate the Nrf2 detoxification pathway, was also acutely increased by the KD. Nrf2 nuclear accumulation was evident in both the hippocampus and liver, and the Nrf2 target, NAD(P)H:quinone oxidoreductase (NQO1), exhibited increased activity in both the hippocampus and liver after 3 weeks. We also found chronic depletion of liver tissue GSH, while liver mitochondrial antioxidant capacity was preserved. These data suggest that the KD initially produces mild oxidative and electrophilic stress, which may systemically activate the Nrf2 pathway via redox signaling, leading to chronic cellular adaptation, induction of protective proteins, and improvement of the mitochondrial redox state.”
[16]
Evidence type: review
Meister A
J Biol Chem. 1994 Apr 1;269(13):9397-400.

“Ascorbate and GSH have actions in common and can spare each other under appropriate experimental conditions; this redundancy reflects the metabolic importance of such antioxidant activity.”

[Sorry, this paper is hard to quote. It’s free. Go look. :-)]
[17]
Evidence type: review
Bruce N. Ames, Richard Cathcart, Elizabeth Schwiers, and Paul Hochstein
Proc. Natl. Acad. Sci. USA Vol. 78, No. 11, pp. 6858-6862, November 1981
“During primate evolution, a major factor in lengthening life-span and decreasing age-specific cancer rates may have been improved protective mechanisms against oxygen radicals. We propose that one of these protective systems is plasma uric acid, the level of which increased markedly during primate evolution as a consequence of a series of mutations. Uric acid is a powerful antioxidant and is a scavenger of singlet oxygen and radicals. We show that, at physiological concentrations, urate reduces the oxo-heme oxidant formed by peroxide reaction with hemoglobin, protects erythrocyte ghosts against lipid peroxidation, and protects erythrocytes from peroxidative damage leading to lysis. Urate is about as effective an antioxidant as ascorbate in these experiments. Urate is much more easily oxidized than deoxynucleosides by singlet oxygen and is destroyed by hydroxyl radicals at a comparable rate. The plasma urate level in humans (about 300 μM) is considerably higher than the ascorbate level, making it one of the major antioxidants in humans. Previous work on urate reported in the literature supports our experiments and interpretations, although the findings were not discussed in a physiological context.”
[18]
Evidence type: review
Glantzounis GK, Tsimoyiannis EC, Kappas AM, Galaris DA.
Curr Pharm Des. 2005;11(32):4145-51.

“It has been proposed that UA may represent one of the most important low-molecular-mass antioxidants in the human biological fluids [23-26]. Ames et al. proposed in the early eighties that UA can have biological significance as an antioxidant and showed, by in vitro experiments, that it is a powerful scavenger of peroxyl radicals (RO2•), hydroxyl radicals (•OH) and singlet oxygen [23]. The authors speculated that UA may contribute to increased life-span in humans by providing protection against oxidative stress-provoked ageing and cancer. UA is an oxidizable substrate for haem protein/H2O2 systems and is able to protect against oxidative damage by acting as an electron donor [27]. Apart from its action as radical scavenger, UA can also chelate metal ions, like iron and copper, converting them to poorly reactive forms unable to catalyse free-radical reactions [28-30].”
[…]

“A randomized placebo-controlled double-blind study has evaluated the effects of systemic administration of 100 mg UA, in healthy volunteers compared with vitamin C 1000 mg [45]. A significant increase in serum free radical scavenging capacity from baseline was observed during UA and vitamin C infusion – but the effect of UA was substantially greater. No adverse reactions to UA administration were reported. Another clinical study indicated a significant inverse relationship between serum UA concentrations and oxidative stress during acute aerobic exercise [46], while an increase in muscle allantoin levels was detected [32]. The authors concluded that ROS are formed in human skeletal muscle during intense sub-maximal exercise and urate is used as a local antioxidant. Another clinical trial involving healthy young men showed that 50 and 80 km marches led to 25 % and 37 % rises, respectively, in plasma levels of UA, probably due to increases in the metabolic rate and consequently pyrimidine nucleotide metabolism [47]. A randomized double-blind placebo controlled crossover study evaluated the free radical properties of UA in healthy volunteers [48]. UA (0.5 g in 250 ml of 0.1 % lithium carbonate / 4 % dextrose vehicle or vehicle alone as control) was given to subjects who then performed high intensity aerobic exercise for 20 min to induce oxidative stress. A single high-intensity exercise caused oxidative stress (as reflected by increased plasma F2-isoprostanes) immediately after exercise and recovery. Administration of UA increased circulating UA concentrations, which increased serum free radical scavenging capacity and reduced the exercise-induced increases in plasma F2-isoprostanes. The authors concluded that the antioxidant properties of UA are of physiological consequence and support the view that UA has potentially important free radical scavenging effects in vivo.”
[19]
Evidence type: observation
“Urate has been shown to be a major antioxidant in human serum and was postulated to have a biological role in protecting tissues against the toxic effects of oxygen radicals and in determining the longevity of primates. This possibility has been tested by determining if the maximum lifespan potentials of 22 primate and 17 non-primate mammalian species are positively correlated with the concentration of urate in serum and brain per specific metabolic rate. This analysis is based on the concept that the degree of protection a tissue has against oxygen radicals is proportional to antioxidant concentration per rate of oxygen metabolism of that tissue. Ascorbate, another potentially important antioxidant in determining longevity of mammalian species, was also investigated using this method. The results show a highly significant positive correlation of maximum lifespan potential with the concentration of urate in serum and brain per specific metabolic rate. No significant correlation was found for ascorbate. These results support the hypothesis that urate is biologically active as an antioxidant and is involved in determining the longevity of primate species, particularly for humans and the great apes. Ascorbate appears to have played little or no role as a longevity determinant in mammalian species.”
[20]
Evidence type: review
Charles V. Mobbs, Jason Mastaitis, Minhua Zhang, Fumiko Isoda, Hui Cheng, and Kelvin Yen
Interdiscip Top Gerontol. 2007; 35: 39–68.

(emphasis mine)
“Glucose Oxidation Favors Complex I, Lipid/Amino Acid Oxidation Favors Complex II

“The significance of the shift in source of carbon atoms for oxidation produced by dietary restriction may be that the oxidation of lipids and amino acids depends much more on mitochondrial complex II than on (free-radical generating) complex I, whereas glucose oxidation depends much more on complex I than on complex II. When glucose is broken down by glycolysis, the only reducing equivalents it makes are in the form of NADH. When the final carbon product of glucose, pyruvate, is metabolized in the Krebs cycle, almost all the reducing equivalents are produced in the form of NADH, except for one step at complex II (succinate dehydrogenase) that makes (then oxidizes) FADH2. Ultimately the metabolism of one molecule of glucose produces an NADH: FADH2 ratio of 5:1 [53, p. 20]. In contrast, when lipids are broken down by β-oxidation (fatty acid counterpart to glycolysis), an equal number of NADH and FADH2 molecules are formed. When the lipid-derived carbons are metabolized in the Krebs cycle, reducing equivalents are produced in the ratio of 3 NADH molecules per FADH2 molecule. Therefore ultimately lipid metabolism yields an NADH:FADH2 ratio of about 2:1 [53, p. 38] or even more if the fatty acid contains enough carbon atoms. For example, when one molecule of palmitate is oxidized, it produces 15 molecules of FADH2 and 31 molecules of NADH, which are ultimately oxidized to produce a net total of 129 ATP molecules. In contrast, production of the same number of ATP molecules from glucose would entail producing then oxidizing 8.66 FADH2 and 43.3 NADH molecules. Amino acid oxidation also proceeds by a similar 2-step mechanism yielding an NADH:FADH2 ratio between that of lipids and that of glucose, the precise number depending on the specific amino acid. The significance of this shift in the NADH:FADH2 ratio is that NADH is oxidized only at mitochondrial complex I, whereas FADH2 is oxidized only at complex II [53, p. 17]. Thus palmitate oxidation entails utilizing complex II at roughly twice the (FADH2-dependent) rate as glucose oxidation entails. Therefore shifting away from glucose utilization toward lipid and amino acid utilization would be expected to substantially reduce the production of reactive oxygen species, without necessarily reducing ATP production. As described below, other beneficial effects also occur as a result of this altered pattern of glucose fuel use, including a shift toward producing antioxidizing NADPH and increased protein and lipid turnover, which reduces the accumulation of oxidized protein and lipids.
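The reducing-equivalent arithmetic in that passage is easy to check. Here is a minimal sketch of mine (not from the paper) tallying NADH and FADH2 for the complete oxidation of glucose and of palmitate, using standard textbook stoichiometry:

```python
# Reducing equivalents from complete oxidation, standard textbook stoichiometry.
# Glucose: glycolysis + pyruvate dehydrogenase + 2 turns of the Krebs cycle.
# Palmitate (C16): 7 beta-oxidation cycles + 8 acetyl-CoA through the Krebs cycle.

def glucose():
    nadh = 2 + 2 + 2 * 3   # glycolysis, PDH, 3 NADH per Krebs turn
    fadh2 = 2 * 1          # 1 FADH2 per Krebs turn (succinate dehydrogenase)
    return nadh, fadh2

def palmitate():
    cycles, acetyl_coa = 7, 8
    nadh = cycles * 1 + acetyl_coa * 3   # 1 NADH per beta-oxidation cycle
    fadh2 = cycles * 1 + acetyl_coa * 1  # 1 FADH2 per cycle and per Krebs turn
    return nadh, fadh2

for name, (n, f) in (("glucose", glucose()), ("palmitate", palmitate())):
    print(f"{name}: {n} NADH, {f} FADH2, NADH:FADH2 = {n / f:.1f}:1")

# glucose:   10 NADH,  2 FADH2, NADH:FADH2 = 5.0:1  (complex I-heavy)
# palmitate: 31 NADH, 15 FADH2, NADH:FADH2 = 2.1:1  (relatively complex II-heavy)
```

This reproduces the paper’s 5:1 ratio for glucose and its 31 NADH and 15 FADH2 for palmitate, i.e. roughly 2:1.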
[21]
Evidence type: experiment
Greco T, Glenn TC, Hovda DA, Prins ML
J Cereb Blood Flow Metab. 2016 Sep;36(9):1603-13

“Mechanisms of ketogenic improvement

“As mentioned previously, it is thought that much of the KD’s improvement in cellular metabolism and neuroprotection is through its ability to act as an alternative substrate. Here, we show rather that it first acts in an antioxidant manner to reverse mitochondrial dysfunction. Both Complex I and II–III are inhibited in CCI-STD mice at 6 h post-injury. Increased production of lactate is a reflection of impairment of oxidative phosphorylation as well as an attempt to maintain ATP concentrations and cellular membrane potential through increased glycolytic output.13 While Complex I activity returns to sham levels by 24 h, Complex II–III activity remains inhibited. ONOO– has been shown to not only inhibit Complex II–III, but also Complex V40 and suggests that the observed decline in ATP production in PND35 animals13 is due in part to impaired Complex III and/or V activity. In addition to inhibition of mitochondrial complexes, decomposition products of ONOO– increase the amount of lipid peroxidation leading to thiol linkages and pore formation in the inner membrane ultimately uncoupling the mitochondria. Although Complex I activity is inhibited in CCI-KD animals, Complex II–III activity is not. This will continue to allow electron flow through the respiratory chain and production of ATP. KD not only has antioxidant properties, but may provide substrates beyond Acetyl-CoA. The reaction of AcAc with Succinyl-CoA produces succinate, and animals either fed a KD or infused with βOHB show a significant increase in succinate concentrations.41,42 Other groups have also shown that KD increases Complex II activity (succinate dehydrogenase activity).43 By increasing Complex II activity and its substrate, KD is able to maintain mitochondrial membrane potential and ATP production and prevent bioenergetic failure. At 24 h post-injury, KD is likely to exert its effects through three mechanisms: (1) continued direct and indirect ROS/RNS scavenging, (2) increased Complex II activity and (3) increased acetyl-CoA and succinate.”

[22]
Evidence type: experiment

I cannot access the original experiment, but it is referred to here in the documents used by the RDA:

“Overall, while evidence suggests that vitamin C deficiency is linked to some aspects of periodontal disease, the relationship of vitamin C intake to periodontal health in the population at large is unclear. Beyond the amount needed to prevent scorbutic gingivitis (less than 10 mg/day) (Baker et al., 1971), the results from current studies are not sufficient to reliably estimate the vitamin C requirement for apparently healthy individuals based on oral health endpoints.”

Baker EM, Hodges RE, Hood J, Sauberlich HE, March SC, Canham JE. 1971. Metabolism of 14C- and 3H-labeled L-ascorbic acid in human scurvy. Am J Clin Nutr 24:444–454.
[23]
Evidence type: observation

Abstract

The current recommended dietary allowance (RDA) for vitamin C for adult nonsmoking men and women is 60 mg/d, which is based on a mean requirement of 46 mg/d to prevent the deficiency disease scurvy. However, recent scientific evidence indicates that an increased intake of vitamin C is associated with a reduced risk of chronic diseases such as cancer, cardiovascular disease, and cataract, probably through antioxidant mechanisms. It is likely that the amount of vitamin C required to prevent scurvy is not sufficient to optimally protect against these diseases. Because the RDA is defined as “the average daily dietary intake level that is sufficient to meet the nutrient requirement of nearly all healthy individuals in a group,” it is appropriate to reevaluate the RDA for vitamin C. Therefore, we reviewed the biochemical, clinical, and epidemiologic evidence to date for a role of vitamin C in chronic disease prevention. The totality of the reviewed data suggests that an intake of 90-100 mg vitamin C/d is required for optimum reduction of chronic disease risk in nonsmoking men and women. This amount is about twice the amount on which the current RDA for vitamin C is based, suggesting a new RDA of 120 mg vitamin C/d.

Optimal Weaning from an Evolutionary Perspective:

The evolution of our brains, meat eating, and a reliance on ketogenic metabolism

I recently had the privilege of presenting a talk, with the same title as this post, at the Ancestral Health Symposium.
I am posting the video here, with a transcript, some references, and related material.

Transcript

This is what I said for each slide, with comments / clarifications in square brackets.
Times are approximate.

Optimal Weaning from an Evolutionary Perspective

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s01.png
0:00
My talk is called Optimal Weaning from an Evolutionary Perspective and I’d like to break down that title a little bit.
‘Optimal’ implies best for something, and here that something is going to be brain development.
The word ‘weaning’ can also benefit from clarification,
because we often use it to mean the end of breastfeeding,
but I use the convention meaning the beginning of the end,
with the introduction of first foods.
For ‘evolutionary perspective’, I just want to point out that what we know about our past can inform our understanding of physiology,
but our physiology can also constrain the possibilities of the past.

Overview

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s02.png
0:36
I’ve concluded that weaning infants onto an animal based diet best meets their nutritional needs,
and the rest of this talk will be about why.
Primarily I’ll be talking about the unique properties that resulted from the evolution of our brains.
I’ll also give a bit of evidence from modern health studies and trials,
and then finally I’ll give a little bit of the how,
based on my own experience in weaning one of my children onto animal based foods.

Human brains are unique

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s025.png
1:08
Human brains are unique in many ways, but one of the most striking things is their sheer size, especially relative to our bodies.
In particular, when you take into account that we are primates, it’s really quite extraordinary.
Primates already have brains that are about three times as large as most other mammals, at least relative to their size [1],
and humans in turn have brains about two and a half to three times as large as those of other primates.
And we didn’t always have that large a brain,
that three times expansion occurred
over the course of a few million years.
And a second related way that our brains are unique, is that our individual human brains do most of their growth after birth [2].

Altricial vs. Precocial

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s03.png
1:53
It’s helpful to think about this in the context of the distinction between altricial and precocial animals,
which is based on their degree of development at birth.
Altricial animals are underdeveloped.
They tend to have a short gestation, compared to precocial animals, who have a long gestation.
They’re poorly developed, so they may be missing hair.
They usually have underdeveloped sense organs, for example unopen eyes.
They’re usually born in litters, as opposed to singletons,
and they have less adult-like proportions,
whereas precocial animals are essentially adult-like in their proportions.
They have underdeveloped limbs,
which means that they can’t do what precocial animals do,
which is move like the adults that they’re born from,
and they tend to be smaller at birth,
and their parents are younger when they reproduce.

Humans appear altricial but are precocial

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s04.png
2:41
Humans don’t really fit into this paradigm very well
at first glance,
because they appear to be altricial,
but they’re actually better understood as being precocial.
Primates in general are highly precocial,
and humans, where they fit that pattern,
fit it to an extreme:
for example, we have enormous newborns, and we reproduce relatively late.
Our babies appear altricial, though, because they’re born helpless,
they don’t have adult proportions at all,
and they can’t walk or have the motor skills that you would expect them to have.
But it’s helpful to think of them as actually precocial, but born early.
And one reason to think that is because of fetal brain growth rates.
Our brains keep growing at the same rate as fetal brains do, persisting for up to a year [ should say at least ]
after birth,
and if you then look at our babies when they are a year old
they look a lot more like you would expect them to look if they were born precocial:
they have motor skills that you would expect them to have,
and teeth, for example.

Human brains continue to grow postnatally

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s05.png
3:52
Here’s [sic] a couple of graphs from the Smithsonian.
There’s one for chimpanzee brain growth and one for human brain growth.
As you can see with the chimpanzee brain growth,
they complete about half of their brain growth in gestation, and the rest over the course of a couple of years.
Note that chimpanzees, like many primates, wean quite late compared to us [3];
they wean at about four years of age,
which is well after all their brain growth is completed.
Humans, on the other hand, have a very steep rate of growth before birth,
and it continues into the second year [4], [5],
and then the rate slows down some,
although it’s still pretty significant,
and then it’s followed by what looks on this graph like a levelling off,
but this graph does end at age 10
and we know that there are growth spurts after that, too.

Rapid brain growth sustained beyond weaning

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s06.png
4:40
What I want to draw attention to with that is that
the fetal-like brain growth doesn’t just extend beyond birth,
but it also extends beyond the end of weaning.
[This is a mistake. I meant to say that the rapid brain growth continues past the end of weaning,
but it is “fetal-like” only past the
beginning of weaning.]
So, we have this fetal-like growth in the first year,
continued rapid growth to 5 years [ Or is it 4? ] [6],
continued slower growth through childhood,
and then, if you combine that with the fact that we wean early [3],
we realise we need to support that kind of rate of growth even beyond weaning.
[ I’m also wondering whether weight is the best measure. Volume, density, cholesterol levels are all other measures to consider, but I won’t get into that here and now. ]
Our brains are really vulnerable, and they have many critical periods,
each of which builds on the one before,
so if you haven’t completed one of your stages of brain growth,
you may not be able to complete the next stage successfully,
and that means that you need continuous support through a long period of time [7].

Brain growth requirements

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s07.png
5:28
What kind of support do we need to give growing brains?
Well, there are at least three ways that we need to support a growing
brain.
One is that they need specific micronutrients.
Even adult brains can suffer if they don’t get enough of certain micronutrients,
and developing brains that miss these nutrients
at critical times sometimes can’t even recover from the detriment.
Secondly, brains require an enormous amount of energy.
At least 20% of the energy that we consume as adults goes to our brain,
and that’s even more extreme in a newborn, who has about three quarters of the energy that they consume go[ing] right to
the brain [8], [9].
And then thirdly, of course we need material for the structural
components, and brains are made mostly of cholesterol and fat.
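[ As a rough sanity check on those energy fractions, here is a back-of-the-envelope sketch of mine, not from the talk; the daily intake figures are assumed round numbers. ]

```python
# Back-of-the-envelope brain energy budget. Only the 20% and ~75% fractions
# come from the talk; the intake figures below are assumed round numbers.
adult_intake_kcal = 2000    # assumed typical adult daily intake
newborn_intake_kcal = 500   # assumed typical newborn daily intake

adult_brain_kcal = 0.20 * adult_intake_kcal      # at least 20% to the brain
newborn_brain_kcal = 0.75 * newborn_intake_kcal  # about three quarters

print(f"adult brain:   ~{adult_brain_kcal:.0f} kcal/day")    # ~400 kcal/day
print(f"newborn brain: ~{newborn_brain_kcal:.0f} kcal/day")  # ~375 kcal/day
```

[ On these assumptions, a newborn’s brain runs on roughly the same absolute energy budget as an adult’s, from about a quarter of the food. ]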

Brain evolution requirements

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s08.png
6:24
In parallel to that,
the evolution of the brain had similar requirements.
We needed those micronutrients, the energy, and the structural components.
We needed them to be available over a period of years for each individual, and then that needed to
be sustained more or less continuously for millions of years for us to be able
to make that three times expansion.

Brain requirements: co-adaptations

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s09.png
6:51
While our brains were expanding over this long
evolutionary period,
there were co-adaptations that allowed them to expand,
particularly contributing to the extraordinarily high energy requirements.
I’d like to talk about these co-adaptations in more detail:
a high quality diet (by which I mean high in animal foods),
shrinking intestines, a reliance on the ketogenic metabolism,
and increased body fat particularly in babies.

Co-adaptation: eating meat

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s10.png
7:20
First of all, meat eating.
The plants that were available to us at the time that we were expanding
our brains were simply too fibrous, too low in protein, too seasonal, and too low
in calories to provide the needed energy.
So significant fatty meat eating was necessary for the protein and the energy as well as the micro-nutrients for
developing our brains to our current form.
[ See the post, Meat is best for growing brains for more detail about the implausibility of plants as a sufficient food source.]

Brain requirements: micronutrients

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s11.png
7:39
I’m going to just zoom in on a few
of those particularly critical micronutrients.
We have the minerals
iodine [10], iron [11], and zinc [12];
the fatty acid DHA [13], which is in the phospholipids of all your brain cells
and is particularly important in vision (retinal cells) and at the synapses;
and vitamins A and D.
If you don’t get enough
of these vitamins, minerals, and fatty acids as your brain
is developing, you can suffer developmental delay and disability,
a tendency to emotional fragility, and susceptibility to
psychiatric disorders, and the damage is often not recoverable.

Micronutrient sources

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s12.png
8:27
For these micronutrients, animal foods are either the only, the best, or the most bioavailable source.
DHA is almost exclusively found in animals.
It’s true that there is some in microalgae,
but it’s not very plausible that that’s where we were getting it while we were evolving.
Vitamin D is only available in animal sources.
It’s true you can get it from sunshine, but again, if you take into account the
seasonality and the various geological periods we went through,
we would need more.
Iron is available in plants, but it’s three times more bioavailable in animal sources [14].
Similarly with vitamin A, which is 12 to 24 times more bioavailable in animal sources [15], [16].
If you think about the sheer amount of plant food that you would have
to eat to try to make up for that,
it’s just not plausible at all (see the rough sketch after this slide’s notes).
For zinc, animal sources are simply the best.
And then I’d also like to note that some plants actually interfere with the absorption of those minerals,
so trying to get them from plants might not merely fail to help;
it could actually be a detriment.
[ I refer the interested reader to the blog of Dr. Georgia Ede, and in particular, her post on vegetables ]
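[ As a rough illustration of what those bioavailability factors mean in practice, here is a back-of-the-envelope sketch of mine, not from the talk. It uses the Institute of Medicine convention that 12 µg of dietary β-carotene counts as 1 µg of retinol; the two food-composition values are assumed round figures. ]

```python
# What a 12:1 vitamin A bioavailability factor means in food quantities.
# RAE convention: 12 ug of dietary beta-carotene ~ 1 ug of retinol.
# Both food-composition values below are assumed round figures.

UG_CAROTENE_PER_UG_RETINOL = 12

liver_retinol_ug_per_100g = 9000     # assumed: preformed retinol in beef liver
carrot_carotene_ug_per_100g = 8300   # assumed: beta-carotene in raw carrot

carrot_rae_per_100g = carrot_carotene_ug_per_100g / UG_CAROTENE_PER_UG_RETINOL
grams_of_carrot = 100 * liver_retinol_ug_per_100g / carrot_rae_per_100g

print(f"carrot: ~{carrot_rae_per_100g:.0f} ug retinol equivalent per 100 g")
print(f"~{grams_of_carrot / 1000:.1f} kg of carrot to match 100 g of liver")
```

[ On these assumptions, it takes on the order of 1.3 kg of carrots to match the vitamin A in 100 g of liver, before even considering conversion limits or the absorption inhibitors mentioned above. ]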

Co-adaptation: shrinking intestines

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s13.png
9:40
A second co-adaptation is shrinking intestines.
In 1995, Aiello and Wheeler came up with a hypothesis to explain how
the brains we were growing, which require so much energy,
could have gotten that energy without giving up something else,
and they noticed that we did give up something else:
we gave up a drastic amount of the size of our intestines.
Intestines are also really energy-intensive,
so that smaller size freed up energy for the brain.
But there’s also a feedback loop,
because having less intestine reduced our ability to consume fibre.
A lot of other primates get a lot of their energy by consuming fibre and
putting it through the factory of bacteria that turns that fibre into fat.
We no longer have much of that ability at all and
so that also increased our need to get our fat directly from an animal based diet.
[ See the post, Meat is best for growing brains for more detail about the the Expensive Tissue Hypothesis, and shrinking intestines. ]

Brain requirements: structural components

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s14.png
10:30
Going back to brain requirements,
I wanted to re-emphasize the structural components.
I’d said that brains are mostly fat and cholesterol:
by dry weight the brain is about 60% lipids [17],
about 40% of which is cholesterol [18].
But there’s a problem, because fatty acids don’t cross
the blood-brain barrier very easily [19],
[ That is, DHA and AA enter the brain easily, but not the long chain fatty acids that white matter, gray matter, and myelin are mainly composed of. ] [20], [21]
and cholesterol almost not at all [22].
So all of that fat and cholesterol is synthesised in the brain, and it’s
synthesised, we know, from ketone bodies [ See next two slides ].
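[ Multiplying those two figures out, purely as a restatement of [17] and [18]: ]

\[
0.60\ \text{(lipid share of dry weight)} \times 0.40\ \text{(cholesterol share of lipid)} = 0.24,
\]

so roughly a quarter of the brain’s dry weight is cholesterol, essentially all of it built on site behind the barrier.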

Ketone body fates

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/ketone-body-fates.png
[ This slide I inadvertently omitted! It shows the biochemical pathways of ketone bodies being made in the liver,
and what is relevant for this talk, being transformed into fuel, as is familiar to many,
but also into fat and cholesterol, which may be new to many in the audience. ]

Co-adaptation: reliance on ketogenic metabolism

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s14.2.png
11:00
That brings me to the third co-adaptation,
which is using fat for energy and for substrates in the brain
with ketone bodies.
Ketone bodies are directly usable by the brain for energy,
unlike fatty acids.
They are used to create most of the fat and all the cholesterol.
[ Correction! Most of the fat and all of the cholesterol is synthesised in the brain,
and preferentially from ketone bodies, but some is also made from glucose (which, of course, can be made on demand from protein). ] [23], [24], [25]
And most importantly, they can easily and abundantly cross the blood-brain barrier.
There are other benefits to being in a ketogenic metabolism.
For example, it increases the density of mitochondria in brain cells, which allows more energy to flow,
and it also decreases the vulnerability of the growing brain to stress and trauma.
You may be aware of the extreme neuroprotective properties
of the ketogenic diet.
For example, it drastically mitigates the damage that you would incur if you had a traumatic brain injury or stroke, so that’s
obviously adaptive.
[ See the post The medical-grade diet, for more on neuroprotective properties of ketogenic diets. ]

Co-adaptation: reliance on ketogenic metabolism

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s15.png
11:52
The fact that we use ketone bodies for brain energy and material,
which we and some other species also do in gestation,
explains why newborns are in mild ketosis all the time [26].
Infants use ketones three to four times more efficiently than adults
[ Correction! four to five times. (Three to four is in newborn rats.) ] [27],
so mild ketonemia for a baby is more like a deeper ketosis for an adult.
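[ To make that equivalence concrete, here is a rough sketch. It uses only the four-to-five-fold figure above, takes 0.5 mmol/L as an illustrative value for mild infant ketonemia rather than a measurement from the talk, and assumes ketone uptake scales roughly linearly with blood concentration. ]

\[
0.5\ \text{mmol/L} \times (4\ \text{to}\ 5) \approx 2\ \text{to}\ 2.5\ \text{mmol/L, adult-equivalent,}
\]

a depth of ketosis adults typically reach only after days of fasting or strict ketogenic eating.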
Even children as old as 12 and probably older
can become ketogenic much more quickly and easily than you might expect.
We’re talking about a matter of hours of fasting to develop the kind of ketosis
that would take adults several days [28].
But even human adults become ketogenic more easily than other species and they do it without calorie restriction.
This is really significant.
I know of no other species that sustains ketosis without either starvation or semi-starvation,
and this has implications for animal models of ketogenic diet therapies,
because there may be cases where an animal requires caloric restriction for the therapy to be
effective, whereas in humans it probably isn’t required,
and would be a detriment to compliance and to health outcomes.
So I wanted to emphasize that humans have co-opted this trait
that was previously an adaptation to cope with periods of starvation, and it
still is in other species, but we have co-opted it into a default metabolism at
least for the period of childhood, to support brain growth in particular,
but also to meet the brain’s ongoing energy requirements.

Co-adaptation: increased body fat

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s16.png
13:30
Finally, the last co-adaptation I want to talk about is increased body fat,
because it goes along with all the others.
It’s striking, again, when you compare humans with other primates,
how fat humans are.
Even adults are fat compared to other primates.
Other primates and most terrestrial animals
actually have less than 5% body fat,
and humans have easily somewhere between 15 and 20%, even very lean ones.
Human babies take that to the extreme.
They start out at about 15%.
That doubles in a couple of months, and it continues to increase over the first
year.
Baby fat is different in character from the kind of fat you’d see in obese
adults.
It’s subcutaneous, not visceral [29], and it’s very low in polyunsaturated fatty acids,
even if the mother is eating a lot of polyunsaturated fatty acids,
whereas obese adults tend to have a level and quality of fatty acids
roughly corresponding to what they’re eating.
So there’s obviously a lot of filtering going on.
And the polyunsaturated fatty acids that are there are almost all DHA and arachidonic acid,
which is another important brain fat,
so it seems that this extreme body fat in babies is there to provide a
continuous supply of fat that can be used by the brain both for energy and
for materials via the ketogenic metabolism that we are relying on [30].

Summary of Evolutionary Evidence for Meat

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/summary.png
[ This is the other slide I missed.]
14:56
I seem to be missing a slide.
I just wanted to quickly summarize what I’ve said about the evolution of the brain.
The first is that we needed to evolve — we needed to eat meat to allow us to evolve
the brains that we did.
That’s for energy and for micronutrients.
And I also wanted to emphasize the ketogenic metabolism part,
because not only is it the natural, normal default state for children, but that shows it’s not detrimental;
it’s actually a benefit.
It’s actually critical.
It’s actually part of the mechanism of how we build our brains.
And so I’m bringing that up because someone who’s thinking
about weaning their baby onto animal-based foods might worry:
Wouldn’t this make them ketogenic and could that be a problem?
And I just want to emphasize that not only is it not a problem,
it’s the way it’s supposed to be,
and you could hardly stop it if you wanted to, because even when they sleep they’re
going to go into ketosis.

Weaning onto meat: clinical trials

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s17.png
16:07
OK, so onto clinical trials.
I know of two clinical trials that compared eating — weaning an infant onto the fortified cereals
that we mostly recommend now, versus weaning them onto exclusively meat.
The first one took some measurements comparing them, and the meat-weaned
children had higher zinc status,
which we know is very important.
They had adequate iron, without the benefit of the supplementation that the cereal arm had.
They had increased head growth, which in children is a good index of brain growth,
and is also correlated with higher intelligence. And that’s not
even taking into account the size of the head at birth: it’s
not just the size of their head,
it’s the amount of growth that happened between birth
and the later time that’s correlated with the higher intelligence.
And the second study showed better general growth without increased adiposity.
That was what the researchers were worried about: that if you wean babies onto
meat, they would get fat in a way that would increase the risk for modern
diseases. That, of course, didn’t happen.

Slide with refs

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s19.png
17:30
And I just have those references
there for your reference.
This kind of study, I think, is what has led certain
agencies, like the Canadian government and the La Leche League, to include meat as a
recommended first food.

How? It’s easy.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s20.png
17:41
Finally, I’m going to talk a little bit about how to
do that, based on my experience with my third child.
I was very influenced by Baby-Led Weaning.
The core understanding from them is that
you don’t — you can basically give a baby
the same food that an adult eats.
The risk of choking has been greatly exaggerated.
You don’t need to buy into this whole, you know, factory-made baby food stuff.
You can give them what you eat for the most part.
So what I have done, for example:
I was in the habit of making bone broths that had some meat in the
broth and I started by giving him broth on a spoon and increasingly
over time added some fragments of meat.
I also gave him bones to teethe on from my
steaks and chops, and again I increasingly left meat and fat on them,
which he enjoyed a lot.
I fed him a lot of egg yolks and beef and chicken liver,
which have a nice soft, silky texture.
They’re extremely nutrient dense, and to
this day (this child is almost seven) liver is one of his favourite foods, which
pleases me to no end.
I’m really grateful to Aaron for being the first to bring up the word pre-masticate at this conference yesterday,
so I didn’t have to be,
and I also know from being in the audience that several people besides me prechewed their
food for their babies. It’s certainly plausible;
I would expect that a lot of people in the past did that, and I
did that.
I also often made plain unseasoned beef jerky, which is really
good for teething — it sort of reminded me of a dog with rawhide: he would gum down on
it and pull, and then he’d suck on it for a long time, and it would basically just
disintegrate.
Also still one of his favorite foods.

(Photo slide)

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/s21.png
19:35
And I’ll just leave you
with a couple of photos of that baby who is almost 7.
On the left here we have him at six months with a lamb bone that he was teething on.
At the bottom: when he was two I discovered that he had liberated a stick of butter from the
fridge, because that’s so delicious,
and by two-and-a-half he was scrambling his
own eggs.
This child basically had almost no plant matter in his diet for
the first two years of his life and even now his diet is primarily animal-based.
Please give me your questions. Thank you.

Q&A

I’ve included the names of the questioners that I knew.
If you are one I left out, introduce yourself!
Again, my clarifications or further comments in brackets.

Question 1 (Christopher Kelly)

C: I fully(?) subscribe to your ideas presented here and I have a very healthy two and a half year old daughter
that’s eaten much the same way.
But I think there’s an important point missing from your talk, which is:
the ketones come from medium chain triglycerides
that come from mom’s milk, from eating carbohydrates.
So the carbohydrates are synthesised into MCTs in the breast tissue.
Those MCTs are put in the breast milk,
and that’s a really important ketogenic substrate, so I don’t think that mom needs to be in ketosis.
You see what I’m saying? The ketones should be synthesised by the baby.
A: I understand what you’re saying, yes.
So I just want to give a couple of counter-examples.
I didn’t eat any carbohydrate while I was making breast milk,
and although there are medium chain
triglycerides in the breast milk,
that’s certainly not the only reason that
babies are in ketosis.
Even the post-weaning babies that I mentioned earlier get
into ketosis very rapidly, because it’s just the natural state.
You can make it — what I’m positing here,
and it’s not my idea, what
I’m saying here is that the baby fat is being turned into
ketones just from the fat that’s stored on the body,
just the same way that I make them.
C: Right
A: And I forgot to mention that,
I talked about how fat babies are and how
fat adults are compared to other primates,
and I think it’s quite significant.
It would be unusual to see an animal that’s that fat if you
thought that we weren’t naturally ketogenic animals.
C: Yeah and I’ve actually measured ketones, blood ketones, in my
daughter when she was still an infant, breastfed only, and it was 1.6 mmol/L.
But I can actually send you the studies that show that the MCTs in milk,
they go down, and there are studies where they’ve looked at giving MCTs to mom
and they don’t go anywhere. Mom metabolizes them all.
None of them appear in the breast milk. So I think, like, the carbohydrates for mom —
it’s not my opinion, I can send you the studies — that might be
important.
A: I’d like to see that. I guess what I’m trying to say is that the ketones that
are in the baby’s blood don’t only come from medium chain triglycerides…
C: Right, right, I understand that.
Okay, yeah. I work with a doctor who’s just finishing his PhD in neonatal
neuroprotection, so he’s done quite a lot of research in this area, and I’ll send you
some studies.
A: That’s fantastic. I would love that.
C: Okay, thank you.

Question 2

22:53
Q: Hey, that was a great talk. Thank you. Can you say something about the
timeframe? You know, after three children and all of your research
and interest in this, your thoughts on the timeframe for beginning
the weaning process, and also how large that window of transition looks?
A: Sure. There’s a lot I don’t know, but I know that the recommendation currently
is to start weaning at around four to six months, and I think that the reason
for that is that the amount of breast milk that children get, the caloric input,
just can’t keep up with what they need by the time they’re that
large. And so I would say to start giving your baby food as soon as they start to
express interest in it.
Just, you know, let them be the ones who say “I’m ready to
start eating. Give me that.”
And then how long it goes: humans tend to wean a lot
younger than other primates, and I don’t know to what degree that’s enculturated
and to what degree that’s natural.
In my experience: my first
child stopped breastfeeding at about two years, and then each one after
that was earlier and earlier; the last one stopped at nine months.
So, I’m sorry I don’t know more about that.
Q: No, no, it’s okay. I just kind of wanted to
see what your thoughts are.
I suppose, aside from the nutritional implications
of how soon or early or late you move away from breastfeeding,
I’m sure there are other implications as well, but it’s just hard to understand.
I just have a newborn, so I was just interested.
A: Congratulations!
Q: Yeah. Thank you so much!

Question 3 (Georgia Ede)

24:49
G: Amber, thank you for an exceptionally good talk.
I just had a curiosity question as a psychiatrist.
You having raised three
children on this unique diet, which I wish were more common,
[ Clarification: Unfortunately only my third child was weaning onto meat, though our household was always generally a low carb one. ]
can you comment at all
about how your children fared emotionally and physically compared to
their peers? As a mother I would be very curious to hear.
A: Well, there’s so much
individuality, I don’t want to claim too much.
I know that my youngest
child does have a very even temperament, especially compared to one of his
brothers, but then on the other hand his oldest brother has perhaps the most even
temperament of all, so I don’t know what to conclude about that.
One interesting thing that has been commented on to me many, many times is
that my youngest child never missed a single day of daycare.
He started at two, so that’s an entire three-year period.
Many of his peers,
all of his peers, missed significant time to various illnesses, and he missed
not a single day, so I like to attribute that to his diet.
G: Thank you very much. It was fascinating.
A: Thank you.

Question 4 (Ben Sima)

26:02
B: Has it been difficult to maintain
his high-fat diet from a social perspective? For example, when they go over to
someone else’s house and there’s candy or something with other parents?
A: It is a challenge, and the older they get, the more of a challenge it is.
My other
children, at the time that I was weaning him, were also
transitioning to a more meat-based diet, and yes, it’s —
I mean, for example,
the number of special occasions that you have when you’re at school seems to be
almost as numerous as the number of days.
Like, it’s always somebody’s birthday or
some occasion, and that’s always being celebrated with some kind of gluteny,
sugary snack, and yeah, it’s a struggle.
B: So do you find that he has a sweet tooth, or does he kind of shun that?
A: He loves sweet things when he gets his hands on them, but he doesn’t seem to be obsessed
with them.

Question 5

27:04
Q: I just wanted to offer the cross-cultural perspective that the worldwide
age of breastfeeding cessation is about four to five years.
[ But see footnote 3 below, which argues that the natural age is about 2.5 and for important, persuasive reasons. ]
It’s only in the United States that it’s young, around a year, but if you look cross-culturally it is
actually four years in most cultures.
A: Thank you for that.
So that’s for the very end of breastfeeding?
Q: Yeah. So that’s kind of our biological norm.
It’s more of a cultural thing here.
The other thing that’s interesting is around four to six months —
infants get a big bolus of iron from the placenta, especially if we allow for delayed cord
clamping —
and then around four to six months that initial iron starts to go
down, which is another reason why, like you’re saying,
meats are such a good first food.
That’s why four to six months seems to be a good time to
start foods:
not that breast milk is lacking in iron and zinc,
but that’s not where they’re supposed to get it from.
You get it placentally, and
then it starts to go down around four to six months, which is why,
traditionally, there was the idea of “ok let’s put iron in rice cereal” —
which we know is not a good idea. But yeah, that’s another reason why that four to six-month window seems
to be a good time for getting those iron- and zinc-rich foods in.

Question 6 (Nick Mailer)

28:20
Q6: Thanks for the talk, it was very good.
Something that hasn’t been discussed so much in this community
is that weaning and continued breastfeeding is not merely about
nutrition but it’s also about keeping the bond between the mother and the
child,
and that’s something that’s often overlooked.
I know people who have generally weaned but, you know,
when the child is a bit ill
or is feeling a little bit insecure, the child will revert for a little while to
getting a little bit of breast milk, or maybe once at night just to say goodnight.
It becomes part of a ritual and part of a bonding process rather than an
essential continuation of nutrition,
which is why, as long as you’re
comfortable with it, there’s no harm in extending weaning and finishing later.
Sometimes people feel the pressure — okay, it’s four to six months, I’d better stop
by six months or something will happen — and people do feel that pressure, which
is why I think in the US and the UK people kind of feel that it’s a race
to the final cessation of breastfeeding, and it doesn’t have to be, as far as I’ve heard.
A: Right. Excellent point. Thank you.

Question 7

29:22
Hi. Thanks for the talk.
So I have three children also.
My youngest is 15 months.
We followed a very similar, you know, baby-led weaning
process as you did with your youngest.
My question is, so my oldest is 11, too — quite a gap in
between them —
and you mentioned that vitamin D is one of the critical elements for brain
development, and prior to my son being born I had never seen my
pediatrician recommend vitamin D supplementation.
So I guess my question
is: what are your thoughts on supplementing, like, drops as a newborn?
And also, what are some of the better animal sources to get vitamin D
from, other than, I think, fatty fish?
A: Yes: liver, fatty fish… I’m surprised that you
weren’t recommended vitamin D drops, because I remember that from even 15
years ago, when my first son was born.
Q: Yeah, I don’t remember; it’s possible. Five years between each of them.
So, I think that’s weird, you know.
A: Right.
Q: Did you do those?
A: I did. I did do those with the first two children.
Actually I did it with all of them, come to think of it.
Yeah.
Q: Thanks.
A: I figured there’s — the amount you would
have to get to overdose is high enough that it wasn’t going to hurt.
Q: Yeah. We did it too.
I just wasn’t sure. I hadn’t heard it before him, and you mentioned it, so
thanks. Alright.
A: Thank you.

Question 8

30:53
Q: Hi. I missed the first part of your talk, which I’m bummed about, but
I have a four-year-old who regularly steals butter out of the fridge, and her first
foods were, I think, egg yolk, and
I don’t think I did liver right away, because I wasn’t doing
that much liver, but now she loves liver too.
It’s like her favorite food.
A: Isn’t it good!
Q: Yeah, I mean, I don’t particularly like it, but I eat it.
But she — she loves it.
I just wanted to add, too (maybe this will be covered in the next talk,
about breastmilk and the microbiome), that one of the things I found interesting about
breastfeeding, and the importance of it for the longer term, is that
the way that children remove milk from the breast actually
helps to form the jaw and the palate.
And so we see a lot today where women have to
go back to work, you know, six weeks, 12 weeks after giving birth, and so they’re
pumping a lot, and babies are getting bottles,
and that’s really changing the
way that our mouths are structured, just as our nutrients in
the womb shape palate formation.
I mean, anyone familiar with Weston Price’s work knows that, right?
But I just think it’s an interesting piece, too,
and I don’t think that there’s this rule that
once kids start food they have to stop breast milk.
In fact, those things go
together quite well for a long time, because of the emotional factors and
because of the palate formation
and the muscle strength and the jaw formation.
A: Right.
Q: So that, I think, is an interesting piece, too,
and yeah, I’ve seen the same sort of statistics, that hunter-gatherers usually
breastfed three to four years,
but they actually had lower body fat,
and so that would suppress ovulation for longer,
which is why their children were
spaced 4-5 years apart.
And there was no dairy.
People weren’t eating dairy, so the only dairy that was available was
breast milk, and the way that that dairy provides certain vitamins…
A: Lactose in particular is broken down into glucose and galactose,
and galactose is used to build some of the brain material as well.
Q: There’s a question, so I have another question I’ll ask you later.

Question 9 (Kevin Boyd)

32:52
Q: Okay, that was interesting.
Who are you? That’s… it’s interesting… what she brought up — that’s my whole
talk this afternoon.
Please come.
Nutrition is concerned with nutrients, but not the mechanical aspects of food
processing, and what she brought up was what I was going to talk about a little bit.
But how did you learn about baby-led weaning? That is,
for people who might not
know, could you explain a little bit about what it is, how you learned
about it, and how you executed it with your own children?
A: Well, I’m not sure where I first heard of it, but the thing that I said
was the core important idea is what I’ve mostly taken from it,
and that’s that babies don’t necessarily need you to mush everything up: you can,
you can give them a chicken drumstick and they’ll deal with it.
Q: Yeah. I’m going to really elaborate on so many of the wonderful points you made,
at one-thirty today.
A: Ok. Well, I won’t steal your thunder, then!
Q: It was a great talk.
A: Thank you.

Question 10

34:05
Q: Hi, I’m the token pre-mastication question.
So, you know, it goes:
you have your first baby and you sterilize everything before
it touches her mouth, and by the third baby you’re picking up a pacifier and
popping it in your own mouth before you pop it in theirs.
And there was some concern about that in terms of, I guess, oral hygiene, and what I
had heard was, you know, it’s not such a wonderful thing to introduce your mouth
germs to your baby, but if you’re pre-masticating their food, perhaps you
disagree with that.
A: Yes. Yes. I suppose if you had something unhealthy going on in your mouth, that would
be a problem, but I don’t really think that there’s anything unhygienic about
the mouth, if you’re healthy.
Q: OK.
35:00
OK, well, thank you.

Acknowledgements

I’ve never given a talk to an audience of this size and calibre before.
I particularly want to thank Sean Baker, Zooko Wilcox, and Jeff Pedelty for their support and encouragement in making it happen.
I’m grateful also to the patient organisers of AHS for welcoming me and helping me through the process, particularly Katherine Morrison, Grace Liu, and Ben Sima.

References

[1] Using EQ (encephalisation quotient), that is: a measure of relative brain size for mammals that takes into account some physical characteristics that affect the brain-body ratio.
[2] Evidence type: experimental

Martin, Robert D.
Fifty-second James Arthur lecture on the evolution of the human brain 1982

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/foetal-brain-growth-index.png

[3] Evidence type: review of data collection

Kennedy GE.
J Hum Evol. 2005 Feb;48(2):123-45. Epub 2005 Jan 18.

“Although humans have a longer period of infant dependency than other hominoids, human infants, in natural fertility societies, are weaned far earlier than any of the great apes: chimps and orangutans wean, on average, at about 5 and 7.7 years, respectively, while humans wean, on average, at about 2.5 years. Assuming that living great apes demonstrate the ancestral weaning pattern, modern humans display a derived pattern that requires explanation, particularly since earlier weaning may result in significant hazards for a child. Clearly, if selection had favored the survival of the child, humans would wean later like other hominoids; selection, then, favored some trait other than the child’s survival. It is argued here that our unique pattern of prolonged, early brain growth — the neurological basis for human intellectual ability — cannot be sustained much beyond one year by a human mother’s milk alone, and thus early weaning, when accompanied by supplementation with more nutritious adult foods, is vital to the ontogeny of our larger brain, despite the associated dangers.”

[On the data set:]
“Weaning is a process, not an event that can be placed at a specific point in time; therefore, it is not subject, in any meaningful way, to precise mathematical or statistical analyses or even to exact determination. Sellen’s (2001) recent paper has, perhaps, done as much as possible to overcome the inherent problems of determining human weaning time. An ‘‘average’’ age of weaning can only suggest the age at which most young in a particular group cease nursing; moreover, in humans, as the Amele demonstrate, it is not uncommon for a mother to continue to nurse an older youngster even though she has an infant as well. Data reported in Table 1 were taken from field studies, individual ethnographic reports, and from the Human Relations Area Files (HRAF: category 862, on-line edition); data points were included only when a definite age or clear range was expressed. All were pre-industrial, ‘‘natural fertility’’ populations practicing a range of subsistence economies from agriculture to foraging, and many were mixed economies.”

“Although a mean weaning age can be calculated from the human data in Table 1 (30.1 months; n = 46), it seems more accurate to conclude that the ‘‘natural’’ weaning age for humans is between 2-3 years and generally occurs about midway in that range. The minimum reported weaning age was one year (Fiji, Kogicol) and the maximum was about 4 years (several native American groups); several entries, however, reported that individual children may nurse as long as 6 years. Goodall (1986) also reported that a few Gombe chimps also nursed far longer than the population average. Sellen (2001), using a slightly larger sample (n = 113) also taken from the HRAF (microfiche edition), reported a very similar mean (29 months +/- 10 months), and a very similar peak weaning period between 2 and 3 years.”

“As noted below, stable nitrogen isotope analysis on bone tissue from several prehistoric societies suggests a somewhat wider range of ‘‘natural’’ weaning ages. For example, since nursing infants occupy a different (higher) trophic level than do their mothers, the isotopic composition of nursing infants’ bones and teeth should, in theory, differ from that of the adults in their group. Weaning time, therefore, should correspond to the point at which infant and adult tissues reach a similar isotopic composition (Herring et al., 1998). Following Fogel et al. (1989), several authors have found an elevated level of δ15N in infant osteological remains (relative to adults of the same group), which, they argued, constitutes a ‘‘nursing signal’’ (Katzenberg, 1992; Katzenberg et al., 1993, 1996; Schurr, 1994; Tuross and Fogel, 1994; White and Schwarcz, 1994). For example, at the Sully site in North Dakota and at the Angel site in the Ohio Valley, δ15N reached adult levels at about 24 months (Tuross and Fogel, 1994; Schurr, 1997), suggesting rather early weaning. In Nubia, on the other hand, there was a gradual decline up to about age 6, indicating a slow introduction of adult foods (White and Schwarcz, 1994). Others have used stable carbon and oxygen isotopes in dental enamel to track dietary changes in young children. Stable carbon (δ13C), for example, may be used to detect the introduction of solid foods, and hence the beginning of the weaning period, while oxygen isotopes (δ18O) may track the decreasing consumption of human milk (Wright and Schwarcz, 1998). Using this approach, it was found that, among the Preclassic and Postclassic Maya, solid foods were first introduced probably late in the first year, but that the weaning process was not concluded until 5 or 6 years (Wright and Schwarcz, 1998).”

“[E]xtensive field data, collected in modern traditional societies, ancient textual references, and biochemical evidence from prehistoric societies, all suggest that in humans, the ‘‘natural’’ weaning age is generally between 2 and 3 years, although it may continue longer in some groups.”

[4] Evidence type: experiment

John Dobbing and Jean Sands
Quantitative growth and development of human brain.
Arch Dis Child. 1973 Oct; 48(10): 757–767.

“One hundred and thirty-nine complete human brains ranging in age from 10 weeks’ gestation to 7 postnatal years, together with 9 adult brains, have been analysed in order to describe the human brain growth spurt quantitatively… The growth spurt period is much more postnatal than has formerly been supposed.”
[…]
“The postnatal cut-off point of the sigmoid curve of weight accumulation seems to be between 18 postnatal months and 2 years for whole brain.”
https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/brain-weight-curve.png

[5] Evidence type: review of experiments

Martin, Robert D.
Fifty-second James Arthur lecture on the evolution of the human brain 1982

[Emphasis ours]
“The foregoing comparisons have demonstrated that Homo sapiens shares a number of general features of brain size and its development with the other primates, most notably in producing precocial offspring and in the shift to a distinctive relationship between brain size and body size during foetal development (fig. 8). But human beings also exhibit a number of special features which set them apart from other primates, or at least from their closest relatives the great apes. These may be listed as follows:

  1. The remarkably large size of the adult brain relative to body size.
  2. The rapid development of both brain and body during foetal development, resulting in a distinctively large brain and body size at birth, compared to great apes.
  3. The greater degree of postnatal growth of the brain, accomplished by continuation of foetal brain : body relationships for at least one year after birth and associated with the “secondary altricial condition”.”

[This shows pattern of brain to body weight ratio, not just brain weight]
https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/brain-to-body-ratio.png

[6] Evidence type: experiment

Changes in brain weights during the span of human life: relation of brain weights to body heights and body weights.
Dekaban AS.

“More than 20,000 autopsy reports from several general hospitals were surveyed for the purpose of selecting brains without a pathological lesion that had been weighed in the fresh condition. From this number, 2,773 males and 1,963 females were chosen for whom body weight, body height, and cause of death had been recorded. The data were segregated into 23 age groups ranging from birth to 86+ years and subjected to statistical evaluation. Overall, the brain weights in males were greater than in females by 9.8%. The largest increases in brain weights in both sexes occurred during the first 3 years of life, when the value quadruples over that at birth, while during the subsequent 15 years the brain weight barely quintuples over that at birth.”
[…]
“[T]he largest increases in brain weight occur during the first year of life, when the weight more than doubles that at birth (see Tables 2, 3; Figs 2A, 5). Further increases in brain weight also occur quite rapidly, although the increments from the preceding age groups are smaller. At about 3 years of age in males and between 3 and 4 years in females, the brain weight reaches four times the value at birth. Further growth of the brain is considerably slower, as it takes the brain 15 years (between ages 4 and 18) to nearly quintuple its birth weight and reach its mean highest value in young adults.”

[7] Evidence type: review of human and non-human animal experiments

CIBA Foundation Symposium
John Wiley & Sons, Sep 16, 2009.

“The human growth spurt appears to extend throughout the last trimester of pregnancy and well into the 2nd year of postnatal life, and by analogy similar harm would be expected in the brains of humans growth-retarded during this time” [as rats in reported nutrient deficiency experiment].
“There is no question but that the transient period of brain growth, known as the brain growth spurt, is more vulnerable to growth restriction than the periods both before and afterwards. Vulnerability in this sense means that quite mild restriction leads in experimental animals to permanent, irrecoverable reduction in the trajectory of bodily growth and to easily detectable distortions and deficits in the adult brain… In the present case the ‘damage’ consists of permanent but non-uniform reduction in the extent of brain growth. There is accumulating evidence that it has functional importance. An important feature of this type of vulnerability is that it is highly dependent on the timing of the insult, although not as finely so as the earlier teratology.”

[8] Evidence type: review of experiments

Siegel GJ, Agranoff BW, Albers RW, et al., editors.
Basic Neurochemistry: Molecular, Cellular and Medical Aspects. 6th edition.
Philadelphia: Lippincott-Raven; 1999.

“The brain consumes about one-fifth of total body oxygen utilization”
“The brain is metabolically one of the most active of all organs in the body. This consumption of O2 provides the energy required for its intense physicochemical activity. The most reliable data on cerebral metabolic rate have been obtained in humans. Cerebral O2 consumption in normal, conscious, young men is approximately 3.5 ml/100 g brain/min (Table 31-1); the rate is similar in young women. The rate of O2 consumption by an entire brain of average weight (1,400 g) is then about 49 ml O2/min. The magnitude of this rate can be appreciated more fully when it is compared with the metabolic rate of the whole body. An average man weighs 70 kg and consumes about 250 ml O2/min in the basal state. Therefore, the brain, which represents only about 2% of total body weight, accounts for 20% of the resting total body O2 consumption. In children, the brain takes up an even larger fraction, as much as 50% in the middle of the first decade of life [15].”

[9] Evidence type: review

Stephen Cunnane (Editor), Kathlyn Stewart (Editor)
Human Brain Evolution: The Influence of Freshwater and Marine Food Resources
ISBN: 978-0-470-45268-4
June 2010, Wiley-Blackwell

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/brain-energy-table.png

[10] Evidence type: review

Delange F.
Proc Nutr Soc. 2000 Feb;59(1):75-9.

“I is required for the synthesis of thyroid hormones. These hormones, in turn, are required for brain development, which occurs during fetal and early postnatal life. The present paper reviews the impact of I deficiency (1) on thyroid function during pregnancy and in the neonate, and (2) on the intellectual development of infants and children. All extents of I deficiency (based on I intake (microgram/d); mild 50-99, moderate 20-49, severe <20) affect the thyroid function of the mother and neonate, and the mental development of the child. The damage increases with the extent of the deficiency, with overt endemic cretinism as the severest consequence. This syndrome combines irreversible mental retardation, neurological damage and thyroid failure. Maternal hypothyroxinaemia during early pregnancy is a key factor in the development of the neurological damage in the cretin. Se deficiency superimposed on I deficiency partly prevents the neurological damage, but precipitates severe hypothyroidism in cretins. I deficiency results in a global loss of 10-15 intellectual quotient points at a population level, and constitutes the world’s greatest single cause of preventable brain damage and mental retardation.”

[11] Evidence type: review

Micronutrient Deficiencies in the First Months of Life
edited by F. Delange, Keith P. West

“Iron plays a critical role in brain development, including its postnatal stages. Youdim et al., Youdim, Rouault and Beard reviewed the biological mechanisms whereby iron deficiency could possibly affect brain structure and functioning. The accumulation and distribution of iron in various regions of the brain depend on the stage of its development. This might indicate that brain regions vary in their vulnerability to iron deprivation, and suggests that the effect of iron deficiency on brain iron content could depend on the timing of the exposure. Animal studies indicate that low dietary intake of iron in the neonatal or preweaning period (before postnatal days 14-21) [ this must be rodents? ] may reduce whole-brain iron content that is not reversible by dietary repletion and produce irreversible effects. In rats, such effects occur before completion of brain organization and myelination and establishment of dopaminergic tracts. By contrast, dietary depletion in the postweaning period can also reduce brain iron content but this might be reversible upon dietary repletion. This illustrates that the timing of exposure to iron deficiency must be carefully considered when examining possible effects of iron deficiency on mental performance.
“Iron is not only required for brain growth and differentiation of neuronal cells, but also for protein synthesis, hormone production and other aspects of cellular energy metabolism and functioning. When sufficiently severe to reduce hemoglobin concentrations or cause anemia, iron deficiency might adversely affect oxygen delivery, thereby leading to reduced functioning of the central nervous system. Such deleterious effects of iron deficiency might be partially or completely reversed by iron repletion.
“Effects of iron deficiency might also be determined by other mechanisms. For example, it has been hypothesized that anemic children experience delayed acquisition of skills because they explore and interact less with their environment than nonanemic children, and they induce less stimulating behavior in their caretakers. Additionally, several studies have indicated that anemic children tend to be more fearful, withdrawn and tense, have reduced ability to focus their attention [25, 26], and are therefore less exposed to environmental stimuli that may promote mental and motor development.”

[12] Evidence type: review (mostly non-human animal experiments)

Bhatnagar S, Taneja S.
Br J Nutr. 2001 May;85 Suppl 2:S139-45.

“Abstract
“Cognition is a field of thought processes by which an individual processes information through skills of perception, thinking, memory, learning and attention. Zinc deficiency may affect cognitive development by alterations in attention, activity, neuropsychological behavior and motor development. The exact mechanisms are not clear but it appears that zinc is essential for neurogenesis, neuronal migration, synaptogenesis and its deficiency could interfere with neurotransmission and subsequent neuropsychological behavior. Studies in animals show that zinc deficiency during the time of rapid brain growth, or during the juvenile and adolescent period affects cognitive development by decreasing activity, increasing emotional behavior, impairing memory and the capacity to learn. Evidence from human studies is limited. Low maternal intakes of zinc during pregnancy and lactation were found to be associated with less focused attention in neonates and decreased motor functions at 6 months of age. Zinc supplementation resulted in better motor development and more playfulness in low birth weight infants and increased vigorous and functional activity in infants and toddlers. In older school going children the data is controversial but there is some evidence of improved neuropsychological functions with zinc supplementation. Additional research is required to determine the exact biological mechanisms, the critical periods, the threshold of severity and the long-term effects of zinc deprivation on cognitive development.”

[13] Evidence type: review

McNamara RK, Carlson SE.
Prostaglandins Leukot Essent Fatty Acids. 2006 Oct-Nov;75(4-5):329-49. Epub 2006 Sep 1.

“There is now good evidence suggesting that DHA is accrued in rodent, primate, and human brain during active periods of perinatal cortical maturation, and that DHA plays an important role in neuronal differentiation, synaptogenesis, and synaptic function. In animal studies, prenatal deficits in brain DHA accrual that are not corrected via postnatal dietary fortification are associated with enduring deficits in neuronal arborization, multiple indices of synaptic pathology, deficits in mesocorticolimbic dopamine neurotransmission, deficits in hippocampal serotonin and acetylcholine neurotransmission, neurocognitive deficits on hippocampus and frontal cortex-dependent learning tasks, and elevated behavioral indices of anxiety, aggression, and depression. Human and primate infants born preterm or fed diets without DHA postnatally exhibit lower cortical DHA accrual compared to infants born at term or fed human milk postnatally. Children/adolescents born preterm exhibit deficits in cortical gray matter expansion, neurocognitive deficits, and are at increased risk for attention-deficit/hyperactivity disorder (ADHD) and schizophrenia. Individuals diagnosed with ADHD or schizophrenia exhibit peripheral indices of lower DHA status and exhibit deficits in cortical gray matter expansion and deficits in cortical dopamine neurotransmission. Based on this body of evidence, it is hypothesized that perinatal deficits in brain DHA accrual represents a modifiable neurodevelopmental risk factor for the emergence of neurocognitive deficits and subsequent psychopathology. Evaluation of this hypothesis is currently feasible.”

[14] Evidence type: review

J D Cook
Am J Clin Nutr February 1990 vol. 51 no. 2 301-308

[ Emphasis mine ]
“Dietary iron supply encompasses both the total amount of ingested iron and its bioavailability. Before 1950, nutritionists emphasized only total iron intake as a measure of dietary adequacy. Wider application of isotopic techniques during the 1950s and 1960s led to the realization that the bioavailability of ingested iron may be more important than total intake. There are two separate pathways of iron entry into the mucosal cell. The largest fraction of dietary iron is in the form of inorganic or nonheme iron, the absorption of which is determined largely by the nature of the meal. Nonheme-iron absorption occurs mainly from the duodenum because of the greater solubility of luminal iron in the proximal, more acid, region of the small intestine. Isotopic studies with extrinsic labeling demonstrated that all dietary forms of nonheme iron ingested in the same meal form a common pool within the intestinal lumen. Absorption from this pool is determined not by the type of the iron ingested but by enhancers, which promote absorption by maintaining iron in a reduced soluble form, and inhibitors, which bind iron or make iron insoluble and prevent its uptake by the brush border (29-32). The bioavailability of nonheme iron is enhanced by ascorbic acid and various tissue foods, such as meat, fish, and poultry, but not dairy products (33). A large number of dietary constituents impair iron absorption and these factors have been the major focus of absorption studies during the past decade. The most important inhibitors include tea, coffee, bran, calcium phosphate, egg yolk, polyphenols, and certain forms of dietary fiber. The extremes in bioavailability of nonheme iron as measured from isotopically labeled single meals served in a laboratory setting is nearly 15-fold (Fig 2). If tea is eliminated, absorption will increase about threefold. If meat is added, absorption will again increase 2-3 times. Maximal enhancement of absorption occurs when a large quantity of ascorbic acid (eg, 2g) is taken with the meal.
“The second dietary iron fraction is heme, which is absorbed into the intestinal cell as an intact porphyrin complex. Specific receptors for heme iron have been identified in laboratory animals (34) but not in humans. After heme iron enters the cell it is rapidly degraded by heme oxygenase (35), and the released iron then enters the common intracellular iron pool. Subsequent mucosal handling of this iron appears to be identical to that of inorganic iron. Because heme iron remains protected within the porphyrin complex before its uptake by the mucosa, it does not interact with dietary ligands and is therefore unaffected by the nature of the meal. Percentage absorption of heme iron is 5-10-fold higher than from nonheme iron. Although heme represents only 10-15% of dietary iron in meat-eating populations, it may account for nearly one-third of absorbed iron (36). Because absorption of heme iron is constant and independent of meal composition, the contribution of heme iron can be readily calculated from dietary records. This is in distinction to marked differences in the availability of non-heme iron.”

[15] Evidence type: review

Clive E. West*,†,2, Ans Eilander*, and Machteld van Lieshout*
J. Nutr. September 1, 2002 vol. 132 no. 9 2920S-2926S

“The bioefficacy of β-carotene in plant foods is much less than was previously thought. Intervention studies enrolled schoolchildren in Indonesia (10) and breast-feeding women in Vietnam (11) (Table 1). In each study there were four dietary groups: low-retinol, low-carotenoid diet (negative control); dark-green leafy vegetables (as well as carrots in the Indonesian study); yellow and orange fruits; and a retinol-containing diet (positive control). For dark-green leafy vegetables, the bioefficacy was 1:26 and 1:28; while for fruit, the bioefficacy was 1:12. This suggests that, with a mixture of vegetables and fruits in a ratio of 4:1, which is typical for both developing and developed countries, the bioefficacy of β-carotene from a mixed diet is 1:21. Chinese children aged 5–6.5 y yielded similar results for green and yellow vegetables (1:27) (14). Van Lieshout et al. (15), using the plateau isotopic enrichment method, also found relatively poor bioefficacy of β-carotene in dark-green leafy vegetables. β-Carotene in pumpkin was 1.7 times as potent as that in spinach (Table 1).”

[16] See also the extensive review in the Vitamin A chapter of Dietary Reference Intakes for Vitamin A, Vitamin K, Arsenic, Boron, Chromium, Copper, Iodine, Iron, Manganese, Molybdenum, Nickel, Silicon, Vanadium, and Zinc Panel on Micronutrients, Subcommittees on Upper Reference Levels of Nutrients and of Interpretation and Use of Dietary Reference Intakes, and the Standing Committee on the Scientific Evaluation of Dietary Reference Intakes, which is accessible and would take a lot of space to include here.
[17] Evidence type: review

Crawford MA.
Am J Clin Nutr. 1993 May;57(5 Suppl):703S-709S; discussion 709S-710S.

“The brain is 60% structural lipid, which universally uses arachidonic acid (AA; 20:4n6) and docosahexaenoic acid (DHA; 22:6n-3) for growth, function, and integrity. Both acids are consistent components of human milk. Experimental evidence in animals has demonstrated that the effect of essential fatty acid deficiency during early brain development is deleterious and permanent. The risk of neurodevelopmental disorder is highest in the very-low-birth-weight babies. Babies born of low birth weight or prematurely are most likely to have been born to mothers who were inadequately nourished, and the babies tend to be born with AA and DHA deficits. Because disorders of brain development can be permanent, proper provision should be made to protect the AA and DHA status of both term and preterm infants to ensure optimum conditions for the development of membrane-rich systems such as the brain, nervous, and vascular systems.”

[18] Evidence type: experiment

O’Brien JS, Sampson EL.
J Lipid Res. 1965 Oct;6(4):537-44.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/cholesterol-percent.png

[19] Evidence type: review

William M. Pardridge
Chapter in Fuel Homeostasis and the Nervous System, Volume 291 of the series Advances in Experimental Medicine and Biology pp 43-53

[ Emphasis mine ]
“Although free fatty acids are an important carbon source for cellular combustion in tissues such as skeletal muscle, fat, or liver in the postabsorptive state, brain does not significantly combust circulating free fatty acid, even after several weeks of prolonged starvation.(32) This failure to oxidize circulating free fatty acids is not due to a deficiency of the relevant free fatty acid oxidizing enzymes in brain since labeled free fatty acids are readily converted to CO2 following the intracerebral administration of [14C]-labeled free fatty acid,(33) and small amounts of circulating free fatty acids are converted to Krebs cycle intermediates (34). Rather, the failure of brain to utilize circulating free fatty acids as an important source of combustible carbon is due, in part, to a slow transport through the BBB. In the absence of plasma proteins, both medium chain and long chain free fatty acids are rapidly transported through the BBB.(35) However, free fatty acids are more than 99% bound by high affinity binding sites on circulating albumin, and only approximately 5% of plasma free fatty acid is unidirectionally extracted by brain on a single pass through the cerebral microcirculation.(36) Moreover, there is a prominent enzymatic barrier to the utilization of the circulating free fatty acids,3 as depicted in Figure 5. There is rapid esterification into membrane-bound triglyceride of circulating free fatty acid at either the endothelial membrane or the brain cell membrane. Thus, in the steady state, an equal amount of free fatty acid taken up by brain and esterified in the endothelial or brain cell membranes is released to blood via hydrolysis of membrane-bound triglyceride via brain microvascular lipoprotein lipase.3 This enzymatic barrier protecting brain intracellular space from circulating free fatty acids is very well developed and breaks down only under pathologic conditions in brain.”
[ As far as I can tell, it has only recently been discovered that there are some mechanisms for transporting fatty acids across the blood brain barrier, but how much and under what circumstances is poorly understood. This statement expresses that candidly: ]

Murphy EJ.
J Neurochem. 2015 Dec;135(5):845-8. doi: 10.1111/jnc.13289. Epub 2015 Sep 17.

“How do fatty acids enter the brain and what role, if any, do membrane and cytosolic fatty acid binding proteins have on facilitating this process? This is a fundamental question that many lipid neurochemists will freely admit they cannot answer in any kind of definitive manner.”

[20] Evidence type: experiment

O’Brien JS, Sampson EL.
J Lipid Res. 1965 Oct;6(4):537-44.

https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/white-gray-chol.png
[ palmitic = 16:0, stearic = 18:0, oleic = 18:1 ]

[21] Evidence type: non-human animal experiment

Fatty acid transport and utilization for the developing brain.
Edmond J, Higa TA, Korsak RA, Bergner EA, Lee WN.
J Neurochem. 1998 Mar;70(3):1227-34.

[ Emphasis mine ]
“To determine the transport and utilization of dietary saturated, monounsaturated, and n-6 and n-3 polyunsaturated fatty acids for the developing brain and other organs, artificially reared rat pups were fed a rat milk substitute containing the perdeuterated (each 97 atom% deuterium) fatty acids, i.e., palmitic, stearic, oleic, linoleic, and linolenic, from day 7 after birth to day 14 as previously described. Fatty acids in lipid extracts of the liver, lung, kidney, and brain were analyzed by gas chromatography-mass spectrometry to determine their content of each of the deuterated fatty acids. The uptake and metabolism of perdeuterated fatty acid lead to the appearance of three distinct groups of isotopomers: the intact perdeuterated, the newly synthesized (with recycled deuterium), and the natural unlabeled fatty acid. The quantification of these isotopomers permits the estimation of uptake and de novo synthesis of these fatty acids. Intact perdeuterated palmitic, stearic, and oleic acids from the diet were found in liver, lung, and kidney, but not in brain. By contrast, perdeuterated linoleic acid was found in all these organs. Isotopomers of fatty acid from de novo synthesis were observed in palmitic, oleic, and stearic acids in all tissues. The highest enrichment of isotopomers with recycled deuterium was found in the brain. The data indicate that, during the brain growth spurt and the prelude to myelination, the major saturated and monounsaturated fatty acids in brain lipids are exclusively produced locally by de novo biosynthesis. Consequently, the n-6 and n-3 polyunsaturated fatty acids must be transported and delivered to the brain by highly specific mechanisms.”

[22] Evidence type: review

Brain Cholesterol: Long Secret Life Behind a Barrier
Ingemar Björkhem, Steve Meaney
Arteriosclerosis, Thrombosis, and Vascular Biology. 2004; 24: 806-815

“Although an immense knowledge has accumulated concerning regulation of cholesterol homeostasis in the body, this does not include the brain, where details are just emerging. Approximately 25% of the total amount of the cholesterol present in humans is localized to this organ, most of it present in myelin. Almost all brain cholesterol is a product of local synthesis, with the blood-brain barrier efficiently protecting it from exchange with lipoprotein cholesterol in the circulation. Thus, there is a highly efficient apolipoprotein-dependent recycling of cholesterol in the brain, with minimal losses to the circulation.”

[23] Evidence type: review

Morris AA
J Inherit Metab Dis. 2005;28(2):109-21.

“The second function of KBs in the brain is to provide substrates for the synthesis of various molecules. KBs are particularly important for the synthesis of lipids, such as cholesterol in myelin. Studies in 18-day-old rats found that KBs are incorporated into brain cholesterol and fatty acids much more readily than glucose is incorporated (Webber and Edmond 1977). Studies of cultured mouse astrocytes and neurons gave similar results (Lopes-Cardozo et al 1986). The preferential use of KBs for lipid synthesis probably occurs because they can be converted directly to acetoacetyl-CoA in the cytoplasm by acetoacetyl-CoA synthetase (EC 6.2.1.16, see Figure 1). Cytosolic acetoacetyl-CoA thiolase can then convert acetoacetyl-CoA to acetyl-CoA. Cytosolic acetyl-CoA can be generated from glucose (via the tricarboxylic acid cycle and ATP-citrate lyase, Figure 1) but this is a less direct pathway due to the inability of acetyl-CoA to cross the mitochondrial inner membrane. KBs are incorporated into fatty acids in the brain but they are primarily used for cholesterol synthesis (Koper et al 1981). Acetoacetyl-CoA synthetase expression in human brain parallels that of HMG-CoA reductase (EC 1.1.1.34), providing further evidence for the importance of this pathway in sterol synthesis (Ohgami et al 2003). Although KBs are the preferred substrates for brain lipogenesis, they appear not to be essential. Thus, rats fed a hypoketogenic diet develop normally (Auestad et al 1990). Development is also normal in most human patients with defects of ketogenesis (Morris et al 1998; van der Knaap et al 1998), though imaging sometimes shows white-matter abnormalities (see Clinical Considerations below).”

[24] Evidence type: review

Yeh YY, Sheehan PM.
Fed Proc. 1985 Apr;44(7):2352-8.

[ Emphasis ours ]
“Persistent mild hyperketonemia is a common finding in neonatal rats and human newborns, but the physiological significance of elevated plasma ketone concentrations remains poorly understood. Recent advances in ketone metabolism clearly indicate that these compounds serve as an indispensable source of energy for extrahepatic tissues, especially the brain and lung of developing rats. Another important function of ketone bodies is to provide acetoacetyl-CoA and acetyl-CoA for synthesis of cholesterol, fatty acids, and complex lipids. During the early postnatal period, acetoacetate (AcAc) and beta-hydroxybutyrate are preferred over glucose as substrates for synthesis of phospholipids and sphingolipids in accord with requirements for brain growth and myelination. Thus, during the first 2 wk of postnatal development, when the accumulation of cholesterol and phospholipids accelerates, the proportion of ketone bodies incorporated into these lipids increases. On the other hand, an increased proportion of ketone bodies is utilized for cerebroside synthesis during the period of active myelination. In the lung, AcAc serves better than glucose as a precursor for the synthesis of lung phospholipids. The synthesized lipids, particularly dipalmityl phosphatidylcholine, are incorporated into surfactant, and thus have a potential role in supplying adequate surfactant lipids to maintain lung function during the early days of life. Our studies further demonstrate that ketone bodies and glucose could play complementary roles in the synthesis of lung lipids by providing fatty acid and glycerol moieties of phospholipids, respectively. The preferential selection of AcAc for lipid synthesis in brain, as well as lung, stems in part from the active cytoplasmic pathway for generation of acetyl-CoA and acetoacetyl-CoA from the ketone via the actions of cytoplasmic acetoacetyl-CoA synthetase and thiolase.”

[25] Evidence type: non-human animal experiment

Koper JW, Zeinstra EC, Lopes-Cardozo M, van Golde LM.
Biochim Biophys Acta. 1984 Oct 24;796(1):20-6.

“We have compared glucose and acetoacetate as precursors for lipogenesis and cholesterogenesis by oligodendrocytes and astrocytes, using mixed glial cultures enriched in oligodendrocytes. In order to differentiate between metabolic processes in oligodendrocytes and those in astrocytes, the other major cell type present in the mixed culture, we carried out parallel incubations with cultures from which the oligodendrocytes had been removed by treatment with anti-galactocerebroside serum and guinea-pig complement. The following results were obtained: 1. Both oligodendrocytes and astrocytes in culture actively utilize acetoacetate as a precursor for lipogenesis and cholesterogenesis. 2. In both cell types, the incorporation of acetoacetate into fatty acids and cholesterol exceeds that of glucose by a factor of 5-10 when the precursors are present at concentrations of 1 mM and higher. 3. Glucose stimulates acetoacetate incorporation into fatty acids and cholesterol, whereas acetoacetate reduces the entry of glucose into these lipids. This suggests that glucose is necessary for NADPH generation, but that otherwise the two precursors contribute to the same acetyl-CoA pool. 4. Both with acetoacetate and with glucose as precursor, oligodendrocytes are more active in cholesterol synthesis than astrocytes. 5. Using incorporation of 3H2O as an indicator for total lipid synthesis, we estimated that acetoacetate contributes one third of the acetyl groups and glucose one twentieth when saturating concentrations of both substrates are present.”

[26] Evidence type: experiment

“A total of 272 venous blood samples was obtained from umbilical cord and from children of varying ages from birth to 8 years. All were analysed for blood glucose and either FFA, glycerol or ketone bodies.”
[ Fasted overnight ]
https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/perssongentzketonechildren.png

[27] Evidence type: experiment

Kraus H, Schlenker S, Schwedesky D.
Hoppe Seylers Z Physiol Chem. 1974 Feb;355(2):164-70.

“Removal of circulating ketone bodies by the brain is greater in newborns than in infants. Both values are higher than those reported in adults [14]. This is demonstrated by the differences in the slopes of the regression lines. From these data, however, the conclusion cannot be drawn that there is a specific enhancement of ketone body metabolism in the brains of young individuals. The total metabolic rate could be increased in the infant brain due to denser arrangement of blood capillaries, shorter diffusion distances and a higher cerebral blood flow. In order to avoid objections arising from these differences the contribution of ketone bodies to the total oxidative metabolism of the brain was calculated (last row of Table 1). Hence it follows that the brain’s capacity to utilize ketone bodies is specifically increased in newborns in comparison with infants. These values in turn are five and four times higher respectively than those reported in adults [14]. This conclusion is also justified by the finding that the contribution of glucose is not significantly altered throughout the different age groups. Corresponding relative values in newborns, infants and adults are 0.26, 0.27, and 0.33. The results of the present paper are confirmed by the report that the estimated cerebral uptake of ketone bodies in a group of older children (aged up to 14 years) was about three to four times higher than values observed in adults [19]. As cited above it was shown in different animals that the capacity to utilize ketone bodies is higher in the infant than in the adult brain [2, 20]. The increased ketone utilization by the animal brain during the neonatal period resulted from higher activities of the enzymes of ketone body utilization. Whether this also applies to the human infant brain remains to be tested.”
https://ketotic.mytimpani.co.uk/wp-content/uploads/2016/09/cerebral-ketone-av.png

[28] Evidence type: experiment

P F Bougneres, C Lemmel, P Ferré, and D M Bier
J Clin Invest. 1986 Jan; 77(1): 42–48.

[ Emphasis ours ]
“Using a continuous intravenous infusion of D-(-)-3-hydroxy[4,4,4-2H3]butyrate tracer, we measured total ketone body transport in 12 infants: six newborns, four 1-6-mo-olds, one diabetic, and one hyperinsulinemic infant. Ketone body inflow-outflow transport (flux) averaged 17.3 +/- 1.4 mumol kg-1 min-1 in the neonates, a value not different from that of 20.6 +/- 0.9 mumol kg-1 min-1 measured in the older infants. This rate was accelerated to 32.2 mumol kg-1 min-1 in the diabetic and slowed to 5.0 mumol kg-1 min-1 in the hyperinsulinemic child. As in the adult, ketone turnover was directly proportional to free fatty acid and ketone body concentrations, while ketone clearance declined as the circulatory content of ketone bodies increased. Compared with the adult, however, ketone body turnover rates of 12.8-21.9 mumol kg-1 min-1 in newborns fasted for less than 8 h, and rates of 17.9-26.0 mumol kg-1 min-1 in older infants fasted for less than 10 h, were in a range found in adults only after several days of total fasting. If the bulk of transported ketone body fuels are oxidized in the infant as they are in the adult, ketone bodies could account for as much as 25% of the neonate’s basal energy requirements in the first several days of life. These studies demonstrate active ketogenesis and quantitatively important ketone body fuel transport in the human infant. Furthermore, the qualitatively similar relationships between the newborn and the adult relative to free fatty acid concentration and ketone inflow, and with regard to ketone concentration and clearance rate, suggest that intrahepatic and extrahepatic regulatory systems controlling ketone body metabolism are well established by early postnatal life in humans.”

[29] Evidence type: experiment

Harrington TA, Thomas EL, Modi N, Frost G, Coutts GA, Bell JD.
Lipids. 2002 Jan;37(1):95-100.

“The role of body fat content and distribution in infants is becoming an area of increasing interest, especially as perception of its function appears to be rapidly evolving. Although a number of methods are available to estimate body fat content in adults, many are of limited use in infants, especially in the context of regional distribution and internal depots. In this study we developed and implemented a whole-body magnetic resonance imaging (MRI)-based protocol that allows fast and reproducible measurements of adipose tissue content in newborn infants, with an intra-observer variability of <2.4% and an inter-observed variability of <7%. The percentage total body fat for this cohort of infants ranged from 13.3-22.6% (mean and standard deviation: 16.6 +/- 2.9%), which agrees closely with published data. Subcutaneous fat accounted for just over 89% of the total body fat, whereas internal fat corresponded to almost 11%, most of which was nonabdominal fat. There were no gender differences in total or regional body fat content. These results show that whole-body MRI can be readily applied to the study of adipose tissue content and distribution in newborn infants. Furthermore, its noninvasive nature makes it an ideal method for longitudinal and interventional studies in newborn infants.”

[30] Evidence type: review

[ Emphasis ours ]
“The likelihood that the composition of fatty acids delivered to the fetus can affect the quality of fetal development is more compelling. The concentration of DHA in the brain of neonates is dependent on the intake of pre-formed DHA (Farquharson et al., 1993; Jamieson et al., 1999; Makrides et al., 1994) and many workers have reported beneficial effects of LCPUFA intake in early post-natal life in particular (Birch et al., 1992 ; Hoffman et al., 1993; Horwood, Darlow and Mogridge, 2001; Lucas et al., 1992, 1998). In this context, much has been made of the relatively high concentration of DHA in the fetal brain at term and the importance of in utero DHA supply but this is not specifically a fetal/placental issue as the human brain, including that of the pregnant mother, maintains a high concentration of DHA throughout life. Furthermore the total amount of DHA present in the fetal brain at term is not much greater than that in the placenta itself. An issue that is much more clearly specific to the fetus and placenta is the very high concentration of DHA and AA achieved in the fetal adipose tissue and the fact that 16 times more DHA is stored in the adipose tissue than is deposited in the fetal brain during in utero life. Within a few hours of birth there is a dramatic rise in plasma TG and NEFA indicating mobilization of adipose tissue stores ( Van Duyne & Havel, 1959 ) such that the concentration of DHA in the adipose tissue is undetectable after two months of post-natal life on a diet devoid of pre-formed DHA ( Farquharson et al., 1993). Thus the importance of this adipose store of LCPUFA may be to protect processes such as brain and retinal development against a poor dietary supply of LCPUFA during the critical first months of post-natal life. The fact that most of the LCPUFA such as DHA which is accrued by the fetus is actually stored in fetal adipose tissue also implies that there is normally an excess availability in utero for development of the fetal organs and tissues and that it is only in low birth weight babies, where the body fat content may be very low (Sparks et al., 1980), that the supply of LCPUA may become limiting for fetal requirements during in utero life.”

Microbiome Nonsense: response to “Chowing Down On Meat”

As the claim that animal protein and saturated fat are unhealthy becomes less and less tenable, those who have the intuition that animal-based nutrition must be bad for you are looking elsewhere.
There was great excitement at the end of 2013 about a study published in Nature demonstrating the rapid changes in human gut microbes in response to animal-based vs. plant-based diets [1].
The paper is very interesting, and it has a lot of original data of a kind we’ve often wished for.
The authors then go on to interpret their findings without apparent restraint.
A report on the study on NPR, called “Chowing Down On Meat, Dairy Alters Gut Bacteria A Lot, And Quickly”, gets right to the point:

"Looks like Harvard University scientists have given us another reason to walk past the cheese platter at holiday parties and reach for the carrot sticks instead: Your gut bacteria will thank you."

and finally:

""I mean, I love meat," says microbiologist Lawrence David, who contributed to the study and is now at Duke University. "But I will say that I definitely feel a lot more guilty ordering a hamburger … since doing this work," he says."

That’s right.
The excitement in the blog-o-sphere was not so much about the clear results — that the changes in the gut flora in response to diet are fast and large — but about the authors’ opinions that the observed changes support a link between meat consumption and inflammatory bowel disease (IBD).
We take exception to these claims: they are not well founded by the data in this study, or in any other; the data simply do not warrant the conclusions.
We consider it irresponsible at best to suggest that a dietary practice is harmful to health when the evidence is weak, especially when one is in a position of authority and subject to high publicity.
Here are the points we address: the claims the authors make about inflammatory bowel disease, the reasons those claims are not warranted, and what might happen if a human sufferer of IBS went on an animal foods only diet.

The Claims about Inflammatory Bowel Disease

Here are some quotes from the paper stressing the possible dangers of a carnivorous diet based on a supposed link to IBD — inflammatory bowel disease.
Notice that they use language that implies the claims are proven, when, as we will show, they are not.

"increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease [6]" — Abstract
"Bile acids have been shown to cause inflammatory bowel disease in mice by stimulating the growth of the bacterium Bilophila[6], which is known to reduce sulphite to hydrogen sulphide via the sulphite reductase enzyme DsrA (Extended Data Fig. 10)." — from figure 5, page 4.
"Mouse models have also provided evidence that inflammatory bowel disease can be caused by B. wadsworthia, a sulphite-reducing bacterium whose production of H2S is thought to inflame intestinal tissue [6]. Growth of B. wadsworthia is stimulated in mice by select bile acids secreted while consuming saturated fats from milk. Our study provides several lines of evidence confirming that B. wadsworthia growth in humans can also be promoted by a high-fat diet. First, we observed B. wadsworthia to be a major component of the bacterial cluster that increased most while on the animal-based diet (cluster 28; Fig. 2 and Supplementary Table 8). This Bilophila-containing cluster also showed significant positive correlations with both long-term dairy (P , 0.05; Spearman correlation) and baseline saturated fat intake (Supplementary Table 20), supporting the proposed link to milk-associated saturated fats[6]. Second, the animal-based diet led to significantly increased faecal bile acid concentrations (Fig. 5c and Extended Data Fig. 9). Third, we observed significant increases in the abundance of microbial DNA and RNA encoding sulphite reductases on the animal-based diet (Fig. 5d, e). Together, these findings are consistent with the hypothesis that diet-induced changes to the gut microbiota may contribute to the development of inflammatory bowel disease." — last paragraph, emphasis ours.

This concern is prominent in the paper;
they start with it and end with it.
It is based on a single citation to a study in mice.

Reasons those claims are not warranted

Let’s look at that study (Dietary-fat-induced taurocholic acid promotes pathobiont expansion and colitis in Il10−/− mice [2]):

Here’s the abstract (emphasis ours):

"The composite human microbiome of Western populations has probably changed over the past century, brought on by new environmental triggers that often have a negative impact on human health1. Here we show that consumption of a diet high in saturated (milk-derived) fat, but not polyunsaturated (safflower oil) fat, changes the conditions for microbial assemblage and promotes the expansion of a low-abundance, sulphite-reducing pathobiont, Bilophila wadsworthia2. This was associated with a pro-inflammatory T helper type 1 (TH1) immune response and increased incidence of colitis in genetically susceptible Il10−/−, but not wild-type mice. These effects are mediated by milk-derived-fat-promoted taurine conjugation of hepatic bile acids, which increases the availability of organic sulphur used by sulphite-reducing microorganisms like B. wadsworthia. When mice were fed a low-fat diet supplemented with taurocholic acid, but not with glycocholic acid, for example, a bloom of B. wadsworthia and development of colitis were observed in Il10−/− mice. Together these data show that dietary fats, by promoting changes in host bile acid composition, can markedly alter conditions for gut microbial assemblage, resulting in dysbiosis that can perturb immune homeostasis. The data provide a plausible mechanistic basis by which Western-type diets high in certain saturated fats might increase the prevalence of complex immune-mediated diseases like inflammatory bowel disease in genetically susceptible hosts."

Translation:
They took some mice who were particularly susceptible to colitis, and also some regular mice, and fed them one of three different diets: a low fat diet (if we're reading it correctly, they used the AIN-93M Purified Diet from Harlan, which is about 10% fat), or a diet with 37% fat that was either polyunsaturated (safflower oil) or saturated milk fat. They didn't specify the amount of carbohydrate or protein, but we assume the diets were about 10-15% protein, leaving about 50% carbohydrate.
The mice who had the high milk-fat diet had a significant increase in the gut bacteria called Bilophila wadsworthia.
The susceptible mice on the high milk-fat diet got colitis at a high rate (more than 60% in 6 months).
The other susceptible mice, those on the low-fat or polyunsaturated fat diets, also got colitis, but at a lower rate (25-30%).
The regular mice didn’t get colitis, even on the high milk-fat diet.

What’s the problem with knockout mice?

The mice that got colitis were susceptible because they were genetically manipulated to not function normally.
Specifically, they couldn’t produce something called interleukin-10 (IL-10).
IL-10 has many complex actions including fighting against inflammation in multiple ways.
The argument made by the scientists is that Bilophila wadsworthia must induce inflammation, and that colitis probably comes about in people who are less effective at fighting that inflammation, just like the knockout mice.
This seems intuitive, but it is certainly not proven by the experiment.

Look at it this way:
Suppose we didn’t know the cause of phenylketonuria, a genetic disorder that makes the victim unable to make the enzyme necessary to process the amino acid phenylalanine. We could knock out that gene in an animal, feed it phenylalanine, watch it suffer retardation and seizures, and conclude that phenylalanine must promote brain problems. This would be a mistake, of course. Phenylalanine is an essential amino acid occurring in breast milk. As far as we know, there is nothing unhealthy about it, as long as you don’t have a genetic mutation interfering with its metabolism.

It is, of course, possible that Bilophila wadsworthia inflames the colon.
As a hypothesis, based on this study, it is not by itself objectionable.
What we object to is the leap to citing Bilophila wadsworthia as causing colitis, as in the second excerpt above, which we repeat here:

"Bile acids have been shown to cause inflammatory bowel disease in mice by stimulating the growth of the bacterium Bilophila[6], which is known to reduce sulphite to hydrogen sulphide via the sulphite reductase enzyme DsrA (Extended Data Fig. 10)." — from figure 5, page 4.

In fact, Bilophila did not appear to affect the normal mice at all!
There is no claim that the genetic mutation in the mice has any relation to genetic susceptibility to IBD in humans,
yet it is implied that natural human susceptibility might work the same way.

Hydrogen Sulfide

In the knockout mice study, a second experiment was done to determine whether the Bilophila wadsworthia seen in the milk-fat condition came from a particular bile acid, taurocholic acid.
They fed the knockout mice a low fat diet supplemented with either taurocholic acid (TC), or glycocholic acid (GC).
They confirmed that Bilophila wadsworthia was increased by taurocholic acid and not by glycocholic acid.
What else do we know about taurocholic acid?
According to the authors of this study, it is "a rich source of organic sulphur, […] resulting in the formation of H2S [hydrogen sulfide]".
In one figure they even demonstrated the presence of Bilophila wadsworthia by the presence of H2S.
But H2S can be beneficial:

  • There is emerging evidence that H2S has diverse anti-inflammatory effects, as well as pro-inflammatory effects, possibly only at very high levels [3].
  • The levels needed for harm are probably higher than occur naturally [4].
  • H2S levels in the blood are associated with high HDL, low LDL, and high adiponectin in humans [5], all considered good things.

Moreover, there is now evidence that colon cells in particular can actually use H2S as fuel, and lots of it.
Other researchers have used a similar argument in the opposite way.
They claim that eating fiber is healthy,
because of the butyrate generated from it in the colon, which colon cells then use as fuel.
While we have problems with that argument,
it shows a pervasive bias:
using the argument when it supports plants, but ignoring it when it doesn’t.
Taking all this into account, it is not at all clear that the higher levels of sulfite-reducing bacteria seen in the meat and cheese eaters were unhealthy.

What would happen if a human sufferer of IBS went on an animal foods only diet?

It’s clear that these researchers are not studying IBS at all.
They were studying gut bacteria, found an association, and cherry-picked one study suggesting that what they found in the animal diet results might be unhealthy.
If they were studying IBS, they might have noticed reasons to hypothesise that a diet low in fiber [6], [7], carbohydrates [8], or fermentable carbohydrates [9] would help IBS sufferers.
If humans who are susceptible to IBS are susceptible in the same way as the knockout mice in the cited study, then these results might be surprising.
Instead, these results, in combination with the animal diet paper, should further decrease our belief that the mice results have any relevance at all.
Moreover, unless the authors are advocating a diet of low-fiber, low-carb plants (we can’t think of any plants like that off the top of our heads…), they are encouraging IBS sufferers to eat foods that may worsen their condition.
We don’t know what would happen in an all meat trial for IBS, but we’d love to find out.

In Sum

The supposed link between the animal diet and inflammatory bowel disease is composed of a chain of weak links:
a kind of bacteria they found in those eating meat and cheese was also found in a mouse study that suggested a link between the bacteria and IBD.
However:

  • It used animals that were genetically engineered to not function normally.
  • It did not and cannot establish causality between the observed gut bacteria changes and the increased level of disease.
  • It was merely an observation of the two coinciding along with a plausible mechanism, i.e. a clever story about how this might be a causal relationship.

This plausible mechanism is not as clean a story as it appears. Presenting it as such is downright misleading.

References

[1] Diet rapidly and reproducibly alters the human gut microbiome

Lawrence A. David, Corinne F. Maurice, Rachel N. Carmody, David B. Gootenberg, Julie E. Button, Benjamin E. Wolfe, Alisha V. Ling, A. Sloan Devlin, Yug Varma, Michael A. Fischbach, Sudha B. Biddinger, Rachel J. Dutton & Peter J. Turnbaugh
Nature (2013) doi:10.1038/nature12820
[2] Dietary-fat-induced taurocholic acid promotes pathobiont expansion and colitis in Il10−/− mice

Suzanne Devkota, Yunwei Wang, Mark W. Musch, Vanessa Leone, Hannah Fehlner-Peach, Anuradha Nadimpalli, Dionysios A. Antonopoulos, Bana Jabri & Eugene B. Chang
Nature (2012) doi:10.1038/nature11225
[3] Evidence type: review and non-human animal experiment

Wallace JL.
Trends Pharmacol Sci. 2007 Oct;28(10):501-5. Epub 2007 Sep 19.

"The notion of H2S being beneficial at physiological concentrations but detrimental at supraphysiological concentrations bears similarity to the situation with nitric oxide (NO), another gaseous mediator, which shares many biological effects with H2S. Also in common with NO, there is emerging evidence that physiological concentrations of H2S produce anti-inflammatory effects, whereas higher concentrations, which can be produced endogenously in certain circumstances, can exert pro-inflammatory effects [5]. Here, I focus on the anti-inflammatory effects of H2S, and on the concept that these effects can be exploited in the development of more effective and safer anti-inflammatory drugs."

[4] Evidence type: review and non-human animal experiment

Wallace JL.
Trends Pharmacol Sci. 2007 Oct;28(10):501-5. Epub 2007 Sep 19.

(Emphasis ours)
"How much H2S is physiological?
"H2S is present in the blood of mammals at concentrations in the 30–100 m M range, and in the brain at concentrations in the 50–160 m M range [1–3]. Even after systemic administration of H2S donors at doses that produce pharmacological effects, plasma H2S concentrations seldom rise above the normal range, or do so for only a very brief period of time [24,27]. This is, in part, due to the efficient systems for scavenging, sequestering and metabolizing H2S. Metabolism of H2S occurs through methylation in the cytosol and through oxidation in mitochondria, and it is mainly excreted in the urine [1]. It can be scavenged by oxidized glutathione or methemoglobin, and can bind avidly to hemoglobin. Exposure of certain external surfaces andtissues to H2S can trigger inflammation [28], perhaps because of a relative paucity of the above-mentioned scavenging, metabolizing and sequestering systems. The highest concentrations of H2S in the body occur in the lumen of the colon, although there is some disagreement [29] as to whether theconcentrations of ‘free’ H2S really reach the millimolar concentrations that have been reported in some studies [30,31]. Although often alluded to [32,33], there is no direct evidence that H2S causes damage to colonic epithelial cells. Indeed, colonocytes seem to be particularly well adapted to use H2S as a metabolic fuel [4].
"There have been several suggestions that H2S might trigger mutagenesis, particularly in the colon. For example, one recent report [33] suggested that the concentrations of H2S in ‘the healthy human and rodent colon’ are genotoxic. Despite the major conclusion of that study, the authors observed that exposure of cultured colon cancer epithelial cells (i.e. transformed cells) to concentrations of Na2S as high as 2 mM for 72 hours did not cause any changes consistent with a genotoxic effect (nor cell death). It was only when the experiments were performed in the presence of two inhibitors of DNA repair, and only with a concentration of 2 mM, that they were able to detect a significant genotoxic signal. It is also important to bear in mind that the concentrations of H2S used in studies such as that described above are often referred to as those found in the ‘healthy’ colon. Clearly, if concentrations of H2S in the healthy colon do reach the levels reported, and if H2S has the capacity to produce genotoxic changes and/or to reduce epithelial viability, there must be systems in place to prevent the putative untoward effects of this gaseous mediator – otherwise, the colon would probably not be ‘healthy’"

[5] Evidence type: observational

Jain SK, Micinski D, Lieblong BJ, Stapleton T.
Atherosclerosis. 2012 Nov;225(1):242-5. doi: 10.1016/j.atherosclerosis.2012.08.036. Epub 2012 Sep 10.

"Hydrogen sulfide (H2S) is an important signaling molecule whose blood levels have been shown to be lower in certain disease states. Increasing evidence indicates that H2S plays a potentially significant role in many biological processes and that malfunctioning of H2S homeostasis may contribute to the pathogenesis of vascular inflammation and atherosclerosis. This study examined the fasting blood levels of H2S, HDL-cholesterol, LDL-cholesterol, triglycerides, adiponectin, resistin, and potassium in 36 healthy adult volunteers. There was a significant positive correlation between blood levels of H2S and HDL-cholesterol (r=0.49, p=0.003), adiponectin (r=0.36, p=0.04), and potassium (r=0.34, p=0.047), as well as a significant negative correlation with LDL/HDL levels (r= -0.39, p=0.02). "

[6] Evidence type: preliminary experiment

J. T. Woolner and G. A. Kirby
Journal of Human Nutrition and Dietetics Volume 13, Issue 4, pages 249–253, August 2000

"Abstract
Introduction: High-fibre diets are frequently advocated for the treatment of irritable bowel syndrome (IBS) although there is little scientific evidence to support this. Experience of patients on low-fibre diets suggests that this may be an effective treatment for IBS, warranting investigation.
Methods: Symptoms were recorded for 204 IBS patients presenting in the gastroenterology clinic. They were then advised on a low-fibre diet with bulking agents as appropriate. Symptoms were reassessed by postal questionnaire 4 weeks later. Patients who had improved on the diet were advised on the gradual reintroduction of different types of fibre to determine the quantity and type of fibre tolerated by the individual.
Results: Seventy-four per cent of questionnaires were returned. A significant improvement (60–100% improvement in overall well-being) was recorded by 49% of patients.
Conclusion: This preliminary study suggests that low-fibre diets may be an effective treatment for some IBS patients and justifies further investigation as a full clinical trial."

[7] Evidence type: Review

Eswaran S, Muir J, Chey WD.
Am J Gastroenterol. 2013 May;108(5):718-27. doi: 10.1038/ajg.2013.63. Epub 2013 Apr 2.

"Abstract
Despite years of advising patients to alter their dietary and supplementary fiber intake, the evidence surrounding the use of fiber for functional bowel disease is limited. This paper outlines the organization of fiber types and highlights the importance of assessing the fermentation characteristics of each fiber type when choosing a suitable strategy for patients. Fiber undergoes partial or total fermentation in the distal small bowel and colon leading to the production of short-chain fatty acids and gas, thereby affecting gastrointestinal function and sensation. When fiber is recommended for functional bowel disease, use of a soluble supplement such as ispaghula/psyllium is best supported by the available evidence. Even when used judiciously, fiber can exacerbate abdominal distension, flatulence, constipation, and diarrhea."

[8] Evidence Type: uncontrolled experiment

Austin GL, Dalton CB, Hu Y, Morris CB, Hankins J, Weinland SR, Westman EC, Yancy WS Jr, Drossman DA.
Clin Gastroenterol Hepatol. 2009 Jun;7(6):706-708.e1. doi: 10.1016/j.cgh.2009.02.023. Epub 2009 Mar 10.

"Abstract
Background & Aims
Patients with diarrhea-predominant IBS (IBS-D) anecdotally report symptom improvement after initiating a very low-carbohydrate diet (VLCD). This is the first study to prospectively evaluate a VLCD in IBS-D.
Methods
Participants with moderate to severe IBS-D were provided a 2-week standard diet, then 4 weeks of a VLCD (20 grams of carbohydrates/day). A responder was defined as having adequate relief (AR) of gastrointestinal symptoms for 2 or more weeks during the VLCD. Changes in abdominal pain, stool habits, and quality of life (QOL) were also measured.
Results
Of the 17 participants enrolled, 13 completed the study and all met the responder definition, with 10 (77%) reporting AR for all 4 VLCD weeks. Stool frequency decreased (2.6 ± 0.8/day to 1.4 ± 0.6/day; p<0.001). Stool consistency improved from diarrheal to normal form (Bristol Stool Score: 5.3 ± 0.7 to 3.8 ± 1.2; p<0.001). Pain scores and QOL measures significantly improved. Outcomes were independent of weight loss.
Conclusion
A VLCD provides adequate relief, and improves abdominal pain, stool habits, and quality of life in IBS-D."

[9] Evidence type: review

Suma Magge, MD and Anthony Lembo, MD
Gastroenterol Hepatol (N Y). 2012 Nov; 8(11): 739–745.

"Summary
A low-FODMAP diet appears to be effective for treatment of at least a subset of patients with IBS. FODMAPs likely induce symptoms in IBS patients due to luminal distention and visceral hypersensitivity. Whenever possible, implementation of a low-FODMAP diet should be done with the help of an experienced dietician. More research is needed to determine which patients can benefit from a low-FODMAP diet and to quantify the FODMAP content of various foods, which will help patients follow this diet effectively."

Ornish Diet Worsens Heart Disease Risk: Part I

Dr. Dean Ornish has come under a lot of criticism lately for his misleading statements about diet and heart disease.
See, for example:
Critique of Dean Ornish Op-ed, by Nina Teicholz, and
Why Almost Everything Dean Ornish Says about Nutrition Is Wrong, from Scientific American.
Ornish made his name with a study that claimed to actually reverse heart disease [1].
There are at least three problems with the study.
First, it included several confounders to the dietary regimen.
For example, the intervention groups spent an hour a day on stress management techniques, such as meditation, and three hours a week exercising.
Second, although it was touted as the first study to look at “actual” heart disease results, it made no measurements of cardiac events!
Instead, it was based on measuring stenosis — the degree of narrowing of coronary arteries.
Considering that stenosis is only a predictor of cardiac events,
it seems disingenuous to call it a direct measure of heart disease.
Moreover, it is often not the previously found blockages that turn out to be the ultimate culprits in a cardiac event [2].
The stenosis measurement itself also has a lot of variability.
Because of this, differences in measurements over time need to be quite large to show a true progression or regression, and not just error.
We found three studies attempting to pinpoint the minimum difference in measurements to make such a claim.
They respectively recommended 15%, 9.3%, and 7.8% as a basis for this judgment [3], [4], [5].
So how much reduction of stenosis was there in Ornish’s study?

“The average percentage diameter stenosis decreased from 40.0 (SD 16.9)% to 37.8 (16.5)% in the experimental group yet progressed from 42.7 (15.5)% to 46.11 (18.5)% in the control group (p = 0.001, two-tailed).”

That’s the extent of the success in a year: a -2.2% change for the claim of “regression” vs. a 3.4% change for the claim of “progression”.
Neither change reaches the minimum difference those studies recommend for claiming a true progression or regression.
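To make the comparison concrete, here is a minimal back-of-the-envelope sketch. The arithmetic is ours, using only the changes quoted above and the thresholds recommended by [3], [4], and [5]; the labels are our own.

```python
# Compare the stenosis changes reported in the Lifestyle Heart Trial with the
# minimum differences the three methodology studies consider distinguishable
# from measurement error. All numbers are percentage points of stenosis.
changes = {
    "experimental (claimed regression)": -2.2,  # 40.0% -> 37.8%
    "control (claimed progression)": +3.4,      # 42.7% -> 46.1%
}
thresholds = {"[3]": 15.0, "[4]": 9.3, "[5]": 7.8}

for group, change in changes.items():
    for ref, minimum in thresholds.items():
        verdict = "exceeds" if abs(change) >= minimum else "is within"
        print(f"{group}: {change:+.1f} points {verdict} "
              f"the {minimum}-point error bound {ref}")
```

Every reported change falls inside even the most lenient of the three error bounds.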
Fortunately, there were other measurements taken that are also predictors of cardiac events: blood lipids.
Even the AHA, an association that changes its mind slowly in response to evidence,
considers triglycerides above 100 mg/dL to be higher than optimal [6].
Low HDL is a strong marker of heart disease, with HDL below 40 mg/dL considered by the AHA a “major heart disease risk factor” [7].
The intervention group went from an average triglyceride level of 211 to 258, and their HDL from 39 to 38.
This shows that the intervention actually worsened the participants’ risk factors!
Moreover, although not acknowledged by the AHA, we know that the ratio of triglycerides to HDL is a very strong predictor of heart disease, among the best available [8].
A triglyceride-to-HDL level of less than 2 is considered ideal.
Over 4 is considered risky.
Over 6 is considered very high risk.
The intervention group’s average triglycerides-to-HDL ratio leapt from 5.4 to 6.8!
It went from bad to worse.
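The ratios are easy to recompute from the reported lipid values; here is a quick sketch (our own arithmetic, not Ornish’s):

```python
# Triglyceride-to-HDL ratios for the intervention group, from the reported
# averages (mg/dL). Under 2 is ideal; over 4 is risky; over 6 is very high risk.
tg_before, hdl_before = 211, 39
tg_after, hdl_after = 258, 38

ratio_before = tg_before / hdl_before  # ~5.4, already well past the risky line
ratio_after = tg_after / hdl_after     # ~6.8, past the very-high-risk line

print(f"TG/HDL before: {ratio_before:.1f}, after: {ratio_after:.1f}")
```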
Thus, the third problem with the study is that it actually showed a worsening of heart disease risk by other important measures.
The bottom line is that Ornish’s study never showed what it claimed to show.
After a year of intervention, even with other lifestyle changes incorporated, the subjects on his diet had a higher risk of heart disease than before they started.

References

[1] Ornish, Dean, et al. “Can lifestyle changes reverse coronary heart disease?: The Lifestyle Heart Trial.” The Lancet 336.8708 (1990): 129-133.
[2] Evidence type: experiment

Little WC, Constantinescu M, Applegate RJ, Kutcher MA, Burrows MT, Kahl FR, Santamore WP.
Circulation. 1988 Nov;78(5 Pt 1):1157-66.

Abstract
To help determine if coronary angiography can predict the site of a future coronary occlusion that will produce a myocardial infarction, the coronary angiograms of 42 consecutive patients who had undergone coronary angiography both before and up to a month after suffering an acute myocardial infarction were evaluated. Twenty-nine patients had a newly occluded coronary artery. Twenty-five of these 29 patients had at least one artery with a greater than 50% stenosis on the initial angiogram. However, in 19 of 29 (66%) patients, the artery that subsequently occluded had less than a 50% stenosis on the first angiogram, and in 28 of 29 (97%), the stenosis was less than 70%. In every patient, at least some irregularity of the coronary wall was present on the first angiogram at the site of the subsequent coronary obstruction. In only 10 of the 29 (34%) did the infarction occur due to occlusion of the artery that previously contained the most severe stenosis. Furthermore, no correlation existed between the severity of the initial coronary stenosis and the time from the first catheterization until the infarction (r2 = 0.0005, p = NS). These data suggest that assessment of the angiographic severity of coronary stenosis may be inadequate to accurately predict the time or location of a subsequent coronary occlusion that will produce a myocardial infarction.

[3] Evidence type: experiment

Abstract
BACKGROUND:
Clinical trials with angiographic end points have been used to assess whether interventions influence the evolution of coronary atherosclerosis because sample size requirements are much smaller than for trials with hard clinical end points. Further studies of the variability of the computer-assisted quantitative measurement techniques used in such studies would be useful to establish better standardized criteria for defining significant change.
METHODS AND RESULTS:
In 21 patients who had two arteriograms 3-189 days apart, we assessed the reproducibility of repeat quantitative measurements of 54 target lesions under four conditions: 1) same film, same frame; 2) same film, different frame; 3) same view from films obtained within 1 month; and 4) same view from films 1-6 months apart. Quantitative measurements of 2,544 stenoses were also compared with an experienced radiologist’s interpretation. The standard deviation of repeat measurements of minimum diameter from the same frame was very low (0.088 mm) but increased to 0.141 mm for measurements from different frames. It did not increase further for films within 1 month but increased to 0.197 mm for films 1-6 months apart. Diameter stenosis measurements were somewhat more variable. Measurement variability for minimum diameter was independent of vessel size and stenosis severity. Experienced radiologists did not systematically overestimate or underestimate lesion severity except for mild overestimation (mean 3.3%) for stenoses > or = 70%. However, the variability between visual and quantitative measurements was two to three times higher than the variability of paired quantitative measurements from the same frame.
CONCLUSIONS:
Changes of 0.4 mm or more for minimum diameter and 15% or more for stenosis diameter (e.g., 30-45%), measured quantitatively, are recommended as criteria to define progression and regression. Approaches to data analysis for coronary arteriographic trials are discussed.

[4] Evidence type: experiment

Brown BG, Hillger LA, Lewis C, Zhao XQ, Sacco D, Bisson B, Fisher L.
Circulation. 1993 Mar;87(3 Suppl):II66-73.

Abstract
BACKGROUND:
Imaging trials using arteriography have been shown to be effective alternatives to clinical end point studies of atherosclerotic vascular disease progression and the effect of therapy on it. However, lack of consensus on what end point measures constitute meaningful change presents a problem for quantitative coronary arteriographic (QCA) approaches. Furthermore, standardized approaches to QCA studies have yet to be established. To address these issues, two different arteriographic approaches were compared in a clinical trial, and the degree of concordance between disease change measured by these two approaches and clinical outcomes was assessed.
METHODS AND RESULTS:
In the Familial Atherosclerosis Treatment Study (FATS) of three different lipid-lowering strategies in 120 patients, disease progression/regression was assessed by two arteriographic approaches: QCA and a semiquantitative visual approach (SQ-VIS). Lesions classified with SQ-VIS as “not,” “possibly,” or “definitely” changed were measured by QCA to change by 10% stenosis in 0.3%, 11%, and 81% of cases, respectively. The “best” measured value for distinguishing definite from no change was identified as 9.3% stenosis by logistic regression analysis. The primary outcome analysis of the FATS trial, using a continuous variable estimate of percent stenosis change, gave almost the same favorable result whether by QCA or SQ-VIS.
CONCLUSIONS:
The excellent agreement between these two fundamentally different methods of disease change assessment and the concordance between disease change and clinical outcomes greatly strengthens confidence both in these measurement techniques and in the overall findings of the study. These observations have important implications for the design of clinical trials with arteriographic end points.

[5] Evidence type: experiment

Gibson CM, Sandor T, Stone PH, Pasternak RC, Rosner B, Sacks FM.
Am J Cardiol. 1992 May 15;69(16):1286-90.

Abstract
The purpose of this study was (1) to determine a threshold for categorizing individual coronary lesions as either significantly progressing or regressing, (2) to determine whether multiple lesions within individual patients progress at independent rates, and (3) to calculate sample sizes for atherosclerosis regression trials. Seventeen patients with 46 significant lesions (2.7 lesions/patient) underwent repeat coronary arteriography 3.0 years apart. With use of the standard error of the mean change in diameter from initial to repeat catheterization across 5 pairs of consecutive end-diastolic frames, individual lesions were categorized as either significantly (p less than 0.01) progressing or regressing if there was a 0.27 mm change in minimum diameter or a 7.8 percent point change in percent stenosis. The mean diameter change of a sample of lesions can also be analyzed as a continuous variable using either the lesions or the patient as the primary unit of analysis. A lesion-specific analysis can be accomplished using a multiple regression model that accounts for the intraclass correlation (rho) in the degree of change among multiple lesions within individual patients. The intraclass correlations in percent stenosis (rho = 0.01) and minimum diameter (rho = -0.24) were low, indicating that disease progression in different lesions within individual patients is nearly independent. With use of this model, 50 patients per treatment group would permit the detection of a 5.5% difference between treatment group means in the change in minimum diameter and a 2.7% percentage point (not percent) difference in the change in percent stenosis.(ABSTRACT TRUNCATED AT 250 WORDS)

[6] From The American Heart Association’s “Scientific Statement”
“New clinical recommendations include reducing the optimal triglyceride level from <150 mg/dL to <100 mg/dL, and performing non-fasting triglyceride testing as an initial screen.”
[7] From Levels of Cholesterol

Less than 40 mg/dL for men; less than 50 mg/dL for women: Major heart disease risk factor
60 mg/dL or higher: Gives some protection against heart disease

[8] Evidence type: observational

Gaziano JM, Hennekens CH, O’Donnell CJ, Breslow JL, Buring JE.
Circulation. 1997 Oct 21;96(8):2520-5.

Abstract
BACKGROUND:
Recent data suggest that triglyceride-rich lipoproteins may play a role in atherogenesis. However, whether triglycerides, as a marker for these lipoproteins, represent an independent risk factor for coronary heart disease remains unclear, despite extensive research. Several methodological issues have limited the interpretability of the existing data.
METHODS AND RESULTS:
We examined the interrelationships of fasting triglycerides, other lipid parameters, and nonlipid risk factors with risk of myocardial infarction among 340 cases and an equal number of age-, sex-, and community-matched control subjects. Cases were men or women of <76 years of age with no prior history of coronary disease who were discharged from one of six Boston area hospitals with the diagnosis of a confirmed myocardial infarction. In crude analyses, we observed a significant association of elevated fasting triglycerides with risk of myocardial infarction (relative risk [RR] in the highest compared with the lowest quartile=6.8; 95% confidence interval [CI]=3.8 to 12.1; P for trend <.001). Results were not materially altered after control for nonlipid coronary risk factors. As expected, the relationship was attenuated after adjustment for HDL but remained statistically significant (RR in the highest quartile=2.7; 95% confidence interval [CI]=1.4 to 5.5; P for trend=.016). Furthermore, the ratio of triglycerides to HDL was a strong predictor of myocardial infarction (RR in the highest compared with the lowest quartile=16.0; 95% CI=7.7 to 33.1; P for trend <.001).
CONCLUSIONS:
Our data indicate that fasting triglycerides, as a marker for triglyceride-rich lipoproteins, may provide valuable information about the atherogenic potential of the lipoprotein profile, particularly when considered in context of HDL levels.

What about the sugars in breast milk?

Something that nearly always comes up when we talk about babies naturally being in ketosis is the fact that breast milk contains sugars — as much as 40% [1].
Some people have even argued with us that therefore babies are not in ketosis!

That objection is, of course, reasoning backwards — objecting to a fact because it doesn’t fit a theory.
That healthy, breastfed babies live in a state of ketosis and use the ketogenic metabolism for energy and growth is not a hypothesis;
it is an empirical fact.
See our article on ketogenic babies for details.
However, the fact that babies are in ketosis even while consuming a diet relatively high in carbohydrates does pose a mystery that deserves investigation.
In this article, we’re going to suggest one possible explanation for the mystery,
but remember that this possible explanation is just a hypothesis,
until someone does an experiment to test it.

In brief

We can’t conclude, just because breast milk has a relatively high proportion of carbohydrates, that babies are burning a lot of carbohydrates for fuel.

  • Breast milk is full of components that are good for building brains. Infancy is a period of intense brain growth.
  • The sugars in breast milk are mostly from lactose, with small amounts in the form of oligosaccharides.
    Both lactose and oligosaccharides are replete with components that are crucial building blocks of brains.
  • In addition to providing materials for growing brains,
    other non-fuel functions of at least oligosaccharides include serving as prebiotics and fighting infection.
  • Insofar as some parts of the milk are being used for other purposes, they can’t also be used as fuel.

Therefore, a plausible explanation for how babies are in ketosis while consuming a relatively high-carbohydrate food is that those carbohydrates are not being used as fuel, but rather as building blocks for the brain and, to a lesser extent, to feed gut bacteria and fight infections.

Lactose

Most of the carbohydrate in breast milk is lactose,
which is broken down by digestion into glucose and galactose.
Galactose is an important component of some glycoproteins and glycolipids,
including cerebrosides — glycolipids in the brain and nervous system.
Cerebrosides made of galactose are a major component of brain tissue [2].
They are also such a large component of myelin that cerebroside synthesis has been used as a measure of myelination or remyelination [2].
It is therefore plausible that much of the galactose in breast milk is used for brain tissue and myelin synthesis [3].
In fact, glucose is itself also used for making glycolipids for brain tissue [4], [5], although ketone bodies seem to be preferred [6], [7].

Oligosaccharides

After lactose and fat, oligosaccharides are the largest component of breast milk [8].
Oligosaccharides are unique to human breast milk — other animals produce almost no oligosaccharides in their milk [9].
Oligosaccharides are thought not to function as fuel.
Some have been shown to have a prebiotic role [10], [11].
Many of the oligosaccharides pass completely through the infant’s digestive tract, and probably have an immune system function [12], [13].
Oligosaccharides also contain sialic acid [14], an important component in the brain used for cell-to-cell interactions, neuronal outgrowth, modifying synaptic connectivity, and memory formation [15].

Bottom line

The main point to take from all this is that many of the components of breast milk that one might presume to be used as “calories” are actually being used for other things, especially to make brains with.
That includes glucose, galactose, proteins, fats, and even ketone bodies.
This could explain the fact that infants are in mild ketosis while breastfed, even though breast milk contains a higher proportion of carbohydrate than would support a ketogenic metabolism in an adult.

References

[1] Calculating the macronutrients in breast milk is made very complex not only by the variation among individuals, but also by diurnal variations and variations over longer periods of time.
It is a huge simplification to report a single value for the amount of some component of breast milk:

Whitehead RG.
Am J Clin Nutr. 1985 Feb;41(2 Suppl):447-58.

“It should be recognized, however, that we have all been guilty of adopting an oversimplified approach insofar as relating energy needs to milk volumes is concerned. The energy composition of milk is not the constant factor we have all tacitly assumed. Fat is the major energy-donating component and its concentrations vary considerably. At the beginning of each feed, from either breast, the fat content of the milk the baby receives is low, the exact level being determined by the extent to which that breast was emptied during the previous feed. As the baby feeds, fat content then rises by an amount that can be as much as 3-4-fold but the extent is very variable. There is also evidence that average fat levels vary at different times during the day in a cyclical manner. Even after one has taken account of these variables, it is still apparent that individual women have characteristically different fat concentrations in their breast milk. These complications have been extensively studied by Prentice in rural Gambian women (8, 9), and for the purpose of calculating breast milk requirements, they are almost impossible to untangle.”
Nonetheless, the standard reported amount of carbohydrate is 38–41%:

Olivia Ballard, JD, PhD (candidate) and Ardythe L. Morrow, PhD, MSc
Pediatr Clin North Am. Feb 2013; 60(1): 49–74.
https://lh5.googleusercontent.com/-4RhjKvSSG4k/VLV0rP3Ib8I/AAAAAAAAJmI/Sslu4-6mx8k/w873-h565-no/breast-milk-comp.png
[2] From Wikipedia:
“Galactosylceramide is the principal glycosphingolipid in brain tissue. Galactosylceramides are present in all nervous tissues, and can compose up to 2% dry weight of grey matter and 12% of white matter. They are major constituents of oligodendrocytes.”
“Monogalactosylceramide is the largest single component of the myelin sheath of nerves. Cerebroside synthesis can therefore give a measurement of myelin formation or remyelination.”
[3] I first heard this idea from this blog post: What can we learn from breast milk? Part 1: Macronutrients
“…the carbohydrate source is lactose, made of glucose and galactose. Now galactose is very special, it’s not used as an energy fuel like glucose, it’s used for myelin synthesis (that is making nerve insulation), this is why human breast milk is so high in lactose, for the galactose! So that ~15% becomes ~7% of calories coming from carbs for an adult (~38g @ 2000 calories).”
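For what it’s worth, the quoted blogger’s arithmetic checks out under the standard assumption of 4 kcal per gram of carbohydrate; a quick sketch of our own:

```python
# ~38 g of carbohydrate on a 2000 kcal adult intake, using the standard
# Atwater factor of 4 kcal per gram of carbohydrate.
carb_grams = 38
kcal_per_gram = 4
total_kcal = 2000

percent = carb_grams * kcal_per_gram / total_kcal * 100
print(f"{percent:.1f}% of calories from carbohydrate")  # ~7.6%
```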
[4] Evidence type: review

Edmond J.
Can J Physiol Pharmacol. 1992;70 Suppl:S118-29.

“Many studies in the decade, 1970-1980, in human infants and in the rat pup model show that both glucose and the ketone bodies, acetoacetate and D-(-)-3-hydroxybutyrate, are taken up by brain and used for energy production and as carbon sources for lipogenesis. Products of fat metabolism, free fatty acids, ketone bodies, and glycerol dominate metabolic pools in early development as a consequence of the milk diet. This recognition of a distinctive metabolic environment from the well-fed adult was taken into consideration within the last decade when methods became available to obtain and study each of the major cell populations, neurons, astrocytes, and oligodendrocytes in near homogeneous state in primary cultures. Studies on these cells made it possible to examine the distinctive metabolic properties and capabilities of each cell population to oxidize the metabolites that are available in development. Studies by many investigators on these cell populations show that all three can use glucose and the ketone bodies in respiration and for lipogenesis.”

[5] Evidence type: non-human animal experiment

“The incorporation of 14C-label from subcutaneously injected [3-14C]acetoacetate and [U-14C]glucose into phospholipids and sphingolipids in different regions of developing rat brain was determined. In all regions, phosphatidylcholine was the lipid synthesized most readily from either substrate. The percentages of radioactivity in other phospholipids and most sphingolipids remained relatively constant throughout postnatal development. An exceptional increase in the percentage of radioactivity incorporated into cerebroside, coinciding with a decrease of incorporation into phosphatidylcholine, was first noted on day 12 and continued until a maximal level was reached between days 18 and 20 of postnatal age. These developmental changes in preferential synthesis of lipids were associated with increased demands for phospholipids and cerebroside during the early and late postnatal stages, respectively. There was no difference in accumulation of radioactivity from acetoacetate, expressed as dpm of [14C]acetoacetate recovered in phospholipids plus sphingolipids per g of tissue, among all brain regions during the first 5 days of life. During active myelination (12 to 20 days of age); however, the amount of 14C-label was highest in brain stem, ranging from 1.9- to 2.3-fold greater than values for cerebrum and thalamus. The region with the next highest accumulation was cerebellum, followed by midbrain. During the same period, brain stem was likewise the most active site of accumulation of radioactivity from 14C-labeled glucose. Higher amounts of [14C]acetoacetate label accumulated in lipids of brain stem and cerebellum, relative to midbrain, thalamus, and cerebrum, coincide with evidence that active myelination begins in the hindbrain and proceeds rostrally toward the forebrain. Ketone bodies could therefore serve as a potential source of phospholipids and sphingolipids for brain growth and maturation.”

[6] Evidence type: non-human animal experiment

Yeh YY, Sheehan PM.
Fed Proc. 1985 Apr;44(7):2352-8.

(Emphasis ours)
“Persistent mild hyperketonemia is a common finding in neonatal rats and human newborns, but the physiological significance of elevated plasma ketone concentrations remains poorly understood. Recent advances in ketone metabolism clearly indicate that these compounds serve as an indispensable source of energy for extrahepatic tissues, especially the brain and lung of developing rats. Another important function of ketone bodies is to provide acetoacetyl-CoA and acetyl-CoA for synthesis of cholesterol, fatty acids, and complex lipids. During the early postnatal period, acetoacetate (AcAc) and beta-hydroxybutyrate are preferred over glucose as substrates for synthesis of phospholipids and sphingolipids in accord with requirements for brain growth and myelination. Thus, during the first 2 wk of postnatal development, when the accumulation of cholesterol and phospholipids accelerates, the proportion of ketone bodies incorporated into these lipids increases. On the other hand, an increased proportion of ketone bodies is utilized for cerebroside synthesis during the period of active myelination. In the lung, AcAc serves better than glucose as a precursor for the synthesis of lung phospholipids. The synthesized lipids, particularly dipalmityl phosphatidylcholine, are incorporated into surfactant, and thus have a potential role in supplying adequate surfactant lipids to maintain lung function during the early days of life. Our studies further demonstrate that ketone bodies and glucose could play complementary roles in the synthesis of lung lipids by providing fatty acid and glycerol moieties of phospholipids, respectively. The preferential selection of AcAc for lipid synthesis in brain, as well as lung, stems in part from the active cytoplasmic pathway for generation of acetyl-CoA and acetoacetyl-CoA from the ketone via the actions of cytoplasmic acetoacetyl-CoA synthetase and thiolase.”

[7] Evidence type: non-human animal experiment

Edmond J, Auestad N, Robbins RA, Bergstrom JD.
Fed Proc. 1985 Apr;44(7):2359-64.

(Emphasis ours)
“In the course of mammalian development milk has evolved with unique characteristics as has the capacity of the neonatal rat to process this nutrient source. The primary carbon source in milk is fat, which provides two readily utilized metabolites, acetoacetate and D(-)-3-hydroxybutyrate (ketone bodies), as well as free fatty acids and glycerol. Carbohydrate provides less than 12% of the caloric content of rat milk and glucose has to be produced by the suckling rat to maintain glucose homeostasis. One would predict that glucose would be used sparingly and in pathways that cannot be satisfied by other readily available metabolites. Studies of the uptake of metabolites and the development of key enzymes for the utilization of glucose and ketone bodies by developing brain support the concept that ketone bodies are preferred substrates for the supply of carbon to respiration and lipogenesis. Astrocytes, oligodendrocytes, and neurons from developing brain all have an excellent capacity to use ketone bodies for respiration. By contrast, glucose is utilized preferentially in the hexose monophosphate shunt by all three cell populations. We are examining the requirement for ketone bodies by developing brain with the application of a system to rear rat pups artificially on a milk substitute that promotes a hypoketonemia.”

[8] Evidence type: review

Gudiel-Urbano M, Goñi I.
Arch Latinoam Nutr. 2001 Dec;51(4):332-9.

(Emphasis ours)
“Breast-feeding is the optimal mode of feeding for the normal full-term infant. Human milk composition knowledge has been basis for recommended dietary allowances for infants. Few studies about human milk carbohydrates have been done until the last decade. However, carbohydrates provide approximately 40-50% of the total energy content of breast milk. Quantitatively oligosaccharides are the third largest solute in human milk after lactose and fat. Each individual oligosaccharide is based on a variable combination of glucose, galactose, sialic acid, fucose and N-acetylglucosamine with many and varied linkages between them, thus accounting for the enormous number of different oligosaccharides in human milk. The oligosaccharides content in human milk varies with the duration of lactation, diurnally and with the genetic makeup of the mother. At present, a great interest in the roles of human milk oligosaccharides is raising. They act as a the soluble fibre in breast milk and their structure is available to act as competitive ligands protecting the breast-fed infant from pathogens and act as well as prebiotic. They may also act as source of sialic acid and galactose, essential for brain development. This is why today there is an increasing health and industrial interest in human milk oligosaccharides content, with the main purpose of incorporating them as new ingredients in infant nutrition.”

[9] Evidence type: review

McVeagh P, Miller JB.
J Paediatr Child Health. 1997 Aug;33(4):281-6.

“Over 100 years ago it was first deduced that a major component of human milk must be an unidentified carbohydrate that was not found in cows milk. At first this was thought to be a form of lactose and was called gynolactose. We now know that this was not a single carbohydrate but a complex mixture of approximately 130 different oligosaccharides. Although small amounts of a few oligosaccharides have been found in the milk of other mammals, this rich diversity of sugars is unique to human milk. The oligosaccharide content of human milk varies with the infant’s gestation, the duration of lactation, diurnally and with the genetic makeup of the mother. Milk oligosaccharides have a number of functions that may protect the health of the breast fed infant. As they are not digested in the small intestine, they form the ‘soluble’ fibre of breast milk and their intact structure is available to act as competitive ligands protecting the breast-fed infant from pathogens. There is a growing list of pathogens for which a specific oligosaccharide ligand has been described in human milk. They are likely to form the model for future therapeutic and prophylactic anti-microbials. They provide substrates for bacteria in the infant colon and thereby contribute to the difference in faecal pH and faecal flora between breast and formula-fed infants. They may also be important as a source of sialic acid, essential for brain development.”

[10] Evidence type: review

Coppa GV, Bruni S, Morelli L, Soldi S, Gabrielli O.
J Clin Gastroenterol. 2004 Jul;38(6 Suppl):S80-3.

“The development of intestinal microflora in newborns is strictly related to the kind of feeding. Breast-fed infants, unlike the bottle-fed ones, have an intestinal ecosystem characterized by a strong prevalence of bifidobacteria and lactobacilli. Data available so far in the literature show that, among the numerous substances present in human milk, oligosaccharides have a clear prebiotic effect. They are quantitatively one of the main components of human milk and are only partially digested in the small intestine, so they reach the colon, where they stimulate selectively the development of bifidogenic flora. Such results have been recently proved both by characterization of oligosaccharides in breast-fed infant feces and by the study of intestinal microflora using new techniques of molecular analysis, confirming that human milk oligosaccharides represent the first prebiotics in humans.”

[11] Evidence type: review

Coppa GV, Zampini L, Galeazzi T, Gabrielli O.
Dig Liver Dis. 2006 Dec;38 Suppl 2:S291-4.

“The microbic colonization of human intestine begins at birth, when from a sterile state the newborn is exposed to an external environment rich in various bacterial species. The kind of delivery has an important influence on the composition of the intestinal flora in the first days of life. Thereafter, the microflora is mainly influenced by the kind of feeding: breast-fed infants show a predominance of bifidobacteria and lactobacilli, whereas bottle-fed infants develop a mixed flora with a lower number of bifidobacteria. The “bifidogenic effect” of human milk is not related to a single growth-promoting substance, but rather to a complex of interacting factors. In particular the prebiotic effect has been ascribed to the low concentration of proteins and phosphates, the presence of lactoferrin, lactose, nucleotides and oligosaccharides. The real prebiotic role of each of these substances is not yet clearly defined, with the exception of oligosaccharides which undoubtedly promote a bifidobacteria-dominant microflora.”

[12] Evidence type: review

McVeagh P, Miller JB.
J Paediatr Child Health. 1997 Aug;33(4):281-6.

(Emphasis ours)
“Over 100 years ago it was first deduced that a major component of human milk must be an unidentified carbohydrate that was not found in cows milk. At first this was thought to be a form of lactose and was called gynolactose. We now know that this was not a single carbohydrate but a complex mixture of approximately 130 different oligosaccharides. Although small amounts of a few oligosaccharides have been found in the milk of other mammals, this rich diversity of sugars is unique to human milk. The oligosaccharide content of human milk varies with the infant’s gestation, the duration of lactation, diurnally and with the genetic makeup of the mother. Milk oligosaccharides have a number of functions that may protect the health of the breast fed infant. As they are not digested in the small intestine, they form the ‘soluble’ fibre of breast milk and their intact structure is available to act as competitive ligands protecting the breast-fed infant from pathogens. There is a growing list of pathogens for which a specific oligosaccharide ligand has been described in human milk. They are likely to form the model for future therapeutic and prophylactic anti-microbials. They provide substrates for bacteria in the infant colon and thereby contribute to the difference in faecal pH and faecal flora between breast and formula-fed infants. They may also be important as a source of sialic acid, essential for brain development.”

[13] Evidence type: experiment

Survival of human milk oligosaccharides in the intestine of infants.
Chaturvedi P, Warren CD, Buescher CR, Pickering LK, Newburg DS.
Adv Exp Med Biol. 2001;501:315-23.

(Emphasis ours)
“Several human milk oligosaccharides inhibit human pathogens in vitro and in animal models. In an infant, the ability of these oligosaccharides to offer protection from enteric pathogens would require that they withstand structural modification as they pass through the alimentary canal or are absorbed and excreted in urine. We investigated the fate of human milk oligosaccharides during transit through the alimentary canal by determining the degree to which breast-fed infants’ urine and fecal oligosaccharides resembled those of their mothers’ milk. Oligosaccharide profiles of milk from 16 breast-feeding mothers were compared with profiles of stool and urine from their infants. Results were compared with endogenous oligosaccharide profiles obtained from the urine and feces of age-, parity-, and gender-matched formula-fed infants. […] Among breast-fed infants, concentrations of oligosaccharides were higher in feces than in mothers’ milk, and much higher in feces than in urine. Urinary and fecal oligosaccharides from breast-fed infants resembled those in their mothers’ milk. Those from formula-fed infants did not resemble human milk oligosaccharides, were found at much lower concentrations, and probably resulted from remodeling of intestinal glycoconjugates or from intestinal bacteria. Most of the human milk oligosaccharides survived transit through the gut, and some were absorbed and then excreted into the urine intact, implying that inhibition of intestinal and urinary pathogens by human milk oligosaccharides is quite likely in breast-fed infants.”

[14] Evidence type: experiment

Nakano T, Sugawara M, Kawakami H.
Acta Paediatr Taiwan. 2001 Jan-Feb;42(1):11-7.

“Breast milk is the best nutrient source for infants. It contains all elements needed for a normal growth and development of infants. Human milk contains a large amount of sialic acid compared with bovine milk. Sialic acid contained in oligosaccharides, glycolipids and glycoproteins in milk is considered to play important roles in physiological functions in infancy. Thus, we have investigated the sialic acid composition and the functions of sialylated compounds in human milk. Sialic acids comprise a family of neuraminic acid derivatives present in secretions, fluids and tissues of mammals. In milk, sialic acid is present in different sialoglycoconjugate compounds such as oligosaccharides, glycolipids and glycoproteins, not in a free form. Human milk contains 0.3-1.5 mg/ml of sialic acid. Sialic acid bound to oligosaccharides accounts for about 75% of the total sialic acid contained in human milk. Most of the sialic acid contained in human milk is found in the form of sialyllactose, an oligosaccharide formed from lactose and sialic acid. In milk, gangliosides, sialic acid-containing glycolipid, occur mainly as monosialoganglioside 3 (GM3) and disialoganglioside 3 (GD3). The concentration of GM3 in human milk increases, while that of GD3 concentration decreases during lactation. Because the brain and central nervous system contain considerable level of sialic acid in infancy, it is considered to play important roles on the expression and development of their functions. Moreover, we found that some sialylated compounds had inhibited the adhesion of toxins, bacteria and viruses to the receptors on the surface of epithelial cells. Additionally, we found that some sialylated compounds had growth-promoting effects on bifidobacteria and lactobacilli, predominantly present in the intestinal flora of infants fed with human milk. The results suggested that sialylated compounds in human milk possibly behaved as a physiological component in the intestinal tract of infants to protect them against enteric infections.”

[15] Evidence type: review

Wang B.
Annu Rev Nutr. 2009;29:177-222. doi: 10.1146/annurev.nutr.28.061807.155515.

“The rapid growth of infant brains places an exceptionally high demand on the supply of nutrients from the diet, particularly for preterm infants. Sialic acid (Sia) is an essential component of brain gangliosides and the polysialic acid (polySia) chains that modify neural cell adhesion molecules (NCAM). Sia levels are high in human breast milk, predominately as N-acetylneuraminic acid (Neu5Ac). In contrast, infant formulas contain a low level of Sia consisting of both Neu5Ac and N-glycolylneuraminic acid (Neu5Gc). Neu5Gc is implicated in some human inflammatory diseases. Brain gangliosides and polysialylated NCAM play crucial roles in cell-to-cell interactions, neuronal outgrowth, modifying synaptic connectivity, and memory formation. In piglets, a diet rich in Sia increases the level of brain Sia and the expression of two learning-related genes and enhances learning and memory. The purpose of this review is to summarize the evidence showing the importance of dietary Sia as an essential nutrient for brain development and cognition.”

Meat is best for growing brains

There are multiple lines of evidence that an animal-based diet best supports human brain development in infants and young children.

Human fetuses and infants rely on ketones for brain building.

In a previous post, we wrote about the known (but little-spoken-of) fact that human infants are in mild ketosis all the time, especially when breastfed.
In other words, ketosis is a natural, healthy state for infants.
Infancy is a critical time for brain growth, so we expect that ketosis is advantageous for a growing brain; otherwise, there would have been a selective advantage to reduced ketosis in infancy.
This species-critical, rapid brain growth continues well past weaning.
For that reason, we suggest in our article that weaning onto a ketogenic diet would probably be preferable to weaning away from ketosis.
In response to that post, a reader sent us a paper called Survival of the fattest: fat babies were the key to evolution of the large human brain. [1]
The authors discuss the apparently unique human trait of having extremely fat babies, and explain it in terms of the unique need for growth of extremely large brains.
A key point they make is that a baby’s ample fat provides more than simply a large energy supply (much more than could be stored as glycogen or protein; by their calculations, more than 20 times more): the ketone bodies made from that fat are themselves important for human brain evolution.
They repeat the usual unwarranted assumption that adult brains use mainly glucose as fuel by default, and that ketone bodies are merely an alternative brain fuel.
Nonetheless, when talking about fetuses, they are willing to say that the use of ketones is not merely an “alternative”:

In human fetuses at mid-gestation, ketones are not just an alternative fuel but appear to be an essential fuel because they supply as much as 30% of the energy requirement of the brain at that age (Adam et al., 1975).
Second, ketones are a key source of carbon for the brain to synthesize the cholesterol and fatty acids that it needs in the membranes of the billions of developing nerve connections.
[…]
Ketones are the preferred carbon source for brain lipid synthesis and they come from fatty acids recently consumed or stored in body fat. This means that, in infants, brain cholesterol and fatty acid synthesis are indirectly tied to mobilization and catabolism of fatty acids stored in body fat.

In other words, the claim is that ketones are the best source of certain brain-building materials, and that fetuses use them for exactly that purpose.
Moreover, the thesis is that the extra body fat on human babies is there specifically to support extra brain growth after birth, through continued use of ketones.

Weaning onto meat increases brain growth.

[ Please note that by convention weaning refers to the gradual process of transitioning from exclusive breastfeeding (starting with the first foods introduced, while breastfeeding is still ongoing), to the end of breastfeeding, not just the end itself. ]
We aren’t the only ones who have thought weaning onto meat would be a good idea.
A couple of studies have compared weaning onto meat rather than cereal.
One showed a larger increase in head circumference [2], which is a good index of brain growth in infants [3] and young children [4].
Moreover, higher increases in head circumference in infants are correlated with higher intelligence, independently of head circumference at birth [5].
In other words, the amount of brain growth after birth is a better predictor of intelligence than the amount of brain growth in gestation.
That study also found the meat-fed infants to have better zinc status, and good iron status, despite iron not being supplemented as it was in the cereal arm [2].
Zinc and iron are abundant in the brain, and zinc deficiency is implicated in learning disorders and other brain development problems [6].
Iron deficiency is a common risk in infants in our culture, because of our dietary practices, which is why infant cereal is fortified with it [7].
Another study showed better growth in general in babies weaned onto primarily meat [8].

Weaning onto meat is easy.
Here’s how I did it.
It is believed likely that early humans fed their babies pre-chewed meat [9].
I did that, too, although that wasn’t my first weaning step.
Influenced by baby-led weaning, I waited until he was expressing clear interest in my food, and then simply shared it with him.
At the time this meant:

  • Broth on a spoon, increasingly with small fragments of meat in it.
  • Bones from steaks and chops, increasingly with meat and fat left on them.
  • Homemade plain, unseasoned jerky, which he teethed on, or sucked until it disintegrated.
  • Beef and chicken liver, which has a soft, silky texture, and is extremely nutrient-dense.

–Amber

The brain is an energy-intensive organ that required an animal-based diet to evolve.

In 1995, anthropologists Leslie C. Aiello and Peter Wheeler posed the following problem [10]:

  • Brains require an enormous amount of energy.
  • Humans have much larger brains than other primates.
  • However, human basal metabolic rates are no higher than would be predicted by body mass.

Where do we get the extra energy required to fuel our brains, and how could this have evolved?
Aiello and Wheeler explain this by noting that at the same time as our brains were expanding, our intestines (estimated as comparably energy-intensive) were shrinking by almost exactly the same amount, thereby freeing up the extra metabolic energy needed for the brain.
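To see the bookkeeping behind this argument, here is a minimal sketch. The organ masses and per-kilogram running costs below are illustrative assumptions, not Aiello and Wheeler’s published figures; the point is only that an increase in brain cost can be offset by an equal decrease in gut cost, leaving basal metabolic rate unchanged:

    # Expensive-tissue trade-off, schematically: the change in basal metabolic
    # rate (BMR) is the sum of (change in organ mass) x (organ running cost).
    # The numbers below are illustrative assumptions, not measurements.

    COST_W_PER_KG = {"brain": 11.0, "gut": 12.0}   # hypothetical W/kg costs

    def delta_bmr(mass_changes_kg):
        """Net change in resting power (watts) from changes in organ mass."""
        return sum(COST_W_PER_KG[organ] * dm
                   for organ, dm in mass_changes_kg.items())

    # Hypothetical human-vs-expected-primate differences:
    # the brain grew by ~0.85 kg while the gut shrank by ~0.78 kg.
    change = delta_bmr({"brain": +0.85, "gut": -0.78})
    print(f"net change in BMR: {change:+.1f} W")   # close to zero: costs cancel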
Both adaptations, a large brain and small guts, independently required our ancestors to adopt a “high-quality” diet, for different reasons.
Let’s mince no words; “high-quality” means meat [11].
Meat is more nutrient dense than plants, both in terms of protein and vitamins.
Plants are simply too fibrous, too low in protein and calories, and too seasonal to have been relied on for such an evolutionary change [11], [12].
It is widely accepted that meat became an important part of our diets during this change.
This is the mainstream view in anthropology [13].
Although the need for protein and brain-building nutrients is often cited as a reason for needing meat in the evolutionary diet,
energy requirements are also important to consider.
It would have been difficult to get caloric needs met from plants (especially before cooking) [13],
because they were so fibrous.
Herbivores with special guts (such as ruminants like cows with their “four stomachs”) and primates with much larger intestines than ours actually use bacteria in their guts to turn significant amounts of fiber into fat; see e.g. [14].
This strategy is not available to such a small gut as ours [11], [15], which is why we had to find food that was energy-dense as is.
Fortunately, insofar as we were already using animal sources to get protein and nutrients, we also had access to an abundance of fat.
The animals we hunted were unlikely to have been as lean as modern game.
Evidence supports the hypothesis that human hunting was the most likely cause of the extinction of many megafauna (large animals that were much fatter than the leaner game we have left today) [16].
Humans, like carnivores, prefer to hunt larger animals whenever they are available [17].
It has been proposed that the disappearance of the fatter megafauna exerted a strong evolutionary pressure on humans, who were already fat-dependent, to become more skilled hunters of the small game we have today, to rely more on the fat from eating brains and marrow, and to learn to find the fattest animals among the herds [18].
Animal fat and animal protein provided the energy, protein, and nutrients necessary for large brains, especially given the constraint of small guts.

Because humans wean early, and human brain growth is extended past weaning, the post-weaning diet must support fetal-like brain growth.

Humans wean much earlier than other primates, and yet their brains require prolonged growth.
Our intelligence has been our primary selective advantage.
Therefore it is critical from an evolutionary standpoint that the diet infants were weaned onto was supportive of this brain growth.
In a (fascinating and well-written) paper on weaning and evolution, Kennedy puts it this way:

“[A]lthough this prolonged period of development i.e., ‘‘childhood’’ renders the child vulnerable to a variety of risks, it is vital to the optimization of human intelligence; by improving the child’s nutritional status (and, obviously, its survival), the capability of the adult brain is equally improved. Therefore, a child’s ability to optimize its intellectual potential would be enhanced by the consumption of foods with a higher protein and calorie content than its mother’s milk; what better foods to nourish that weanling child than meat, organ tissues (particularly brain and liver), and bone marrow, an explanation first proposed by Bogin (1997).”

“Increase in the size of the human brain is based on the retention of fetal rates of brain growth (Martin, 1983), a unique and energetically expensive pattern of growth characteristic of altricial [ born under-developed ] mammals (Portmann, 1941; Martin, 1984). This research now adds a second altricial trait—early weaning—to human development. The metabolically expensive brain produced by such growth rates cannot be sustained long on maternal lactation alone, necessitating an early shift to adult foods that are higher in protein and calories than human milk.”

The only food higher in protein and calories than breast milk is meat.

A high-fat animal-based diet best supports brain growth.

Taking these facts together:

  • Even modern fetuses and breastfed infants are in ketosis, which uniquely supports brain growth.
  • Infants who are weaned onto meat get essential nutrients to grow brains with: nutrients that are deficient in our plant-centric diets today.
    Moreover, experiments have found that their brains actually grow more than those of babies fed cereal.
  • Human brains continue to grow at a fast rate even past weaning.
  • It is likely that in order to evolve such large, capable brains, human babies were weaned onto primarily meat.

A meat-based, inherently ketogenic diet is not only likely to be our evolutionary heritage,
it is probably the best way to support the critical brain growth of the human child.

Acknowledgements

We would like to thank Matthew Dalby, a researcher at the University of Aberdeen, for helpful discussions about short-chain fatty acid production in the large intestines.

References

[1] Hypothesis paper

Cunnane SC, Crawford MA.
Comp Biochem Physiol A Mol Integr Physiol. 2003 Sep;136(1):17-26.
[2] Evidence type: experiment

Krebs NF, Westcott JE, Butler N, Robinson C, Bell M, Hambidge KM.
J Pediatr Gastroenterol Nutr. 2006 Feb;42(2):207-14.

(Emphasis ours)
“OBJECTIVE:
“This study was undertaken to assess the feasibility and effects of consuming either meat or iron-fortified infant cereal as the first complementary food.
“METHODS:
“Eighty-eight exclusively breastfed infants were enrolled at 4 months of age and randomized to receive either pureed beef or iron-fortified infant cereal as the first complementary food, starting after 5 months and continuing until 7 months. Dietary, anthropometric, and developmental data were obtained longitudinally until 12 months, and biomarkers of zinc and iron status were measured at 9 months.
“RESULTS:
“Mean (+/-SE) daily zinc intake from complementary foods at 7 months for infants in the meat group was 1.9 +/- 0.2 mg, whereas that of the cereal group was 0.6 +/- 0.1 mg, which is approximately 25% of the estimated average requirement. Tolerance and acceptance were comparable for the two intervention foods. Increase in head circumference from 7 to 12 months was greater for the meat group, and zinc and protein intakes were predictors of head growth. Biochemical status did not differ by feeding group, but approximately 20% of the infants had low (<60 microg/dL) plasma zinc concentrations, and 30% to 40% had low plasma ferritin concentrations (<12 microg/L). Motor and mental subscales did not differ between groups, but there was a trend for a higher behavior index at 12 months in the meat group.
“CONCLUSIONS:
“Introduction of meat as an early complementary food for exclusively breastfed infants is feasible and was associated with improved zinc intake and potential benefits. The high percentage of infants with biochemical evidence of marginal zinc and iron status suggests that additional investigations of optimal complementary feeding practices for breastfed infants in the United States are warranted.”

[3] Evidence type: authority

(Emphasis ours)
“Today the close correlation between head circumference growth and brain development in the last weeks of gestation and in the first two years of life is no longer disputed. A recently developed formula even allows for calculations of brain weight based upon head circumference data. Between the ages of 32 postmenstrual weeks and six months after expected date of delivery there is a period of very rapid brain growth in which the weight of the brain quadruples. During this growth spurt there exists an increased vulnerability by unfavorable environmental conditions, such as malnutrition and psychosocial deprivation. The erroneous belief still being prevalent that the brain of the fetus and young infant is spared by malnutrition, can be looked upon as disproved by new research results. Severe malnutrition during the brain growth spurt is thought to be a very important non-genetic factor influencing the development of the central nervous system (CNS) and therewith intellectual performance. In the past a permanent growth retardation of head circumference and a reduced intellectual capacity usually was observed in small-for-gestational age infants (SGA). Nowadays, however, there can be found also proofs of successful catch-up growth of head circumference and normal intellectual development after early and high-energy postnatal feeding of SGA infants. The development of SGA infants of even very low birth weight can be supported in such a way that it takes a normal course by providing good environmental conditions, such as appropriate nutrition – especially during the early growth period – and a stimulating environment with abundant attention by the mother.”

[4] Evidence type: experiment

Bartholomeusz HH, Courchesne E, Karns CM.
Neuropediatrics. 2002 Oct;33(5):239-41.

(Emphasis ours)
“OBJECTIVE:
“To quantify the relationship between brain volume and head circumference from early childhood to adulthood, and quantify how this relationship changes with age.
“METHODS:
“Whole-brain volume and head circumference measures were obtained from MR images of 76 healthy normal males aged 1.7 to 42 years.
“RESULTS:
“Across early childhood, brain volume and head circumference both increase, but from adolescence onward brain volume decreases while head circumference does not. Because of such changing relationships between brain volume and head circumference with age, a given head circumference was associated with a wide range of brain volumes. However, when grouped appropriately by age, head circumference was shown to accurately predict brain volume. Head circumference was an excellent prediction of brain volume in 1.7 to 6 years old children (r = 0.93), but only an adequate predictor in 7 to 42 year olds.
“CONCLUSIONS:
“To use head circumference as an accurate indication of abnormal brain volume in the clinic or research setting, the patient’s age must be taken into account. With knowledge of age-dependent head circumference-to-brain volume relationship, head circumference (particularly in young children) can be an accurate, rapid, and inexpensive indication of normalcy of brain size and growth in a clinical setting.”

[5] Evidence type: experiment

Gale CR, O’Callaghan FJ, Godfrey KM, Law CM, Martyn CN.
Brain. 2004 Feb;127(Pt 2):321-9. Epub 2003 Nov 25.

“Head circumference is known to correlate closely with brain volume (Cooke et al., 1977; Wickett et al., 2000) and can therefore be used to measure brain growth, but a single measurement cannot provide a complete insight into neurological development. Different patterns of early brain growth may result in a similar head size. A child whose brain growth both pre‐ and postnatally followed the 50th centile might attain the same head size as a child whose brain growth was retarded in gestation but who later experienced a period of rapid growth. Different growth trajectories may reflect different experiences during sensitive periods of brain development and have different implications for later cognitive function.
“We have investigated whether brain growth during different periods of pre‐ and postnatal development influences later cognitive function in a group of children for whom serial measurements of head growth through foetal life, infancy and childhood were available.”
[…]
“We found no statistically significant associations between head circumference at 18 weeks’ gestation or head circumference at birth SDS and IQ at the age of 9 years.”
[…]
“In contrast, there were strong statistically significant associations between measures of postnatal head growth and IQ. After adjustment for sex, full‐scale IQ rose by 2.59 points (95% CI 0.87 to 4.32) for each SD increase in head circumference at 9 months of age, and by 3.85 points (95% CI 1.96 to 5.73) points for each SD increase in head circumference at 9 years; verbal IQ rose by 2.66 points (95% CI 0.49 to 4.83) for each SD increase in head circumference at 9 months of age, and by 3.76 points (95% CI 1.81 to 5.72) for each SD increase in head circumference at 9 years; performance IQ rose by 2.88 points (95% CI 0.659 to 5.11) for each SD increase in head circumference at 9 months of age, and by 3.16 points (95% CI 1.16 to 5.16) for each SD increase in head circumference at 9 years.”
[…]
“[W]e interpret these findings as evidence that postnatal brain growth is more important than prenatal brain growth in determining higher mental function. This interpretation is supported by the finding that head growth in the first 9 months of life and head growth between 9 months and 9 years of age are also related to cognitive function, regardless of head size at the beginning of these periods.”

[6] Evidence type: review

Pfeiffer CC, Braverman ER.
Biol Psychiatry. 1982 Apr;17(4):513-32.

“The total content of zinc in the adult human body averages almost 2 g. This is approximately half the total iron content and 10 to 15 times the total body copper. In the brain, zinc is with iron, the most concentrated metal. The highest levels of zinc are found in the hippocampus in synaptic vesicles, boutons, and mossy fibers. Zinc is also found in large concentrations in the choroid layer of the retina which is an extension of the brain. Zinc plays an important role in axonal and synaptic transmission and is necessary for nucleic acid metabolism and brain tubulin growth and phosphorylation. Lack of zinc has been implicated in impaired DNA, RNA, and protein synthesis during brain development. For these reasons, deficiency of zinc during pregnancy and lactation has been shown to be related to many congenital abnormalities of the nervous system in offspring. Furthermore, in children insufficient levels of zinc have been associated with lowered learning ability, apathy, lethargy, and mental retardation. Hyperactive children may be deficient in zinc and vitamin B-6 and have an excess of lead and copper. Alcoholism, schizophrenia, Wilson’s disease, and Pick’s disease are brain disorders dynamically related to zinc levels. Zinc has been employed with success to treat Wilson’s disease, achrodermatitis enteropathica, and specific types of schizophrenia.”

[7] Evidence type: authority
From the CDC:
“Who is most at risk?
Young children and pregnant women are at higher risk of iron deficiency because of rapid growth and higher iron needs.
Adolescent girls and women of childbearing age are at risk due to menstruation.
Among children, iron deficiency is seen most often between six months and three years of age due to rapid growth and inadequate intake of dietary iron. Infants and children at highest risk are the following groups:

  • Babies who were born early or small.
  • Babies given cow’s milk before age 12 months.
  • Breastfed babies who after age 6 months are not being given plain, iron-fortified cereals or another good source of iron from other foods.
  • Formula-fed babies who do not get iron-fortified formulas.
  • Children aged 1–5 years who get more than 24 ounces of cow, goat, or soymilk per day. Excess milk intake can decrease your child’s desire for food items with greater iron content, such as meat or iron fortified cereal.
  • Children who have special health needs, for example, children with chronic infections or restricted diets.”
[8] Evidence type: experiment

(Emphasis ours)
“Background: High intake of cow-milk protein in formula-fed infants is associated with higher weight gain and increased adiposity, which have led to recommendations to limit protein intake in later infancy. The impact of protein from meats for breastfed infants during complementary feeding may be different.
“Objective: We examined the effect of protein from meat as complementary foods on growths and metabolic profiles of breastfed infants.
“Design: This was a secondary analysis from a trial in which exclusively breastfed infants (5–6 mo old from the Denver, CO, metro area) were randomly assigned to receive commercially available pureed meats (MEAT group; n = 14) or infant cereal (CEREAL group; n = 28) as their primary complementary feedings for ∼5 mo. Anthropometric measures and diet records were collected monthly from 5 to 9 mo of age; intakes from complementary feeding and breast milk were assessed at 9 mo of age.
“Results: The MEAT group had significantly higher protein intake, whereas energy, carbohydrate, and fat intakes from complementary feeding did not differ by group over time. At 9 mo of age mean (± SEM), intakes of total (complementary feeding plus breast-milk) protein were 2.9 ± 0.6 and 1.4 ± 0.4 g ⋅ kg−1 ⋅ d−1, ∼17% and ∼9% of daily energy intake, for MEAT and CEREAL groups, respectively (P < 0.001). From 5 to 9 mo of age, the weight-for-age z score (WAZ) and length-for-age z score (LAZ) increased in the MEAT group (ΔWAZ: 0.24 ± 0.19; ΔLAZ: 0.14 ± 0.12) and decreased in the CEREAL group (ΔWAZ: −0.07 ± 0.17; ΔLAZ: −0.27 ± 0.24) (P-group by time < 0.05). The change in weight-for-length z score did not differ between groups. Total protein intake at 9 mo of age and baseline WAZ were important predictors of changes in the WAZ (R2 = 0.23, P = 0.01).
“Conclusion: In breastfed infants, higher protein intake from meats was associated with greater linear growth and weight gain but without excessive gain in adiposity, suggesting potential risks of high protein intake may differ between breastfed and formula-fed infants and by the source of protein.”

[9] From Wikipedia:
“Breastmilk supplement
“Premastication is complementary to breastfeeding in the health practices of infants and young children, providing large amounts of carbohydrate and protein nutrients not always available through breast milk,[3] and micronutrients such as iron, zinc, and vitamin B12 which are essential nutrients present mainly in meat.[25] Compounds in the saliva, such as haptocorrin also helps increase B12 availability by protecting the vitamin against stomach acid.
“Infant intake of heme iron
“Meats such as beef were likely premasticated during human evolution as hunter-gatherers. This animal-derived bioinorganic iron source is shown to confer benefits to young children (two years onwards) by improving growth, motor, and cognitive functions.[26] In earlier times, premastication was an important practice that prevented infant iron deficiency.[27]
“Meats provide Heme iron that are more easily absorbed by human physiology and higher in bioavailability than non-heme irons sources,[28][29] and is a recommended source of iron for infants.[30]”
[10] Hypothesis paper

Leslie C. Aiello and Peter Wheeler
Current Anthropology, Vol. 36, No. 2 (Apr., 1995), pp. 199-221
[11] Evidence type: review

Milton K.
J Nutr. 2003 Nov;133(11 Suppl 2):3886S-3892S.

(The whole paper is worth reading, but these highlights serve our point.)
“Without routine access to ASF [animal source foods], it is highly unlikely that evolving humans could have achieved their unusually large and complex brain while simultaneously continuing their evolutionary trajectory as large, active and highly social primates. As human evolution progressed, young children in particular, with their rapidly expanding large brain and high metabolic and nutritional demands relative to adults would have benefited from volumetrically concentrated, high quality foods such as meat.”
[…]
“If the dietary trajectory described above was characteristic of human ancestors, the routine, that is, daily, inclusion of ASF in the diets of children seems mandatory as most wild plant foods would not be capable of supplying the protein and micronutrients children require for optimal development and growth, nor could the gut of the child likely provide enough space, in combination with the slow food turnover rate characteristic of the human species, to secure adequate nutrition from wild plant foods alone. Wild plant foods, though somewhat higher in protein and some vitamins and minerals than their cultivated counterparts (52), are also high in fiber and other indigestible components and most would have to be consumed in very large quantity to meet the nutritional and energetic demands of a growing and active child.”
[…]
“Given the postulated body and brain size of the earliest humans and the anatomy and kinetic pattern characteristics of the hominoid gut, turning increasingly to the intentional consumption of ASF on a routine rather than fortuitous basis seems the most expedient, indeed the only, dietary avenue open to the emerging human lineage (2,3,10,53).”
[…]
“Given the probable diet, gut form and pattern of digestive kinetics characteristic of prehuman ancestors, it is hypothesized that the routine inclusion of animal source foods in the diet was mandatory for emergence of the human lineage. As human evolution progressed, ASF likely achieved particular importance for small children due to the energetic demands of their rapidly expanding large brain and generally high metabolic and nutritional demands relative to adults.”

[12] Evidence type: review

Kennedy GE.
J Hum Evol. 2005 Feb;48(2):123-45. Epub 2005 Jan 18.

“Although some researchers have claimed that plant foods (e.g., roots and tubers) may have played an important role in human evolution (e.g., O’Connell et al., 1999; Wrangham et al., 1999; Conklin-Brittain et al., 2002), the low protein content of ‘‘starchy’’ plants, generally calculated as 2% of dry weight (see Kaplan et al., 2000: table 2), low calorie and fat content, yet high content of (largely) indigestible fiber (Schoeninger et al., 2001: 182) would render them far less than ideal weaning foods. Some plant species, moreover, would require cooking to improve their digestibility and, despite claims to the contrary (Wrangham et al., 1999), evidence of controlled fire has not yet been found at Plio-Pleistocene sites. Other plant foods, such as the nut of the baobab (Adansonia digitata), are high in protein, calories, and lipids and may have been exploited by hominoids in more open habitats (Schoeninger et al., 2001). However, such foods would be too seasonal or too rare on any particular landscape to have contributed significantly and consistently to the diet of early hominins. Moreover, while young baobab seeds are relatively soft and may be chewed, the hard, mature seeds require more processing. The Hadza pound these into flour (Schoeninger et al., 2001), which requires the use of both grinding stones and receptacles, equipment that may not have been known to early hominins. Meat, on the other hand, is relatively abundant and requires processing that was demonstrably within the technological capabilities of Plio-Pleistocene hominins. Meat, particularly organ tissues, as Bogin (1988, 1997) pointed out, would provide the ideal weaning food.”

[13] Plants can become more nutrient dense through cooking.
That is the basis of Wrangham’s hypothesis:
(From Wikipedia)
“Wrangham’s latest work focuses on the role cooking has played in human evolution. He has argued that cooking food is obligatory for humans as a result of biological adaptations[9][10] and that cooking, in particular the consumption of cooked tubers, might explain the increase in hominid brain sizes, smaller teeth and jaws, and decrease in sexual dimorphism that occurred roughly 1.8 million years ago.[11] Most anthropologists disagree with Wrangham’s ideas, pointing out that there is no solid evidence to support Wrangham’s claims.[11][12] The mainstream explanation is that human ancestors, prior to the advent of cooking, turned to eating meats, which then caused the evolutionary shift to smaller guts and larger brains.[13]”
[14] Evidence type: review

Popovich DG, Jenkins DJ, Kendall CW, Dierenfeld ES, Carroll RW, Tariq N, Vidgen E.
J Nutr. 1997 Oct;127(10):2000-5.

(Emphasis ours)
“We studied the western lowland gorilla diet as a possible model for human nutrient requirements with implications for colonic function. Gorillas in the Central African Republic were identified as consuming over 200 species and varieties of plants and 100 species and varieties of fruit. Thirty-one of the most commonly consumed foods were collected and dried locally before shipping for macronutrient and fiber analysis. The mean macronutrient concentrations were (mean ± SD, g/100 g dry basis) fat 0.5 ± 0.4, protein 11.8 ± 8.2, available carbohydrate 7.7 ± 6.3 and dietary fiber 74.0 ± 12.9. Assuming that the macronutrient profile of these foods was reflective of the whole gorilla diet and that dietary fiber contributed 6.28 kJ/g (1.5 kcal/g), then the gorilla diet would provide 810 kJ (194 kcal) metabolizable energy per 100 g dry weight. The macronutrient profile of this diet would be as follows: 2.5% energy as fat, 24.3% protein, 15.8% available carbohydrate, with potentially 57.3% of metabolizable energy from short-chain fatty acids (SCFA) derived from colonic fermentation of fiber. Gorillas would therefore obtain considerable energy through fiber fermentation. We suggest that humans also evolved consuming similar high foliage, high fiber diets, which were low in fat and dietary cholesterol. The macronutrient and fiber profile of the gorilla diet is one in which the colon is likely to play a major role in overall nutrition. Both the nutrient and fiber components of such a diet and the functional capacity of the hominoid colon may have important dietary implications for contemporary human health.”
We disagree, of course, with the authors’ suggested interpretation that humans, too, could make good use of the same dietary strategy, as we haven’t the colons for it.

[15] The maximum amount of fat humans could get from fermenting fibre in the gut is unknown.
The widely cited value of 10% of calories comes from:

E. N. Bergman
Physiological Reviews Published 1 April 1990 Vol. 70 no. 2, 567-590

“The value of 6-10% for humans (Table 3) was calculated on the basis of a typical British diet where 50-60 g of carbohydrate (15 g fiber and 35-50 g sugar and starch) are fermented per day (209). It is pointed out, however, that dietary fiber intakes in Africa or the Third World are up to seven times higher than in the United Kingdom (55). It is likely, therefore, that much of this increased fiber intake is fermented to VFA and even greater amounts of energy are made available by large intestinal fermentation.”
However, it should not be concluded that SCFA production could rise to 70% of energy requirements!
For one thing, as a back-of-the-envelope calculation, you can get up to about 2 kcal worth of SCFA per gram of fermentable carbohydrate.
That would come from soluble plant fiber, resistant starch and regular starch that escapes digestion.
To get 70% of calories this way on a 2000 kcal/day diet, you’d need to ingest 700g of fibre.
Even if you achieved this, it is unlikely you could absorb it all, and in the process of trying, you would experience gastrointestinal distress, including cramping, diarrhea or constipation, gas, and perhaps worse.
Indeed, this would probably happen even at 100g/d, which would provide about 10% of energy in a 2000 kcal/d diet.
Moreover, it would interfere with mineral absorption, rendering it an unviable evolutionary strategy.
Even the ADA, which extols the virtues of fiber, cautions against exceeding their recommendations of 20-35g. See Position of the American Dietetic Association: health implications of dietary fiber.
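For the curious, the back-of-the-envelope numbers above can be laid out explicitly. A minimal sketch using the ~2 kcal per gram figure cited above:

    # Fibre needed for a given share of energy from colonic fermentation,
    # at ~2 kcal of absorbable short-chain fatty acids (SCFA) per gram
    # of fermentable carbohydrate, as estimated above.

    KCAL_PER_G_FERMENTED = 2

    def fibre_needed(total_kcal, energy_share):
        """Grams/day of fermentable carbohydrate to supply energy_share of intake."""
        return total_kcal * energy_share / KCAL_PER_G_FERMENTED

    print(fibre_needed(2000, 0.70))   # 700 g/day -- far beyond plausible intake
    print(fibre_needed(2000, 0.10))   # 100 g/day -- already enough to cause distress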

[16] Evidence type: review

“As the mathematical models now seem quite plausible and the patterns of survivors versus extinct species seem inexplicable by climate change and easily explicable by hunting (7,11), it is worth considering comparisons to other systems. Barnosky et al. note that on islands, humans cause extinctions through multiple synergistic effects, including predation and sitzkrieg, and “only rarely have island megafauna been demonstrated to go extinct because of environmental change without human involvement,” while acknowledging that the extrapolation from islands to continents is often disputed (7). The case for human contribution to extinction is now much better supported by chronology (both radiometric and based on trace fossils like fungal spores), mathematical simulations, paleoclimatology, paleontology, archaeology, and the traits of extinct species when compared with survivors than when Meltzer and Beck rejected it in the 1990s, although the blitzkrieg model which assumes Clovis-first can be thoroughly rejected by confirmation of pre-Clovis sites. Grayson and Meltzer (12) argue that the overkill hypothesis has become irrefutable, but the patterns by which organisms went extinct (7,11), the timing of megafauna population reductions and human arrival when compared with climate change (5), and the assumptions necessary to make paleoecologically informed mathematical models for the extinctions to make accurate predictions all provide opportunities to refute the overkill hypothesis, or at least make it appear unlikely. However, all of these indicate human involvement in megafauna extinctions as not only plausible, but likely.”

[17] Evidence type: review

William J. Ripple and Blaire Van Valkenburgh
BioScience (July/August 2010) 60 (7): 516-526.

“Humans are well-documented optimal foragers, and in general, large prey (ungulates) are highly ranked because of the greater return for a given foraging effort. A survey of the association between mammal body size and the current threat of human hunting showed that large-bodied mammals are hunted significantly more than small-bodied species (Lyons et al. 2004). Studies of Amazonian Indians (Alvard 1993) and Holocene Native American populations in California (Broughton 2002, Grayson 2001) show a clear preference for large prey that is not mitigated by declines in their abundance. After studying California archaeological sites spanning the last 3.5 thousand years, Grayson (2001) reported a change in relative abundance of large mammals consistent with optimal foraging theory: The human hunters switched from large mammal prey (highly ranked prey) to small mammal prey (lower-ranked prey) over this time period (figure 7). Grayson (2001) stated that there were no changes in climate that correlate with the nearly unilinear decline in the abundance of large mammals. Looking further back in time, Stiner and colleagues (1999) described a shift from slow-moving, easily caught prey (e.g., tortoises) to more agile, difficult-to-catch prey (e.g., birds) in Mediterranean Pleistocene archaeological sites, presumably as a result of declines in the availability of preferred prey.”

[18] Evidence type: review

Ben-Dor M, Gopher A, Hershkovitz I, Barkai R.
PLoS One. 2011;6(12):e28689. doi: 10.1371/journal.pone.0028689. Epub 2011 Dec 9.

“The disappearance of elephants from the diet of H. erectus in the Levant by the end of the Acheulian had two effects that interacted with each other, further aggravating the potential of H. erectus to contend with the new dietary requirements:
“The absence of elephants, weighing five times the weight of Hippopotami and more than eighty times the weight of Fallow deer (Kob in Table 3), from the diet would have meant that hunters had to hunt a much higher number of smaller animals to obtain the same amount of calories previously gained by having elephants on the menu.
“Additionally, hunters would have had to hunt what large (high fat content) animals that were still available, in order to maintain the obligatory fat percentage (44% in our model) since they would have lost the beneficial fat contribution of the relatively fat (49% fat) elephant. This ‘large animal’ constraint would have further increased the energetic cost of foraging.”
[…]
“Comparing the average calories per animal at GBY and Qesem Cave might lead to the conclusion that Qesem Cave dwellers had to hunt only twice as many animals than GBY dwellers. This, however, is misleading as obligatory fat consumption complicates the calculation of animals required. With the obligatory faunal fat requirement amounting to 49% of the calories expected to be supplied by the animal, Fallow deer with their caloric fat percentage of 31% (Kob in Table 3) would not have supplied enough fat to be consumed exclusively. Under dietary constraints and to lift their average fat consumption, the Qesem Cave dwellers would have been forced to hunt aurochs and horses whose caloric fat ratio amounts to 49% (the equivalent of buffalo in Table 3). The habitual use of fire at Qesem Cave, aimed at roasting meat [23], [45], may have reduced the amount of energy required for the digestion of protein, contributing to further reduction in DEE. The fact that the faunal assemblage at Qesem Cave shows significantly high proportions of burnt and fractured bones, typical of marrow extraction, is highly pertinent to the point. In addition, the over-representation of fallow deer skulls found at the site [9], [45] might imply a tendency to consume the brain of these prey animals at the cave. Combined, these data indicate a continuous fat-oriented use of prey at the site throughout the Acheulo-Yabrudian (400-200 kyr).
“However, the average caloric fat percentage attributed to the animals at Qesem Cave – 40% – is still lower than the predicted obligatory fat requirements of faunal calories for H. sapiens in our model, amounting to 49% (Table 2). This discrepancy may have disappeared should we have considered in our calculations in Table 3 the previously mentioned preference for prime-age animals that is apparent at Qesem Cave [9], [45]. The analysis of Cordain’s Caribou fat data ([124]: Figure 5) shows that as a strategy the selective hunting of prime-age bulls or females, depending on the season, could, theoretically, result in the increase of fat content as the percentage of liveweight by 76% from 6.4% to 11.3%, thus raising the caloric percentage of fat from animal sources at Qesem Cave. Citing ethnographic sources, Brink ([125]:42) writes about the American Indians hunters: “Not only did the hunters know the natural patterns the bison followed; they also learned how to spot fat animals in a herd. An experienced hunter would pick out the pronounced curves of the body and eye the sheen of the coat that indicated a fat animal”. While the choice of hunting a particular elephant would not necessarily be significant regarding the amount of fat obtained, this was clearly not the case with the smaller game. It is apparent that the selection of fat adults would have been a paying strategy that required high cognitive capabilities and learning skills.”

The Effect of Ketogenic Diets on Thyroid Hormones

The previous generation of myths about low carb diets focused on organ systems: they warned of things like kidney dysfunction and osteoporosis.
As these myths became untenable, new myths have swiftly taken their place: myths, for example, about hormone systems and gut bacteria.
In previous posts, such as here, and here, we dispelled misinformation arising from fears about cortisol.
In this post we address fears about thyroid.
The idea that ketogenic diets are “bad for thyroid” is spouted in keto-opposed and keto-friendly venues alike.
Despite rampant parroting, it is difficult to find evidence to support this idea.
The only evidence that we found even suggestive of this idea is the fact that T₃, the most active thyroid hormone, has repeatedly been shown to be lower in ketogenic dieters.
However, this lowered T₃ is not a sign of “hypothyroid”. In fact, it has a beneficial function!
In this article, we explain why lower T₃ on a ketogenic diet is beneficial, rather than a sign of dysfunction or cause for alarm.

Low T₃ is not hypothyroid.

Diagnosis
Let’s first clear up some confusion about “low thyroid”.
Diagnosis is a tricky business.
Diseases manifest in unwanted symptoms, and diagnosis is the art of determining the cause.
Sometimes symptoms are very good discriminators.
They are easy to verify, and they have only one or two common causes.
Other times symptoms are common in a variety of illnesses, and by themselves don’t help diagnosis much.
Hypothyroid tends to manifest as a cluster of these indiscriminate symptoms, and therefore a lot of people are tempted, in understandable desperation, to diagnose themselves with it.
Ideally in medical research we want to find indicators and predictors of diseases:
things we can measure that discriminate well between diseases, or predict the imminent manifestation of those diseases.
Often they are measures that are not readily apparent to a patient, for example blood levels of various substances.
To verify a suspicion of hypothyroid, we measure thyroid hormones in the blood.
As we have seen again and again, there are often different ways to measure something, and symptoms or outcomes correlated with one measure may or may not correlate with the others.
Hypothyroid
The most common thyroid measures are the levels of TSH (thyroid stimulating hormone), T₄ (a relatively inactive form of thyroid hormone), and T₃ (the more active form).
TSH acts on the thyroid gland, causing T₃ and T₄ to be produced.
Further T₃ can then be generated from T₄ in peripheral tissues.
Hypothyroid is a problem in the gland, where not enough T₃ and T₄ are being produced.
It is indicated by high values of TSH (along with low T₃ and T₄).



It is my suspicion that supplementing thyroid hormone in the general case of hypothyroidism may be as foolish as supplementing insulin in Non-Insulin-Dependent Diabetes.
Insulin is appropriate in (aptly named) Insulin-Dependent Diabetes, just as thyroid hormone would remain appropriate in Hashimoto’s.

The situation is analogous to high insulin in a Type II (Non-Insulin-Dependent) Diabetic:
In that case, insulin at normal amounts is not effectively reducing blood sugar as it would in a healthy body, so more and more gets produced to have the needed effect.
In the case of hypothyroid, more and more TSH is produced, because TSH is what acts on the thyroid gland to produce T₃ and T₄.
In other words, when you have low T₃ and T₄ levels, this signals more TSH to be created, in order to cause more T₃ and T₄ to be made in the gland.
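For the programmatically inclined, the logic of that feedback loop can be sketched in a few lines of Python. This is a toy model only: the constants, the linear dynamics, and the ceiling on TSH secretion are invented for illustration, not physiological values.

```python
# Toy model of the pituitary-thyroid feedback loop described above.
# All constants and the linear dynamics are invented for illustration;
# they are not physiological values.

def steady_state(gland_efficiency, setpoint=1.0, tsh_max=4.0, steps=5000):
    """Iterate the loop: low T4 raises TSH; TSH drives gland output of T4."""
    tsh, t4 = 1.0, 1.0
    for _ in range(steps):
        tsh += 0.1 * (setpoint - t4)        # pituitary: raise TSH while T4 is low
        tsh = min(max(tsh, 0.0), tsh_max)   # secretion is bounded
        t4 += 0.1 * (gland_efficiency * tsh - t4)  # gland output scales with TSH
    return tsh, t4

for label, efficiency in [("healthy gland", 1.0),
                          ("mildly impaired", 0.5),
                          ("severely impaired", 0.2)]:
    tsh, t4 = steady_state(efficiency)
    print(f"{label:17} TSH={tsh:.2f}  T4={t4:.2f}")
```

With a mildly impaired gland the loop compensates: TSH settles high while T₄ returns to its setpoint. With a severely impaired gland, even maximal TSH cannot restore T₄, which gives the overt pattern described above: high TSH together with low T₄ (and T₃).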
Low T₃ by itself, without high TSH or low T₄, has been studied extensively, and has various names, including “nonthyroidal illness syndrome” (NTIS) [1].
On modern, high carb diets, it appears to happen only in cases of critical illness [1].
Whether low T₃ in critical illness is adaptive or not is a point of controversy [1].
Clearly, either there is a disruption in production caused by the illness,
or the body has a functional reason for not raising T₃; that is,
that low T₃ helps recovery in some way.
The adaptive hypothesis would be supported if supplementing T₃ caused harm.
Unfortunately, results have been mixed.
The mixed results are probably an artefact of the lumping together of the various situations in which NTIS occurs.
Although NTIS occurs with starvation,
ketogenic diets, which share some metabolic similarities with starvation, have not so far been included in this area of research.
However, research in calorie, carbohydrate, and protein restriction indicates that in these cases, as with starvation [1], lower T₃ is adaptive.

Lower T₃ spares muscle in conditions of weight loss or inadequate protein.

In weight loss, starvation, or protein deficiency conditions, lowered T₃ is thought to be a functional response that protects against muscle loss [2], [3].
When a diet creates a calorie deficit, or is low in protein, this creates a catabolic state (one in which the body tends to be breaking things down, rather than building them up).
If the body did not respond to this by lowering T₃, lean mass would be lost.
Moreover, if T₃ is supplemented by a well-meaning person who interprets this adaptation as a detrimental “hypothyroid” condition, this also results in loss of lean mass, as shown by Koppeschaar et al. [4].
Supplementing T₃ decreases ketosis and increases the insulin-to-glucagon ratio [4], which, as we have previously discussed, is tightly correlated with glucose production.
This suggests that supplementing T₃ induces gluconeogenesis;
as Koppeschaar et al. put it: “It must be concluded that triiodothyroxine also directly influenced glucose metabolism”.
Not only are T₃ levels lower in calorie restriction, but T₃ receptors are downregulated [1], [5], suggesting a second mechanism by which the body adapts away from T₃ use under ketogenic conditions.



If you are on a low-carb diet in which you are losing weight, and your T₃ is low, don’t assume you should correct this with supplementation.
Lowered T₃ has a purpose, and supplementing it defeats the purpose.


Other research has shown a correlation between lower T₃ and higher ketosis [6], and
between lower T₃ and very low carbohydrate levels [7], [8], [9].
It’s all very consistent.

In other words, the more ketogenic a weight loss diet is, the better it spares muscle. Lowered T₃ is thought to be part of the mechanism, because it is correlated both with higher βOHB and with muscle sparing, and because supplementing with T₃ reverses the muscle-sparing effect.

As alluded to above, T₃ will also be lowered in a situation where weight is not being lost, and carbs are not ketogenically low, if protein is inadequate [10].
This further underscores the function of T₃ lowering: to spare protein for lean mass.
We are not aware of a study showing the effects of a protein adequate, ketogenic maintenance diet (i.e. not calorie restricted) that measured T₃. Therefore, we are not certain whether lowered T₃ would continue in that context [11].
However, insofar as it may continue, that could be beneficial:

Low T₃ is associated with longevity.

It’s possible that the lower T₃ found in ketogenic dieters is an indicator of a lifespan-increasing effect.
First, low T₃ has been found in the very long-lived [12].
This does not appear to be simply an effect of old age, though,
because the correlation also shows up in a genetic study of longevity [13].
Moreover, just as with moderately elevated cortisol,
low T₃ is found in animals whose lifespans have been experimentally increased,
and therefore (again, as with elevated cortisol)
low T₃ is hypothesised to be part of the mechanism for increasing lifespan [13], [14].

Conclusion

There is no evidence that we are aware of indicating that ketogenic diets cause hypothyroid, or negatively impact thyroid function.
The fact that T₃ is lower in ketogenic dieters is probably part of the mechanism that protects lean mass when fat is being lost.
Moreover, low T₃ may even be an indicator of a life-extending effect, an effect we have suggested elsewhere when examining the cortisol profile of ketogenic dieters.

References:

[1] Evidence type: review

Economidou F, Douka E, Tzanela M, Nanas S, Kotanidou A.
Hormones (Athens). 2011 Apr-Jun;10(2):117-24.

(Emphasis ours)
“The metabolic support of the critically ill patient is a relatively new target of active research and little is as yet known about the effects of critical illness on metabolism. The nonthyroidal illness syndrome, also known as the low T₃ syndrome or euthyroid sick syndrome, describes a condition characterized by abnormal thyroid function tests encountered in patients with acute or chronic systemic illnesses. The laboratory parameters of this syndrome include low serum levels of triiodothyronine (T₃) and high levels of reverse T₃, with normal or low levels of thyroxine (T₄) and normal or low levels of thyroid-stimulating hormone (TSH). This condition may affect 60 to 70% of critically ill patients. The changes in serum thyroid hormone levels in the critically ill patient seem to result from alterations in the peripheral metabolism of the thyroid hormones, in TSH regulation, in the binding of thyroid hormone to transport-protein and in receptor binding and intracellular uptake. Medications also have a very important role in these alterations. Hormonal changes can be seen within the first hours of critical illness and, interestingly, these changes correlate with final outcome. Data on the beneficial effect of thyroid hormone treatment on outcome in critically ill patients are so far controversial. Thyroid function generally returns to normal as the acute illness resolves.”
[…]
It remains controversial whether development of the aforementioned changes in thyroid metabolism reflects a protective mechanism or a maladaptive process during illness.
If these changes constitute an adaptation mechanism, then treatment to restore thyroid hormone levels to the normal range could have deleterious effects. In contrast, if these changes are pathologic, treatment may improve an otherwise poor clinical outcome. Current literature data indicate that:
Starvation-induced decrease in serum T₃ concentrations most likely reflects a process of adaptation.

Ketogenic metabolism most closely resembles starvation, though, of course, with the important difference that it is nutritionally complete and there is no reason to believe it would be unhealthy indefinitely. — Amber

[2] Evidence type: experiment

Kaptein EM, Fisler JS, Duda MJ, Nicoloff JT, Drenick EJ.
Clin Endocrinol (Oxf). 1985 Jan;22(1):1-15.

(Emphasis ours)
“The relationship between the changes in serum thyroid hormone levels and nitrogen economy during caloric deprivation were investigated in ten obese men during a 40 d, 400 kcal protein-supplemented weight-reducing diet. This regimen induced increases in the serum levels of total T₄, free T₄ and total rT₃ and decreases of total T₃, while serum TSH remained unchanged. There were progressive decreases in total body weight and urinary losses of total nitrogen and 3-methylhistidine, with the early negative nitrogen balance gradually returning towards basal values during the 40 days. Subjects with the largest weight loss had the most increase in the serum levels of total T₄ and free T₄ index and the greatest decrease in T₃. The magnitude of the increase of the nitrogen balance from its nadir was correlated with the extent of the reduction of T₃ and increase of T₃ uptake ratio and free T₄ levels. The decrease in the urinary excretion of 3-methylhistidine correlated with the increase in free T₄ and rT₃ levels. Nadir serum transferrin values were directly related to peak rT₃ values, and the lowest albumin concentrations occurred in subjects with the highest total T₄ and free T₄ index values. Further, the maximum changes in the serum thyroid hormone levels preceded those of the nutritional parameters. These relationships suggest that: (1) increases in serum rT₃ and free T₄ and reductions in T₃ concentrations during protein supplemented weight reduction may facilitate conservation of visceral protein and reduce muscle protein turnover; and (2) the variation in the magnitude of these changes may account for the heterogeneity of nitrogen economy.”

[3] Evidence type: experiment

(Emphasis ours)
“Although the rate of fat loss was relatively constant throughout the study, wide interindividual variations in cumulative protein (nitrogen) deficit were observed. Total nitrogen losses per subject ranged from 90.5 to 278.7 g. Cumulative nitrogen loss during the first 16 days tended to correlate negatively with initial mean fat cell size and positively with initial lean body mass. Most notable was the strong negative correlation between the size of the decrease in serum triiodothyronine over the 64-day study and the magnitude of the concurrent cumulative N deficit. During severe caloric restriction, one’s ability to decrease circulating serum triiodothyronine levels may be critical to achievement of an adaptational decrease in body protein loss.”

[4] Evidence type: experiment

(Emphasis ours)
“Metabolic responses during a very-low-calorie diet, composed of 50 per cent glucose and 50 per cent protein, were studied in 18 grossly obese subjects (relative weights 131-205 per cent) for 28 d. During the last 14 d (period 2) eight subjects (Gp B) served as controls, while the other ten subjects (Gp A) in the low T₃ state were treated with triiodothyronine supplementation (50 micrograms, 3 times daily). During the first 14 d (period 1) a low T₃-high rT₃ state developed; there was an inverse relationship between the absolute fall of the plasma T₃ concentrations and the cumulative negative nitrogen balance as well as the beta-hydroxybutyrate (βOHB) acid concentrations during the semi-starvation period, pointing to a protein and fuel sparing effect of the low T₃ state. Weight loss in the semi-starvation period was equal in both groups; during T₃ treatment the rate of weight loss was statistically significant (Gp A 6.1 +/- 0.3 kg vs Gp B 4.2 +/- 0.2 kg, P less than 0.001). In the control group there was a sustained nitrogen balance after three weeks; in Gp A the nitrogen losses increased markedly during T₃ treatment. Compared to the control group, on average a further 45.4 g extra nitrogen were lost, equivalent to 1.4 kg fat free tissue. Thus, 74 per cent of the extra weight loss in the T₃ treated group could be accounted for by loss of fat free tissue. During the T₃ treatment period no detectable changes occurred regarding plasma triglycerides and plasma free fatty acids (FFA) concentrations; the plasma βOHB acid concentrations decreased significantly as compared to the control group. Plasma glucose concentrations and the immunoreactive insulin (IRI)/glucose ratio increased in Gp A in the T₃ treatment period, reflecting a state of insulin resistance with regard to glucose utilization. Our results warrant the conclusion that there appears to be no place for T₃ as an adjunct to dieting, as it enhances mostly body protein loss and only to a small extent loss of body fat.
[…]
The plasma βOHB concentration declined significantly during T₃ treatment. In accordance with the results of Hollingsworth et al. we observed a decline of the plasma uric acid levels; this decline occurred simultaneously with the decrease in the βOHB levels in the T₃ treated group; as renal tubular handling of uric acid and ketones are closely linked during fasting, this might implicate a diminished renal reabsorbtion of ketones.
“It is known that renal conservation of ketones prevents large losses of cations during prolonged starvation without T₃ treatment; since ammonium is the major cation excreted in established starvation, the increased renal reabsorbtion of ketone bodies also minimizes nitrogen loss.”

[5] Evidence type: review

Schussler GC, Orlando J.
Science. 1978 Feb 10;199(4329):686-8.

“Fasting decreases the ratio of hepatic nuclear to serum triiodothyronine (T₃) by diminishing the binding capacity of nuclear T₃ receptors. In combination with the lower serum T₃ concentration caused by fasting, the decrease in receptor content results in a marked decrease in nuclear T₃-receptor complexes. The changes in T₃ receptor content and circulating T₃ in fasted animals appear to be independent synergistic adaptations for caloric conservation in the fasted state. Unlike changes in hormonal level, the modification of nuclear receptor content provides a mechanism that may protect cells with a low caloric reserve independently of the metabolic status of the whole animal.”

[6] Evidence type: controlled experiment

Spaulding SW, Chopra IJ, Sherwin RS, Lyall SS.
J Clin Endocrinol Metab. 1976 Jan;42(1):197-200.

“To evaluate the effect of caloric restriction and dietary composition on circulating T₃ and rT₃, obese subjects were studied after 7—18 days of total fasting and while on randomized hypocaloric diets (800 kcal) in which carbohydrate content was varied to provide from 0 to 100% calories. As anticipated, total fasting resulted in a 53% reduction in serum T₃ in association with a reciprocal 58% increase in rT₃. Subjects receiving the no-carbohydrate hypocaloric diets for two weeks demonstrated a similar 47% decline in serum T₃ but there was no significant change in rT₃ with time. In contrast, the same subjects receiving isocaloric diets containing at least 50 g of carbohydrate showed no significant changes in either T₃ or rT₃ concentration. The decline in serum T₃ during the no-carbohydrate diet correlated significantly with blood glucose and ketones but there was no correlation with insulin or glucagon. We conclude that dietary carbohydrate is an important regulatory factor in T₃ production in man. In contrast, rT₃ concentration is not significantly affected by changes in dietary carbohydrate. Our data suggest that the rise in serum rT₃ during starvation may be related to more severe caloric restriction than that caused by the 800 kcal diet.”

So at least in a very low calorie situation, T₃ becomes low only when the diet is sufficiently low in carbohydrate to be ketogenic, and its level correlates with ketogenesis.
We are not told whether any of the diets were protein sufficient, but in this case it doesn’t matter. The very low calories make it catabolic, and only when carbohydrate is at ketogenically low levels does the protein sparing effect occur.
—Amber

[7] Evidence type: controlled experiment

Mathieson RA, Walberg JL, Gwazdauskas FC, Hinkle DE, Gregg JM.
Metabolism. 1986 May;35(5):394-8.

(Emphasis ours)
“Twelve obese women were studied to determine the effects of the combination of an aerobic exercise program with either a high carbohydrate (HC) very-low-caloric diet (VLCD) or a low carbohydrate (LC) VLCD diet on resting metabolic rate (RMR), serum thyroxine (T₄), 3,5,3′-triiodothyronine (T₃), and 3,3′,5′-triiodothyronine (rT₃). The response of these parameters was also examined when subjects switched from the VLCD to a mixed hypocaloric diet. Following a maintenance period, subjects consumed one of the two VLCDs for 28 days. In addition, all subjects participated in thrice weekly submaximal exercise sessions at 60% of maximal aerobic capacity. Following VLCD treatments, participants consumed a 1,000 kcal mixed diet while continuing the exercise program for one week. Measurements of RMR, T₄, T₃, and rT₃ were made weekly. Weight decreased significantly more for LC than HC. Serum T₄ was not significantly affected during the VLCD. Although serum T₃ decreased during the VLCD for both groups, the decrease occurred faster and to a greater magnitude in LC (34.6% mean decrease) than HC (17.9% mean decrease). Serum rT₃ increased similarly for each treatment by the first week of the VLCD. Serum T₃ and rT₃ of both groups returned to baseline concentrations following one week of the 1,000 kcal diet. Both groups exhibited similar progressive decreases in RMR during treatment (12.4% for LC and 20.8% for HC), but values were not significantly lower than baseline until week 3 of the VLCD. Thus, although dietary carbohydrate content had an influence on the magnitude of fall in serum T₃, RMR declined similarly for both dietary treatments.”

[8] Evidence type: controlled experiment

Pasquali R, Parenti M, Mattioli L, Capelli M, Cavazzini G, Baraldi G, Sorrenti G, De Benedettis G, Biso P, Melchionda N.
J Endocrinol Invest. 1982 Jan-Feb;5(1):47-52.

(Emphasis ours)
“The effect of different hypocaloric carbohydrate (CHO) intakes was evaluated in 8 groups of obese patients in order to assess the role of the CHO and the other dietary sources in modulating the peripheral thyroid hormone metabolism. These changes were independent of those of bw. Serum T₃ concentrations appear to be more easily affected than those of reverse T₃ by dietary manipulation and CHO content of the diet. A fall in T₃ levels during the entire period of study with respect to the basal levels occurred only when the CHO of the diet was 120 g/day or less, independent of caloric intake (360, 645 or 1200 calories). Moreover, reverse T₃ concentrations were found increased during the entire period of study when total CHO were very low (40 to 50 g/day) while they demonstrated only a transient increase when CHO were at least 105 g/day (with 645 or more total calories). Indeed, our data indicate that a threshold may exist in dietary CHO, independent of caloric intake, below which modifications occur in thyroid hormone concentrations. From these results it appears that the CHO content of the diet is more important than non-CHO sources in modulating peripheral thyroid hormone metabolism and that the influence of total calories is perhaps as pronounced as that of CHO when a “permissive” amount of CHO is ingested.”

[9] Evidence type: controlled experiment

(Emphasis ours)
“To assess the effect of starvation and refeeding on serum thyroid hormones and thyrotropin (TSH) concentrations, 45 obese subjects were studied after 4 days of fasting and after refeeding with diets of varying composition. All subjects showed an increase in both serum total and free thyroxine (T₄), and a decrease in serum total and free triiodothyronine (T₃) following fasting. These changes were more striking in men than in women. The serum T₃ declined during fasting even when the subjects were given oral L-T₄, but not when given oral L-T₃. After fasting, the serum reverse T₃ (rT₃) rose, the serum TSH declined, and the TSH response to thyrotropin-releasing hormone (TRH) was blunted. Refeeding with either a mixed diet (n = 22) or a carbohydrate diet (n = 8) caused the fasting-induced changes in serum T₃, T₄, rT₃, and TSH to return to control values. In contrast, refeeding with protein (n = 6) did not cause an increase in serum T₃ or in serum TSH of fasted subjects, while it did cause a decline in serum rT₃ toward basal value.
The present data suggest that: (1) dietary carbohydrate is an important factor in reversing the fall in serum T₃ caused by fasting; (2) production of rT₃ is not as dependent on carbohydrate as that of T₃; (3) men show more significant changes in serum thyroid hormone concentrations during fasting than women do, and (4) absorption of T₃ is not altered during fasting.”

Note that in this case, “refeeding” was with an 800 calorie diet; for the protein arm, that means about 200 g of protein (at 4 kcal/g). So the refeeding diet is still low calorie, and thus still catabolic. —Amber

[10] Evidence type: controlled experiment

Otten MH, Hennemann G, Docter R, Visser TJ.
Metabolism. 1980 Oct;29(10):930-5.

“Short term changes in serum 3,3′,5-triiodothyronine (T₃) and 3,3′,5′-triiodothyronine (reverse T₃, rT₃) were studied in four healthy nonobese male subjects under varying but isocaloric and weight maintaining conditions. The four 1500 kcal diets tested during 72 hr, consisted of: I, 100% fat; II, 50% fat, 50% protein; III, 50% fat, 50% carbohydrate (CHO), and IV, a mixed control diet. The decrease of T₃ (50%) and increase of rT₃ (123%) in the all-fat diet equalled changes noted in total starvation. In diet III (750 kcal fat, 750 kcal CHO) serum T₃ decreased 24% (NS) and serum rT₃ rose significantly 34% (p < 0.01). This change occurred in spite of the 750 kcal CHO. This amount of CHO by itself does not introduce changes in thyroid hormone levels and completely restores in refeeding models the alterations of T₃ and rT₃ after total starvation. The conclusion is drawn that under isocaloric conditions in man fat in high concentration itself may play an active role in inducing changes in peripheral thyroid hormone metabolism.”

Here, finally, is a study that is explicitly a maintenance diet. It says mostly what we would expect. It was a bit surprising, and contrary to some previous findings, that in the half carb, half fat diet, this high a carbohydrate level would still allow lower T₃. The authors suggest that this is evidence that high fat alone is responsible. Our interpretation, in contrast, is that it is the zero protein condition that led to the lower T₃. In the body of the paper, the authors, to their credit, acknowledge that they are speculating. We would love to see this example followed by more researchers. —Amber

[11] Ebbeling et al. did make T₃ measurements on a ketogenic diet intended to be weight stable, but the subjects were losing weight during the ketogenic phase, and therefore no conclusion about T₃ in weight-stable, protein-adequate conditions can be drawn from that study.

Ebbeling CB, Swain JF, Feldman HA, Wong WW, Hachey DL, Garcia-Lago E, Ludwig DS.
JAMA. 2012 Jun 27;307(24):2627-34. doi: 10.1001/jama.2012.6607.

(Emphasis ours)
“Participants
Overweight and obese young adults (n=21).
Interventions
After achieving 10 to 15% weight loss on a run-in diet, participants consumed low-fat (LF; 60% of energy from carbohydrate, 20% fat, 20% protein; high glycemic load), low-glycemic index (LGI; 40%-40%-20%; moderate glycemic load), and very-low-carbohydrate (VLC; 10%-60%-30%; low glycemic load) diets in random order, each for 4 weeks.”
[…]
“Hormones and Components of the Metabolic Syndrome (Table 3)
Serum leptin was highest with the LF diet (14.9 [12.1 to 18.4] ng/mL), intermediate with the LGI diet (12.7 [10.3 to 15.6] ng/mL) and lowest with the VLC diet (11.2 [9.1 to 13.8] ng/mL; P=0.0006). Cortisol excretion measured with a 24-hour urine collection (LF: 50 [41 to 60] μg/d; LGI: 60 [49 to 73] μg/d; VLC: 71 [58 to 86] μg/d; P=0.005) and serum TSH (LF: 1.27 [1.01 to 1.60] μIU/mL; LGI: 1.22 [0.97 to 1.54] μIU/mL; VLC: 1.11 [0.88 to 1.40] μIU/mL; P=0.04) also differed in a linear fashion by glycemic load. Serum T₃ was lower with the VLC diet compared to the other two diets (LF: 121 [108 to 135] ng/dL; LGI: 123 [110 to 137] ng/dL; VLC: 108 [96 to 120] ng/dL; P=0.006).”

[12] Evidence type: observational

Baranowska B, Wolinska-Witort E, Bik W, Baranowska-Bik A, Martynska L, Broczek K, Mossakowska M, Chmielowska M.
Neurobiol Aging. 2007 May;28(5):774-83. Epub 2006 May 12.

(Emphasis ours)
“It is well known that physiological changes in the neuroendocrine system may be related to the process of aging. To assess neuroendocrine status in aging humans we studied a group of 155 women including 78 extremely old women (centenarians) aged 100-115 years, 21 early elderly women aged 64-67 years, 21 postmenopausal women aged 50-60 years and 35 younger women aged 20-50 years. Plasma NPY, leptin, glucose, insulin and lipid profiles were evaluated, and serum concentrations of pituitary, adrenal and thyroid hormones were measured. Our data revealed several differences in the neuroendocrine and metabolic status of centenarians, compared with other age groups, including the lowest serum concentrations of leptin, insulin and T₃, and the highest values for prolactin. We failed to find any significant differences in TSH and cortisol levels. On the other hand, LH and FSH levels were comparable with those in the elderly and postmenopausal groups, but they were significantly higher than in younger subjects. GH concentrations in centenarians were lower than in younger women. NPY values were highest in the elderly group and lowest in young subjects. We conclude that the neuroendocrine status in centenarians is markedly different from that found in early elderly or young women.”

[13] Evidence type: observational

Rozing MP, Westendorp RG, de Craen AJ, Frölich M, Heijmans BT, Beekman M, Wijsman C, Mooijaart SP, Blauw GJ, Slagboom PE, van Heemst D; Leiden Longevity Study (LLS) Group.
J Gerontol A Biol Sci Med Sci. 2010 Apr;65(4):365-8. doi: 10.1093/gerona/glp200. Epub 2009 Dec 16.

“BACKGROUND:
The hypothalamo-pituitary-thyroid axis has been widely implicated in modulating the aging process. Life extension effects associated with low thyroid hormone levels have been reported in multiple animal models. In human populations, an association was observed between low thyroid function and longevity at old age, but the beneficial effects of low thyroid hormone metabolism at middle age remain elusive.
METHODS:
We have compared serum thyroid hormone function parameters in a group of middle-aged offspring of long-living nonagenarian siblings and a control group of their partners, all participants of the Leiden Longevity Study.
RESULTS:
When compared with their partners, the group of offspring of nonagenarian siblings showed a trend toward higher serum thyrotropin levels (1.65 vs 1.57 mU/L, p = .11) in conjunction with lower free thyroxine levels (15.0 vs 15.2 pmol/L, p = .045) and lower free triiodothyronine levels (4.08 vs 4.14 pmol/L, p = .024).
CONCLUSIONS:
Compared with their partners, the group of offspring of nonagenarian siblings show a lower thyroidal sensitivity to thyrotropin. These findings suggest that the favorable role of low thyroid hormone metabolism on health and longevity in model organisms is applicable to humans as well.”

[14] Evidence type: experiment

Fontana L, Klein S, Holloszy JO, Premachandra BN.
J Clin Endocrinol Metab. 2006 Aug;91(8):3232-5. Epub 2006 May 23.

“CONTEXT:
Caloric restriction (CR) retards aging in mammals. It has been hypothesized that a reduction in T₃ hormone may increase life span by conserving energy and reducing free-radical production.
OBJECTIVE:
The objective of the study was to assess the relationship between long-term CR with adequate protein and micronutrient intake on thyroid function in healthy lean weight-stable adult men and women.
DESIGN, SETTING, AND PARTICIPANTS:
In this study, serum thyroid hormones were evaluated in 28 men and women (mean age, 52 +/- 12 yr) consuming a CR diet for 3-15 yr (6 +/- 3 yr), 28 age- and sex-matched sedentary (WD), and 28 body fat-matched exercising (EX) subjects who were eating Western diets.
MAIN OUTCOME MEASURES:
Serum total and free T₄, total and free T₃, reverse T₃, and TSH concentrations were the main outcome measures.
RESULTS:
Energy intake was lower in the CR group (1779 +/- 355 kcal/d) than the WD (2433 +/- 502 kcal/d) and EX (2811 +/- 711 kcal/d) groups (P < 0.001). Serum T₃ concentration was lower in the CR group than the WD and EX groups (73.6 +/- 22 vs. 91.0 +/- 13 vs. 94.3 +/- 17 ng/dl, respectively) (P < or = 0.001), whereas serum total and free T₄, reverse T₃, and TSH concentrations were similar among groups.
CONCLUSIONS:
Long-term CR with adequate protein and micronutrient intake in lean and weight-stable healthy humans is associated with a sustained reduction in serum T₃ concentration, similar to that found in CR rodents and monkeys. This effect is likely due to CR itself, rather than to a decrease in body fat mass, and could be involved in slowing the rate of aging.”