6 October 2025
Back in the twenty-tens, gluten-free diets were all the rage. It seemed like every second celebrity, athlete and blogger was proclaiming that they had lost weight, improved their tennis game, cured their anxiety/acne/cancer/leprosy and won Lotto, all because they cut evil gluten out of their diet. With the social media-spawned proliferation of ever-more-niche (and ever-more-cultish) diets, you’d be forgiven for thinking that the gluten-free diet’s star had waned. And that’s what I thought, until I ran a Google Trends search for ‘gluten-free diets’ and found this:

It turns out that more people in the world are searching for information on gluten-free diets now than ever before… and the overwhelming majority of them have absolutely nothing to gain from adopting a gluten-free diet.
Who should avoid gluten?
Now, don’t get me wrong. There are people who should, without any doubt, strictly avoid eating either wheat or any gluten-containing food. These people fall into three major diagnostic categories:
- Autoimmune reactions to gluten.
The best-known of these is coeliac disease, a condition in which, in genetically predisposed individuals, undigested gliadin (a component of gluten) triggers a destructive immune response in the mucous membrane lining the small intestines, ultimately leading to malabsorption of nutrients and an increased risk of lymphoma.
Many people with coeliac disease do not suffer from the classical intestinal symptoms of chronic diarrhoea and resultant weight loss, so anyone with unexplained anaemia, infertility, osteoporosis, persistent mouth ulcers or prolonged fatigue should undergo serological testing in order to identify the so-called ‘silent’ form of the disease.
First degree relatives of diagnosed coeliac disease sufferers, and people with certain other autoimmune diseases that commonly co-occur with coeliac disease, such as type 1 diabetes, alopecia areata, and autoimmune thyroiditis, should also be tested, as early diagnosis and adoption of a gluten-free diet protects against developing intestinal damage and malabsorption.
Two other gluten-related autoimmune diseases, gluten ataxia and dermatitis herpetiformis, may occur in people who are genetically predisposed to coeliac disease, either with or without the classic coeliac-type pattern of intestinal damage.
- Allergic reactions to wheat proteins.
Adverse immunological reactions to wheat proteins (including, but not limited to, gluten) manifest in four ways: classic food allergy affecting the skin, gut or respiratory tract, which can in extreme cases lead to anaphylaxis and death; wheat-dependent, exercise-induced anaphylaxis (WDEIA); occupational asthma (baker’s asthma) and rhinitis; and contact urticaria.
- Immune-mediated non-coeliac gluten sensitivity.
There are some people who experience symptoms including abdominal pain, bloating, diarrhoea, skin rashes, headache, mild cognitive dysfunction (‘brain fog’), fatigue, and bone or joint pain when they eat gluten-containing foods, but in whom testing has ruled out both autoimmune and allergic reactions to gluten and related proteins. If their symptoms go away when they stop eating gluten-containing foods, and return when they undergo a blinded gluten challenge (that is, they consume both gluten, and a placebo, in random order on separate occasions), these people are said to be suffering from non-coeliac gluten sensitivity. Note that many people who experience gastrointestinal symptoms such as bloating and diarrhoea when they eat gluten-containing foods are actually reacting to the fermentable carbohydrates (‘FODMAPs’) in these foods, and not to the gluten. That’s why it’s important, for diagnostic accuracy, to conduct a challenge with gluten alone, and not in the context of a gluten-containing food such as bread or pasta.

… and who shouldn’t be avoiding gluten?
Currently, only 1.2 per cent of Australian men and 1.9 per cent of women have coeliac disease, and true wheat allergies are even rarer, affecting less than one per cent. The prevalence of true non-coeliac gluten sensitivity is hard to pin down, as so few people who believe they are gluten-sensitive have ever undergone the gold-standard test for confirming the diagnosis: dietary elimination, followed by a double-blind, randomised, placebo-controlled food challenge. However, an Italian study conducted in gastroenterology, allergy and internal medicine referral centres, which employed clinical criteria rather than the gold-standard test, found that non-coeliac gluten sensitivity was only marginally more common than coeliac disease.
So in all, less than three per cent of Australians have a condition that definitively benefits from avoidance of wheat and/or gluten. Yet a 2020 population survey found that around one quarter of adults are consciously avoiding gluten altogether, or intentionally minimising their intake. Among non-coeliac, non-allergic gluten-avoiders, 53 per cent nominated ‘general health’ as the principal reason for their avoidance.
But in people who do not have allergic or autoimmune reactions to gluten or other wheat proteins, following a gluten-free diet may actually be harmful to general health. Healthy, non-coeliac adults who followed a gluten-free diet for just one month had a reduction in their populations of beneficial gut bacteria, including Bifidobacterium and Lactobacillus species, and Faecalibacterium prausnitzii, which is considered one of the most important keystone species in a healthy human gut. They also had a concomitant increase in opportunistic pathogens such as Escherichia coli and other Enterobacteriaceae. The abundance of Akkermansia muciniphila, a bacterium associated with improved gut barrier function, decreased inflammation and enhanced glucose regulation, is also reduced on a low-gluten diet.
These adverse changes in gut microbial composition occur even when healthy volunteers are randomised to a gluten-free diet matched for fibre content with a high-gluten diet. When women with autoimmune thyroid disease were placed on a gluten-free diet, after four weeks there was a significant increase in gut bacteria associated with inflammation, including Desulfobacterota, Proteobacteria, and Parasutterella. Meanwhile, multiple beneficial taxa, including Actinobacteriota and Bifidobacterium, decreased. And the longer people stay on a gluten-free or low-gluten diet, the more their microbial richness – a pivotal indicator of a healthy gut microbiome – declines.
It’s not the gluten itself that shapes gut microbial composition, but the polysaccharides (complex carbohydrates) that accompany gluten in grains such as wheat, rye and barley. Those polysaccharides serve as the food source for beneficial bacteria; remove them from the diet, and the population of ‘good bugs’ drops back, allowing the not-so-good bugs to proliferate. And that adverse change in the make-up of the gut microbiome decreases the immune system’s ability to respond to infections.
The danger of basing your dietary choices on myths
Many people who are following gluten-free diets have been persuaded by five major myths about wheat and other gluten-containing foods that circulate freely on the Internet and in popular books and media. (I thoroughly debunked these five myths in my Deep Dive webinar, ‘Should I be Gluten-Free’; EmpowerEd members can access the video recording of the webinar along with the fully-referenced slides.)
I’ll cover the #1 most widely-circulated myth in this article.
The Biggest Myth About Gluten:
“Wheat (and other gluten-containing foods) cause health problems in humans because we have only been consuming them since agriculture began, and that’s not long enough for us to adapt to them.”
– Every second social media ‘nutrition influencer’
Here’s the reality:
Pretty much everything that humans eat now, including most animal species, has only been consumed since agriculture began. We have extensively modified all food species (plant and animal) through selective breeding, which was – and still is – intended to enhance desirable characteristics in food, such as size, sweetness, hardiness to environmental stress, cropping duration, or shelf-life.
It is simply no longer possible to find the species of plants that our ancestors ate unless we go foraging in the ever-shrinking wilderness, and these wild plant foods are not exactly tasty.
Take lettuce as an example. The ancient ancestor of our modern, mild-tasting salad staple was a thick-stemmed, white sap-exuding weed with thorns on its bitter leaves. It was so unpalatable that it was cultivated only to extract oil from its seeds. The ancient Egyptians, Greeks and Romans selectively bred it to reduce its bitterness, and the Romans carried it with them as they marched through western Europe, all the way to Britain. At each point along the way, local people took the Roman gift and adapted it to their own growing conditions and taste preferences, always focusing on growing more luscious leaves. When lettuce eventually reached China, however, it was bred into a crunchy, thick-stemmed vegetable for cooking. While plant geneticists can trace the DNA of lettuce through 6000 years of modification by humans, not a single version of lettuce eaten today bears any resemblance in appearance, texture or taste to its ancient ancestor.
The same story applies to animal foods. The nutritional composition of beef, lamb, pork and poultry is so dramatically different from that of the wild animals that pre-agricultural humans hunted that there is simply no comparison. Beef, for example, has over 16 times the fat content of the meat of the red hartebeest, an antelope species which is a favoured target of African persistence hunters. Wild game meat also has a significantly higher ratio of omega-3 to saturated fats than even range-fed beef.
If you think that you should only be eating what humans ate in Paleolithic times, prepare to be very hungry indeed… and forget about living in a city!
How long have humans been eating wheat, anyway? Wheat was first domesticated – that is, deliberately cultivated – in southeastern Anatolia (now part of Turkey) roughly 11,000 years ago. However, archaeological evidence from the Ohalo II site in Israel (a cave inhabited by hunter-gatherers) shows that humans gathered, processed and ate wild grains, including barley and wheat, around 23,000 years ago – that is, during the Paleolithic era. They also ate herbs, nuts, fruits and legumes, as indicated by the tens of thousands of seeds and fruits discovered at the site.
Aside from the fact that humans have been eating grains (including wheat) for far longer than we’ve been intentionally growing them, the argument that humans have not had enough time to genetically adapt to grains just doesn’t stack up.
Humans, along with all other species, are constantly adapting to their environment through randomly arising genetic mutations, and through the shuffling of parental genes (‘recombination’) that occurs when eggs and sperm are formed. If a mutation is beneficial to survival and reproduction, the ‘new’ gene variant will persist and spread in the population. That’s how humans came to have more copies of the gene that codes for production of the starch-digesting enzyme amylase in our genome than pre-human hominins and non-human primates: being able to digest cooked starches from tubers, and later on from wild grains, was a significant survival advantage. In fact, without this capacity to harvest energy from starches, we would never have developed the brain size and capacity that distinguishes us as humans.
Furthermore, population growth increases the speed of adaptation – because more individuals means more reproduction and more genetic diversity – and agriculture facilitated a dramatic increase in the human population. Only a few million of us walked the Earth 10,000 years ago, at the beginning of the agricultural revolution. After roughly 8000 years of agriculture the human population had swelled to about 200 million, and from there it grew to 600 million by the year 1700. Now there are over 8.25 billion of us.
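To see just how sharply that growth accelerated, we can work out the average compound annual growth rate between each of those milestones. This is a rough back-of-the-envelope sketch: the starting figure of 5 million is an assumed value for ‘a few million’, and the era boundaries are approximate; the other figures come from the numbers above.

```python
def annual_growth_rate(p0, p1, years):
    """Average compound annual growth rate that takes a population from p0 to p1."""
    return (p1 / p0) ** (1 / years) - 1

# (start population, end population, elapsed years) -- figures from the text,
# except the assumed 5 million starting point for "a few million"
eras = {
    "early agriculture (10,000 to ~2,000 years ago)": (5e6, 200e6, 8000),
    "classical era to 1700":                          (200e6, 600e6, 1700),
    "1700 to today":                                  (600e6, 8.25e9, 325),
}

for era, (p0, p1, years) in eras.items():
    print(f"{era}: {annual_growth_rate(p0, p1, years) * 100:.3f}% per year")
```

Even with generous rounding, the pattern is striking: growth of a few hundredths of a per cent per year through most of agricultural history, versus nearly a full per cent per year since 1700 – more than a ten-fold acceleration.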
This rapid population expansion facilitates evolutionary adaptation. In fact, researchers have found evidence of recent selection in roughly seven per cent of all human genes. According to anthropologist John Hawks, author of a study quantifying the rate of acceleration of human adaptive evolution,
“We are more different genetically from people living 5,000 years ago than they were different from Neanderthals.”
Let’s put it this way: if the CCR5-Δ32 gene variant, which arose only about 4000 years ago and which confers resistance to smallpox, can now be found in the genomes of about ten per cent of Europeans, it’s a total cinch for humans to adapt to eating wheat and other gluten-containing grains over the course of 23,000 years.



