Going gluten-free is bang on trend right now. While only around 1% of Americans have celiac disease, around one-third of U.S. adults are trying to cut gluten out of their diet. Why? Well, many perceive gluten-free foods to be healthier, when in fact they’re generally not, and as many as 6% supposedly have a new syndrome of gluten intolerance called “non-celiac gluten sensitivity” (NCGS).
Many perceive it as simply a self-diagnosed food fad, and hard evidence for its existence is lacking. A new study is only reinforcing its controversial status. According to the results, most people diagnosed with this clinical entity can’t even tell the difference between gluten-containing and gluten-free flour after eating both. On top of that, only one-third of patients actually experienced symptoms after consuming gluten.
But let’s rewind a bit first. What’s the difference between celiac disease and NCGS? The former is an autoimmune disease in which the body mistakenly identifies components of gluten as a potential threat and attacks them, damaging the small intestine in the process. While celiac disease can be diagnosed and monitored by checking the levels of certain biomarkers in the body, no such indicators have been identified for NCGS. Furthermore, intestinal biopsies fail to reveal any differences that could be used to diagnose the condition.
So how do you know if you have NCGS? Herein lies the problem. Diagnosis relies on self-reported symptoms experienced following the consumption of gluten – abdominal pain, bloating, nausea, and fatigue, much like those of celiac disease. But even if these are genuinely brought about by eating gluten-containing foods, who’s to say it’s the gluten causing the problems and not something else in the cereal? The existence of the condition is therefore still very much up for debate, and studies attempting to compare responses to gluten and placebo in supposed sufferers have produced a mixed bag of results.
In an attempt to hammer another nail into NCGS’s coffin, researchers in Italy gathered a group of individuals with clinical NCGS diagnoses and assessed their adherence to a gluten-free diet by means of an antibody test and a questionnaire. Those who had stuck to the diet for a minimum of three months prior to the trial – 35 individuals – were then enrolled.
Participants were randomly split into two groups, each given a sachet of flour – labeled simply “A” or “B” – and asked to sprinkle it on soup or pasta for 10 days. One flour contained gluten, while the other didn’t, but previous tests had shown the two were indistinguishable by sight. After a two-week break, participants swapped to the other flour. For the duration of the study, they were asked to record any symptoms they experienced, such as indigestion, constipation, and diarrhea.
As described in Alimentary Pharmacology & Therapeutics, only one-third of the participants actually met the clinical criteria for NCGS, correctly identifying which flour contained gluten at the end of the study and reporting symptoms following its ingestion. Furthermore, half of those who thought the gluten-free flour contained gluten said they experienced gastrointestinal symptoms after eating it, but not after eating the flour that actually contained gluten.
The researchers admit that they don’t have a clear-cut explanation for these findings, but they suspect that people reporting symptoms may not actually be sensitive to gluten, but to something else in cereals. For example, studies have shown that difficult-to-digest carbohydrates called FODMAPs can trigger symptoms of irritable bowel syndrome. Another possibility is the nocebo effect, in which people experience symptoms simply because they anticipate them.
So it seems this study wasn’t the final nail in the coffin for the existence of NCGS, but maybe larger, more rigorous studies will provide some clarity.
[H/T: Real Clear Science]