Corporeality can be, at times, pretty great. And yet for all its advantages, there are certain downsides to being trapped in a sack of rotting limbs and organs and eye-juice. For instance: Allergies. There are innocent people out there who can’t pet a friendly dog without sneezing, or eat a peanut without instantly dying.
Is this obscene injustice some adaptive quirk of evolution—the kind of thing that regularly saved our ancestors from surprise bear-pummelings, way back in the day—or just one of the thousand ways our bodies have of reminding us that, ultimately, they are not on our side, and could kill us at any moment?
This week on Giz Asks, we asked a number of experts devoted to treating and researching allergies to weigh in on how individuals develop allergies, as well as how we as a species developed them—and whether animals can be allergic to us, too.
Ruslan Medzhitov
Sterling Professor of Immunobiology at Yale University and Investigator at the Howard Hughes Medical Institute
Allergies do have a purpose – normally, they are protective reactions to noxious environmental substances, phytochemicals, irritants, airborne particles, etc. All allergic reactions (sneezing, coughing, itching, vomiting and diarrhoea) have one thing in common – they expel unwanted substances from the body (respiratory tract, gastrointestinal tract or skin).
Even people without allergies have these same protective reactions: when we inhale dust particles, we sneeze or cough; when we ingest spoiled food, we vomit; when we come into contact with irritants, we itch. These reactions are mediated by neuronal reflexes. Our immune system can also participate in these same defenses, and they are meant to be protective. However, in people with allergies, these defenses become excessive, resulting in pathological allergies. In these cases, people can react to even minute amounts of allergens. Why this happens in some people and not others is unknown. However, it is clear that something about the modern environment is to blame: the prevalence of allergies has been growing steadily over the past two decades. Diets based on processed foods and the overuse of antibiotics and hygiene products are all likely contributors to this trend.
Paul Turke, M.D.
Social anthropologist, “Darwinian pediatrician,” and author of the forthcoming Bringing Up Baby
Most people believe that allergies result from immune system mistakes. But this is not always the case. Many of the things we touch, breathe, or ingest are less benign than we think. Some are toxins or irritants that should elicit an immune response, and therefore some of the rashes, itches, and sneezes that we would rather avoid are probably beneficial.
On the other hand, there’s no doubt that some allergic reactions are mistakes. This is not surprising given that our immune systems have the incredibly difficult task of determining whether the substances we come into contact with are harmful or harmless. It’s a tough job because of the sheer number of substances that must be discerned, and it’s made even tougher by the fact that many of the bacteria and viruses that infect us are moving targets. They are alive too, and therefore are selected to evolve evasive counter-strategies, which include mimicking the proteins that make up our own cells and tissues. As a result, our immune systems can become confused, and ignore things that should not be ignored, or fail to tolerate things that should be tolerated. In the latter instance, allergies and autoimmune diseases are the result.
Yet another reason for allergies—specifically childhood food allergies—is what’s known as ‘evolutionary mismatch.’ The gist is this. Our immune systems need to be able to tolerate the proteins in the foods we eat, and it seems that the best time to learn proper tolerance is very early in our lives—while we are foetuses, newborns, and toddlers. Thus, at least hypothetically, the key to developing proper tolerance to foods is to grow up under conditions where early exposures closely match later exposures, as occurred throughout almost all of our evolutionary history.
Nowadays, however (unlike in Paleolithic times, when novel foods came along only very occasionally), mismatch between early and late exposures is increasingly likely. To see this, consider a mum who didn’t eat peanuts or fish while pregnant and breastfeeding, and didn’t serve them to her children once they became old enough to begin consuming solids. And now ask yourself, what’s the probability of them encountering these foods upon entering the world at large, say at daycare, nursery, or a friend’s house? I’ve argued that the probability is now larger than it’s ever been, and a setup for the development of food allergies.
David B. Corry, M.D.
Professor and Chief of Immunology, Allergy and Rheumatology at Baylor College of Medicine
This is a common question for which a good answer remains elusive. We are taught that allergies are simply a mistaken reaction to things that are common in the environment like mite and cockroach proteins and pollens. A more nuanced, and likely more correct, notion that is now emerging is that something happens to our immune systems to convert our normal “tolerogenic”, non-inflammatory responses to harmless pollens and other agents into inflammatory events that lead to diseases like asthma, sinusitis, allergies, and others. The primary event that leads to such untoward immune reactions increasingly appears to be low-grade infections often involving fungi.
Although the inflammatory reactions generated by the fungal infections are useless against harmless pollens, cockroach and mite proteins, they are highly effective against the fungi and in fact represent a primary means by which we keep fungal infections of our mucosal surfaces (skin and linings of our airways and gastrointestinal tract) from spreading internally, a usually fatal complication.
Animals of all kinds also express allergies, and it is likely that again fungi are root causes in many cases. True pet allergies to humans are exceptionally rare events, although pets can become allergic to the perfumes and other chemicals often found on their owners.
Matthew Greenhawt, M.D.
Associate Professor of Paediatrics at Children’s Hospital Colorado, University of Colorado School of Medicine
Why do we have so many individuals developing food allergies these days? Most kids graduating high school today have known a few classmates with a food allergy; their parents and grandparents more than likely never did. What has changed in a generation that has delivered this explosive problem into our lives? The causes are unclear, and studies that can clearly show a cause and effect are hard to design and harder to execute, meaning that we have to draw conclusions from evidence that may only tell us part of the picture. However, despite these limitations, we do have a few leading theories.
The first is something called the “hygiene hypothesis,” which relates to the immune system’s balance between its two arms, one dealing with infection and the other with allergy. As society has evolved in the past century, there are fewer of the communicable illnesses that our parents and grandparents may have suffered from. For example, my trainees these days have likely never encountered a natural case of chickenpox, because a highly effective vaccine has significantly cut the rate of infection. There are a slew of other vaccines that have prevented measles, mumps, rubella, and certain invasive streptococcal and Haemophilus illnesses that cause pneumonia and meningitis. The immune system evolved to handle both infection and allergic inflammation, but decreased rates of infection may be throwing the relative “chi” of the immune system out of whack, allowing the allergic inflammation arm to become overactive. It’s beyond vaccines. Think about the last time you were able to be out of reach of a bottle of gel-based hand sanitiser. Our society is just a lot “cleaner,” meaning that the balance of the systems is tilted, and this may contribute to the immune system flagging things, such as foods, as “danger” in some individuals and mounting an immune response.
Two other theories merit attention as well. One relates to vitamin D. Many Americans are vitamin D deficient despite ample sources of vitamin D—dairy is perhaps the most well-recognised, as is skin exposure to the sun. Vitamin D is an important immune regulatory marker, and a lack of vitamin D may be associated with the development of allergic disease, including food allergy. A few years ago, an astute researcher at Harvard noticed an inverse relationship between sunlight exposure and epinephrine auto-injector prescriptions, meaning that in areas with less sunlight (and presumably less vitamin D) there were higher rates of epinephrine device prescriptions (which may indicate higher rates of allergy).

The second theory has to do with the timing of introduction of certain foods that are possible high-risk allergens. Years ago, based on the data available at the time, we advised parents of children with some risk of developing a food allergy (parental history of allergic disease) to intentionally avoid foods like peanut, tree nut, and seafood until the child was about three years old, to protect the developing immune system from exposure that could possibly trigger allergy in some individuals. As it turns out, the opposite was the better choice. In the ensuing years, multiple studies showed that early and deliberate exposure in at-risk infants was actually the protective action, and not deliberate delay. This protection has been shown in five very recently published randomised, controlled studies of early egg introduction and one study of early peanut introduction, and the National Institute of Allergy and Infectious Diseases now recommends introducing peanut into the diets of infants as early as 4-6 months of life.
So, while we don’t know exactly why the rates of food allergy have skyrocketed in the past 20 years, we do have a few theories as to what may be behind the rise. More research is needed to pin down specific causes and effects.
Neeta Ogden, M.D.
Spokesperson, American College of Allergy Asthma and Immunology
Allergies stem from an overactive immune system that perceives certain allergens (foods, pollens) as foreign invaders. Today allergies tend to be multifactorial. We know there is definitely a genetic component. There is a medical term, the atopic triad, covering the allergic conditions of eczema, asthma, and hay fever; food allergy is closely associated with all three. If a parent or sibling has even one of these, their child is at risk of developing allergy at some point in life. Typically these conditions run together—you often see them all in a single patient. Their natural progression, starting in childhood with eczema and moving on to food allergy and later allergic rhinitis, is called the atopic march. These conditions may also wax and wane throughout a person’s life. Environmental causes also interact with genes, affecting the natural course of allergic disease.
There are many children and adults who develop allergies without any family history. Our understanding of allergy is still evolving, but it is likely that this, too, is due to an interaction of genetics and environment. Studies have shown that avoiding certain environmental triggers—like highly processed foods and smoking, for example—during pregnancy and infancy can decrease the risk of allergies and asthma later in life. Other factors, such as vitamin D, dietary omega-3s, exposure to animals, and an individual’s microbiome, can be protective against allergies.
Animals do develop allergies, in particular due to the intense spring pollen seasons of the last five years. Climate change is another environmental trigger: warmer temperatures and higher CO2 levels create a ripe environment for soaring pollen counts. This has led to a marked increase in new-onset seasonal allergies, especially in adults, and similarly leads to typical allergy symptoms in pets.
The evolution of allergies still needs to be elucidated. Like many medical conditions, there was less detection in the past. Doctors, parents, and patients are more aware of allergies than ever, and new, more advanced diagnostic modalities make them more likely to be diagnosed.