Spit and Image

The wonderful world of mucus

Sally Black

As my one-year-old proudly wiped her own nose, the wonder and beauty of mucus impressed themselves upon me. Specifically, on my shirt. My daughter had only just begun to tackle the art of nose wiping. As with many human activities, practice lagged behind theory. Her pride in her new achievement was a joy to see. For the sticky disorder she created, there were baby wipes.

Over the past winter, it seemed not a day went by without at least one in the family being ill. The final leaves fell from the trees, the days grew shorter and darker, Christmas came and went, the days lengthened again, and still the illnesses raged. There was no pause. Normally two, sometimes three, but thankfully not all four of us suffered at any single point. “A mighty creature is the germ,” rhymed Ogden Nash, “though smaller than a pachyderm. . . . Do you, my poppet, feel infirm? You probably contain a germ.” We never met him, but he was surely writing about us.

Much in life takes one by surprise, and often being told about it beforehand makes little difference. Who can know merely by report the tenderness that comes from feeling your infant daughter snuffle and settle for comfort with her cheek against your neck? Is childhood not packed with miracles! Who but a parent feels the mixture of love and resignation when the nose of that nestling child drips viscous snot inside your collar? (Who, for that matter, would think it possible for a diaper to declare itself needing to be changed not by a smell or a cry, not by an unexpected ooze around a waist or foot, but, somehow, by diarrhea dribbling out of a child’s sleeve? A sleeve. Truly, the human body is amazing. But I digress.)

In an episode of the TV show The West Wing, the character Toby Ziegler reflects on becoming a father. He contemplates his newborn twins. “I didn’t realize,” he muses, “babies come with hats.” Well, I’m a physician. I may not have helped elect an idealized liberal leader of a fictional United States, but I have understood for many years that while newborns don’t arrive with hats, they quickly receive them. What I had failed to anticipate was the extent to which these newborns then develop into unstoppable engines of infection. They are tanks—I use the word in its military sense—of virulence. To live in a house with kids under five, in the midst of a cold, wet winter, is to experience the vast power of billions of years of evolution expressing itself in the complex medium of childhood illnesses. (And to live through an English winter is to live through the cold and wet. Some stereotypes are reliable.)

Wordsworth wrote fine poetry about the innocence and blessedness with which our lives start, but he left bits out. He could have written about intimations of morbidity from recollections of early childhood. He mentioned daffodils and cataracts and clouds, yet nowhere in his poetry do you find mucus. That’s an omission. From conception to birth, and forever afterward, mucus smooths our life. Digestion, sex, and breathing depend upon it. It seals the neck of the womb, protecting the fetus from infection. The expulsion of that seal, extruded in a lump or slowly as a gradual discharge, heralds our arrival. Not in utter nakedness but trailing clouds of mucus do we come.

Control of body fluids is a must for civilized adults. In an infant, much may be forgiven. (The National Archives)

The thought of all this is off-putting. It shouldn’t be. To shudder at the physical reality of life is characteristic of nervous adolescence. We learn as we mature that we are built from base materials but that our nature is not sullied as a result. When a young couple are awaiting their first child, the sight of the expectant mother’s mucus show is unlikely to fill them with a teenage cry of “Eew, yuck!” They are more likely to respond to the realization that the baby is coming with half-relieved anticipation and sheer blind panic. (“I am not,” I distinctly remember thinking at the time, “ready. How about waiting a while?” I may have accidentally spoken that out loud. I recall my wife’s response being blunt almost to the point of impoliteness. She was probably fretting about something.)

To step back a little for another example—nine months, say—consider the presence of another person’s saliva. That of a stranger on the rim of a glass you wish to drink from is horrible, but only an unfortunate few are struck by repulsion at the thought of kissing their lover. What we find wholesome depends not on its chemistry but on the qualities we see in it.

Something does not strike us as lovely simply because it is “natural,” although we often find ourselves thinking and saying so. Our attitude toward the unhindered processes of nature is ambivalent, as highlighted by our behavior on catching a cold. It takes no more than a runny nose to make us into Tour de France cyclists, enthusiastic users of drugs to improve our constitutions. Germs are so many and various, we think, so devious and prevalent, that pharmacology is morally reasonable, nothing more than a means, as Lance Armstrong might say, to level the playing field.

At the mildest sniffle we reach for medicines. We want a pill for our fever, another for our headache; we desire a cloud of chemicals to take away our coughs, make our breathing easy, and soothe our fretful spirits. If our hypocrisy runs deep, we assure ourselves that we remain devoted to what is natural by swallowing vitamins rather than acetaminophen, echinacea in place of cough syrup, and willow bark instead of aspirin.

When I contemplate such matters it is with a sense of conservatism, and, as is generally the case, my conservatism contains the potential for moral disapproval. I harbor no notion that natural is synonymous with good. My conservatism is based on the belief that being desperate to intervene and do something is a hazardous way to approach life, particularly in medicine. The human body is too intricate. Any intrusion on it is too fraught with unintended consequences. Our urge to act is poisoned by our failure to realize how poorly insight and intelligence are able to predict the reactions of complex systems. In the absence of hard evidence of benefit, my conservatism says, it is better to do nothing. Stand there, as the saying goes, don’t just do something.

A large amount of research has looked at how likely new medical therapies are to actually help. Phase 3 trials, which are usually randomized, double-blinded, and controlled, test treatments that are not just theoretically beneficial, but that also have proven their benefit in lab experiments, in animal trials, and in early (phase 1 and phase 2) studies of safety and efficacy in humans. Once they have leapt triumphantly over those early hurdles and embarked on their third phase clinical study—the key test of their value to the world—their chance of being shown to do more good than harm is fifty-fifty. (In an important sense, that is a curiously fortunate figure. If new treatments were on average likely to help, randomizing people to placebos or existing therapies would be ethically difficult, and our ability to justify these essential trials would crumble.)

Of course, one can be too conservative. In the nineteenth century, when anesthesia was first discovered, people thought it could be wrong. Some took this view on moral grounds. God had cursed Eve over that regrettable business with the apple, and the pain of childbirth was the result. Using anesthesia to get rid of pain was said to be giving in to “a decoy of Satan” that would “rob God of the deep, earnest cries which arise in time of trouble for help.” (That widely circulated quote is attributed to some unidentified member of the clergy.)

Others felt the pain of labor should not be meddled with on practical grounds. “There is, in natural labor, no element of disease,” argued a leading American obstetrician, saying that if a patient of his ever died from anesthesia, “I should feel disposed to clothe me in sackcloth, and cast ashes on my head for the remainder of my days.” (The same obstetrician, Charles Meigs (1792–1869), was also outraged that doctors should be made to wash their hands when attending a labor. That too was unnatural. “Doctors are gentlemen,” he famously said, “and a gentleman’s hands are clean.”) The idea that pain was natural and essential to health was not restricted to views of childbirth. It helps explain why, in the millennia before anesthesia was discovered, surgery was carried out without making people insensible through the use of opium or alcohol. There were other concerns too, but a conviction that pain was essential for proper healing was definitely present.

“The Lord gave us tears to shed,” said the physician and philosopher Maimonides more than 800 years ago, “do not try to stem their flow.” The same might be true for mucus. Mucus, to us, can occupy the moral position that pain did for our forebears: normal, natural, and possibly needed for recovery. Like a cough and a fever and pain, after all, mucus is not only the symptom of a disease but part of our body’s defense against it. Abolish those defenses and you may make people feel better at the cost of putting them in danger.

Or maybe not. We can come up with stories about why inhibiting mucus might be bad, and we can just as well come up with arguments about why it might help. These theoretical debates drove beliefs in medicine for most of history, which is why doctors used so many leeches. Arguments and theories are poor guides to truth. What we need is a reliable way of working out what happens to people when you inhibit their mucus production during periods of illness: we need an experiment.

Here my conservatism, which is otherwise based on good evidence, experiences a setback. A review of data from randomized clinical trials shows that brief and targeted inhibition of nasal mucus is, on average, a good thing. Depending on what combination of analgesic, decongestant, and antihistamine you use, the effects vary. Somewhere between four and seven people have to take the drugs for a single sufferer to feel any better. But the effects appear real. The natural course of events, when you get a cold, need not be accepted as the right and proper order of things. Left to itself, a cold manages to inflict slightly more suffering than if it is attacked pharmacologically. Interventions, carefully chosen and properly tested, appear to work.

Reconstruction of an apothecary’s laboratory (Wellcome Library, London)

Not that pharmacology helps me much as my family weathers its seasonal storm of mucus. To reduce the viral symptoms of my daughter and her five-year-old brother sounds like a pleasant prospect, but in trials only older children and adults seem to get benefits from the drugs. My inner medical conservative notices this with interest. It suggests that perhaps the positive data about decongestants are an artifact of poor trial design, the manifestation of uncontrolled placebo effects.

Young children do not respond to placebos; they have not learned to expect benefits. Adults are also more likely to be able to tell the difference between a decongestant and a sham. A bad taste in the mouth, an irritation in the nose, all these things can result in trial participants’ determining whether they have been allocated an active drug. A useless drug with mild but recognizable side effects might thereby falsely show an effect in adults, while its true pointlessness would be evident when given to children immune to placebos. As is proper, my inner conservative never sleeps.

Even should the evidence come to support drug use, getting a dose of medicine into my offspring would be grueling. For my son a straitjacket might be required, for my daughter, anesthesia. The effort might break me, worn down as I am by the effort involved in getting miserable and feverish children through the day and to sleep at night. Child rearing and mucus can seem to take more out of us than we get from them. But even in combination the two retain much of worth and wonder. I can even console myself with the suggestive but unproven idea that childhood infections reduce the chance of adult autoimmune diseases.

For the ancient Greeks, balance in four essential humors was a key to both health and temperament.

Hippocratic medicine held mucus, which the Greeks called phlegm, to be one of the four essential humors. Africans, the ancient Greeks thought, were dry, fiery, and bilious. They lacked phlegm, and the imbalance showed itself in their defective frame and their ill temper. Northerners, the non-Mediterranean Europeans, had too much phlegm, and their bodies, characters, and societies alike were damp and cold as a result. The healthy mind in the healthy body required balance, and only the Greeks—wrote the Greeks—had that.

Our response to illness is also a balancing act. Is it better to suffer stoically, to recognize that running noses are a normal part of life and particularly of childhood? Or is that too passive, too defeatist, too phlegmatic? Is it better to strive through technology and pharmacology to overcome those symptoms? Morality mixes with medicine when we reach for an answer.
