Vaccination Nation

 

“What made eradication possible was a really good vaccine and political support. There was a real incentive to do it. You don’t ask a cow if it wants to be vaccinated. You just do it.”
— Ron DeHaven, former CEO of the American Veterinary Medical Association, speaking about the eradication of rinderpest, a cattle disease related to the measles virus.

Rinderpest and smallpox are the only two infectious diseases that have been eradicated around the world. Smallpox is the only eradicated disease that infects only people. Eradication of other infectious diseases, COVID-19 among them, is unlikely because they have alternate hosts in the animal population, and while it may be feasible to vaccinate domesticated animals such as cows to the point of herd immunity, it is unrealistic to think the same can be done for wild animals.


Ruins of the Smallpox Hospital built in the 19th century on Roosevelt Island in New York City. Photo by Flickr user Adam Jones.

Cows have another favorable trait for reaching herd immunity besides being readily available for their shots: they don’t subscribe to bizarre, illogical, and unscientific conspiracy theories egging them on to refuse vaccinations, even if refusing were a possibility for them. Rinderpest, like its cousin that infects humans, the measles virus, is among the most contagious diseases on the planet, and the more contagious a disease is, the higher the percentage of a susceptible population that must be vaccinated in order to achieve herd immunity. For measles and rinderpest, that’s over 90 percent.
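For readers who like to see the arithmetic behind that claim, the usual rule of thumb ties the herd immunity threshold to a disease’s basic reproduction number, R0, as 1 − 1/R0. The sketch below uses rough, commonly cited R0 estimates; the exact figures are assumptions chosen only for illustration.

```python
# A minimal sketch of the relationship described above: the more contagious a
# disease (the higher its basic reproduction number R0), the higher the herd
# immunity threshold, approximated as 1 - 1/R0. The R0 values below are rough,
# commonly cited estimates, used here only for illustration.

r0_estimates = {
    "measles (and its cousin rinderpest)": 15,  # often cited in the 12-18 range
    "smallpox": 6,                              # often cited in the 5-7 range
    "seasonal influenza": 1.3,                  # often cited in the 1-2 range
}

for disease, r0 in r0_estimates.items():
    threshold = 1 - 1 / r0
    print(f"{disease}: R0 ~ {r0}, herd immunity threshold ~ {threshold:.0%}")

# Prints roughly: measles ~93%, smallpox ~83%, seasonal influenza ~23%
```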

Smallpox is – was – in the middle of the scale for contagiousness, but among the deadliest, killing around 30 percent of those infected. Influenza, with notable exceptions throughout history such as the 1918-19 Spanish Flu outbreak, is at the lower end of the scale for both contagiousness and deadliness. COVID-19, caused by a virus related to the original Severe Acute Respiratory Syndrome (SARS) virus, is higher up the scale for contagiousness than the annual flu, but it is nowhere near as deadly as smallpox, though deadliness is always strongly affected by a victim’s socioeconomic circumstances. The poor, in any affliction, die in droves, while the better off have access to the best care and are less likely to be infected in the first place.

One by one through the nineteenth and twentieth centuries, diseases that had killed hundreds of millions over thousands of years were brought under control with vaccines and other public health measures, such as better sanitation. There have always been people skeptical of the effectiveness of vaccines or suspicious of the motives of the medical people, often affiliated in one way or another with a government entity, who administered them. The difference between then and now is that before about 1980 evidence of a world without vaccines was still readily available to everyone, rich and poor, living in the industrialized northern hemisphere or in the largely agricultural southern hemisphere.



In 1947, when the threat of disfigurement or death from smallpox was still very real to everyone, the citizens of New York City lined up for blocks to receive vaccinations in order to stem a possible outbreak.

 

Today, people in richer countries no longer see the effects of smallpox at all, and rarely do they see the effects of less disfiguring, less deadly diseases like measles. If COVID-19 were to leave visible scars on those who suffered and survived, instead of just the internal scars it does leave, one wonders if at least some of the people ready to dismiss the seriousness of the disease and the severity of the outbreak would be as obstinate in refusing to comply with public health measures.

If there were still children crippled by polio in every neighborhood, would there still be people more willing to believe an insane theory about vaccines they read in their Facebook “news” feeds than the scientific fact of once rampaging infections brought to heel over the past two hundred years? No doubt there will always be some hard cases who can’t be reached through reason, no matter what. The share of the U.S. population vaccinated against COVID-19 is currently about 43 percent, and it needs to be over 70 percent to reach herd immunity. It will be best to cross that threshold before cold weather sets in again, forcing people back indoors. If it’s not, then COVID-19, a disease that will likely never be eradicated, only controlled, could surge once more, making this summer of relative freedom appear in retrospect like a fool’s paradise.
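As a back-of-the-envelope check on what that gap means in raw numbers, here is a minimal sketch assuming a U.S. population of roughly 330 million; the population figure is an approximation used only for illustration.

```python
# Back-of-the-envelope arithmetic for the vaccination gap mentioned above.
# The population figure is an assumption for illustration, not an official count.

us_population = 330_000_000   # assumed, roughly the 2020 census total
current_coverage = 0.43       # share vaccinated, per the text
target_coverage = 0.70        # herd immunity threshold cited in the text

additional_people = (target_coverage - current_coverage) * us_population
print(f"About {additional_people / 1e6:.0f} million more people would need to be vaccinated.")
# Prints: About 89 million more people would need to be vaccinated.
```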
— Vita

 

Equal Application of the Law

 

“And when you pray, do not be like the hypocrites, for they love to pray standing in the synagogues and on the street corners to be seen by others. Truly I tell you, they have received their reward in full.”
Matthew 6:5, from the New International Version of the Bible.

When state and local governments include churches, mosques, and synagogues in their lockdown orders due to coronavirus, it might at first glance seem to be an infringement on religious freedom, but such is not the case. It would be an infringement if government singled out particular institutions which were in almost every way like other institutions except for their religious character. In this public health emergency, however, the only concern government officials have with religious institutions is the one characteristic they share with some other institutions, which is how they typically gather together large groups of people, a characteristic more conducive to spreading coronavirus than to tamping it down.


Congregating for the purpose of religious worship is no more under attack in these coronavirus lockdown orders than assembly for the political purpose of voting. This hasn’t stopped some religious leaders from loudly claiming they and their congregants are being persecuted by government in general and by the Democratic Party in particular. It hasn’t taken long for the coronavirus to become politically as polarized as everything else in our society. The virus itself has not expressed a political preference and, like past viruses, attacks everyone equally.

No one is denying religious freedom to churchgoers, only the freedom to go to church in large numbers at one time. Congregating has always been an important element of religious ritual for many people in many religions, but a public health emergency supersedes the wish of some to carry on as always at the expense and to the detriment of the many. People can still pray, and in most places they can still gather to pray in groups of fewer than ten or thereabouts.


Replica of Jesus Christ’s tomb at Easter 2017 in the church of Saint-Paul-Saint-Louis, in Paris, France. Photo by Tangopaso.

Some pastors don’t see it that way. They are pastors of Southern Baptist churches, by and large, and they are led by people like Jerry Falwell Jr., the president of Liberty University in Lynchburg, Virginia, both in their right wing political views and in their gullible belief in hoaxes supposedly concocted by their devilish foes in the center and left of American politics. For these people, churchgoing is perhaps even more a social bond than it is a religious experience. They go to see and be seen.

Church is also a place where they reaffirm to each other their political bond, which is conservative at least, and right wing more often with each passing year. Taking away their church gatherings of dozens or hundreds of people in close proximity to each other is seen by them as prying apart the social and political bonds which are more important to them than the religious bonds affirmed in regular churchgoing. Their pastors can grandstand about supposed government and leftist persecution of their religious institutions, but their real worry is loosening the social and political bonds cemented regularly in seeing and being seen by their fellow congregants.
— Vita

 

Everyone Has an Opinion

 

The measles outbreak in Clark County, in the southwestern corner of the state of Washington, across the Columbia River from Portland, Oregon, has brought national attention to the beliefs of people who do not get some or all vaccinations for one reason or another, because vaccination rates in Clark County are far below the national average. The term many have come to apply to these people is “anti-vaxxer”, though it unfairly lumps everyone together, including people who are less against vaccines than they are for personal liberty, or who object on religious grounds. Since vaccination is a public health issue, however, the reasons for not getting vaccinated do not matter as much as the effects.

 

The history of the differing reasons for vaccine opposition goes back to the introduction of the smallpox vaccine, primarily by Edward Jenner, in the late eighteenth and early nineteenth centuries in England. The idea that to combat a disease a person should voluntarily introduce a weakened form of it into his or her body ran counter to reason. Vaccination methods of the time were far cruder than today, and since sterilization of wounds and bandages was little understood, infection often followed vaccination. The alternative was death or disfigurement from a full-force smallpox infection, and some religious folks actually expressed a preference for that because it was “God’s will.”

In Merawi, Ethiopia, a mother holds her nine month old child in preparation for a measles vaccination. One in ten children across Ethiopia do not live to see their fifth birthday, with many dying of preventable diseases like measles, pneumonia, malaria, and diarrhea. British aid has helped double immunization rates across Ethiopia in recent years by funding medicines, equipment, and training for doctors and nurses. Photo by Pete Lewis for the United Kingdom Department for International Development (DFID).

World Health Organization (WHO) 2012 estimated deaths due to measles per million persons, with bright yellow at 0, dark red at 74-850, and shades of ocher from light to dark ranging from 1-73. Gray areas indicate statistics not available. Map by Chris55.

Those who didn’t object to vaccination on grounds of cutting into a healthy body and introducing a light case of the disease or a bad case of infection, or of meddling in God’s will, objected to the perceived unnaturalness of the procedure, since the vaccine ultimately came from cows infected with cowpox. To those people, introduction into the human body of something from an animal was unwholesome, even dangerous. Never mind that people do the same thing all the time when they eat meat, presumably from animals and not from other people, without the ill effects these folks foresaw, such as taking on the traits of the animal whose parts were introduced directly into human flesh. On the other hand, perhaps they were taking the dictum “you are what you eat” to a logical extreme somehow unimpeded by the process of digestion.


It is probably best not to overload these viewpoints with the rigors of logic. People have their opinions, and they often do not bother to make the distinction between opinions and facts. The fact is that through vaccination programs, smallpox has been eradicated worldwide – the World Health Organization declared it eradicated in 1980, nearly two centuries after introduction of the vaccine. Similarly, measles in the United States disappeared around the turn of this century after nearly 40 years of vaccinations. About the time measles was going away in this country, in 1998 a doctor in England, Andrew Wakefield, published a report in the British medical journal The Lancet linking the MMR (Measles, Mumps, and Rubella) vaccine to autism and bowel disorders, and though the report was eventually retracted and Dr. Wakefield himself was repudiated by the majority of other medical professionals, some anti-vaxxers latched onto the link with autism and have been running with it ever since, regardless of the lack of evidence to support it.

 

Centers for Disease Control and Prevention (CDC) statistics on U.S. measles cases (not deaths). Chart by 2over0.

The problem with anti-vaxxers of one stripe or another running with opinions mistaken for their own version of the facts is that vaccinations are needed most by vulnerable populations such as the very young, the very old, and people with suppressed immune systems. Infants under about a year old are too young to be vaccinated against measles. Many of these vulnerable people are in the position of having decisions made for them by responsible adults. In the case of children, that would be their parents, who of course have the best interest of their children at heart. The difficult point to get across to those parents is that in a public health issue involving communicable diseases, their decision not to vaccinate their children affects not only their children, but those other most vulnerable members of the greater society as well. Public health is a commons, shared by all, like clean water and clean air, and the tragedy of the commons is that a relatively few people making selfish decisions based on ill-informed opinions can have a ripple effect on everyone else. Personal liberty is a fine and noble ideal, but when it leads to poisoning of the commons then quarantine is the only option, either self-imposed or involuntary.
— Vita

 

Too Hot for School

 

There never was any truth to the notion that schools closed in the summer so that farm children could help out with chores at home. The real reason had to do with urban schools having low attendance in the summer, and with teachers and administrators wanting a summer break to escape city heat in the days before air conditioning, as well as to use the extended break to pursue avocations or take temporary jobs. Farm children were needed at home in the spring for planting, and then again in the fall for the harvest. While it’s true farm work never slacks off entirely, particularly when animal husbandry is involved, there were still lulls in the summer and in the winter when children could attend school. Through most of the nineteenth century, a short school year was sufficient for farm children who had no ambitions in learning beyond the sixth or eighth grade. Farm children who had greater ambitions supplemented their learning on their own when they could, much as Abraham Lincoln did in teaching himself to become a lawyer.

 

The modern summer break came about instead from the needs of urban school administrators and the upper and middle economic class students and their families who supported many of the schools. The needs of poor students and their families, as always, hardly entered into the concerns of the rest of society. Before school attendance became compulsory in the late nineteenth century, urban schools were open year round, but often were only half full, and even less than that in the summer. School administrators eventually came around to following the model of colleges by closing for the summer so that students and teachers could pursue other interests outside the baking cities, leaving behind only enough staff to help students who needed to take extra courses of learning during the break. Public health officials added their approval to emptying out the schools in summer because they deemed the hothouse conditions unhealthful in general, and not conducive to learning in particular. By the early twentieth century school administrators had generally adopted the summer break, which started in late May or early June, and ended in late August or early September.

A 1940 Works Progress Administration (WPA) poster promoting reading and library use upon returning to school in September after the summer break.

The system appeared to work well for most of the twentieth century. Rural schools synchronized their schedules with those of their urban counterparts so as not to be left behind as it became increasingly clear a high school diploma was the minimum academic achievement necessary in modern society. The tourism industry could count on a steady source of both customers and labor during the two to three month summer break. The American public school system ranked highly among the school systems of other industrialized nations, even with its extended summer break. Then in the late twentieth century alarm bells started sounding about the supposed failings of that highly successful public school system, the details of which are beyond the scope of this article, and so in an effort to increase academic rigor, or at least appear to do so, school boards have been eroding the summer break, largely on the back end.

A 1913 “Back to School” cartoon by Bob Satterfield (1875-1958) that captures how most children have always viewed the occasion.

 

In many school districts, fall semester classes now start in the first weeks of August. School may have ended in mid-June, leaving less than two months for the summer break. And yet still academic achievement appears to be falling, at least among the middle and lower economic classes. That also is another article for another day. For today it is sufficient to point out that the public school system does not exist in isolation from the greater society, and lackluster academic achievement by the students cannot be remedied merely by making them sit at their desks for more days every year.

The problem is in quality, not quantity. The society as a whole is fracturing, and the problems with poor learning begin and end in the home. The long summer break enacted by the twentieth century public school system was an excellent compromise that worked well for nearly everyone except families in which both spouses worked outside the home. That presents a difficulty today, too, but the answer is not in charging the public schools with child daycare duties and calling that increased academic rigor. August is too hot for school, in air conditioned facilities or not. August is for causing students anxiety about the imminence of schools reopening when they start seeing “Back to School” sale advertisements, which now also draw the attention of their teachers, who too often feel pressed to use their own money to buy supplies for their students. July is too early for a return of that unique schooldays anxiety, especially when schools closed only a few weeks before, in June.
— Vita

 

Change at the Grass Roots

 

It may seem like hyperbole to compare growing a lawn with smoking (not combining the two, as in smoking grass), but when weighing the environmental and health effects of both rather useless activities, they may not be all that dissimilar. A lawn is purely ornamental and serves no practical purpose when it is not used as pasture for grazing animals. Deer may come out of the woods to clip parts of a suburban lawn, but for the most part keeping a lawn within the height limits deemed proper by neighbors is left up to the homeowner. Anything higher than about six inches meets with disapproval from neighbors and, where homeowners association rules apply, may merit a written slap on the wrist.

 

There was a time not long ago when most people smoked, and smoked everywhere. Movies of contemporary stories from the 1940s and 1950s showed actors portraying their characters as human chimneys. Few people thought much of it up until 1964, when the Surgeon General issued a report on the dangers of smoking. Even then, it took another generation for the momentum of social disapproval of smoking to build to a tipping point, largely because of the obstructive practices of the tobacco industry. In the matter of lawn growing, the balance is now tipped in favor of the people who dump fertilizers and broad leaf herbicides on their lawns to achieve an ideal of carpeted green perfection, and then burn up fossil fuels in order to keep that exuberant growth clipped to a manicured standard.

Sheep, goats, and a shepherd near Lake Vistonida in Thrace, Greece. Photo by Ggia.

Grass, with buttercups. Photo by Steffen Flor.

Given the information available about the toxic effects of fertilizer and herbicide runoff, and the deleterious effects on the climate of continued burning of fossil fuels, it seems insane to idealize the perfect lawn and what it can take to achieve perfection. Yet as things stand now, the people with model lawns are the ones who look down on everyone else and appoint themselves as standard bearers. Perhaps if more people understood the destructive effects to their own health and to the environment of all their fussing over lawns, then the balance would start to tip the other way toward saner practices.

When homeowners apply fertilizers and herbicides to their lawns, there is no obvious puff of smoke to notify everyone else of the activity. It is not as obvious as smoking, and therefore general social disapproval will take a long time to build, and may never build to a tipping point the way it did with smoking. Education will probably be the main factor in changing people’s behavior. There are state laws which require commercial herbicide or pesticide applicators to post signs on lawns they have treated. Those are the four-inch cards on sticks stuck into lawns, and to the extent that passersby and neighbors give them any attention, they can easily mistake them for advertisements for the lawn care company.

The opening scene of Blue Velvet, a darkly satirical 1986 film directed by David Lynch. Besides demanding large amounts of fertilizers and herbicides to look their best, lawns gulp huge amounts of water in order to stay green throughout the warmest months.

Most people are away at work when lawn care companies do their treatments, and so they aren’t around to catch a whiff of the cabbage smell of the typical broad leaf herbicide as it drifts around the neighborhood. And of course, the homeowner who does his or her own applications, usually on the weekends when neighbors are also home, does not bother with any formal notifications at all. A neighbor might ask such a homeowner “What’s that smell?” To which the enterprising amateur lawn care enthusiast might reply, without apparent knowledge of or concern about the collateral damage of his or her efforts, “That’s the smell of the green, green grass of home!”
— Izzy

 

Fahrenheit 161

 

There are several time and temperature combinations for pasteurizing milk, but one of the most common involves heating it to 161 degrees Fahrenheit for 15 seconds, known as High Temperature Short Time (HTST). The milk still needs refrigeration afterward to slow the growth of microorganisms that may remain in it, since pasteurization kills most of them but does not completely eliminate them from the milk. In Europe, milk is most often treated with Ultra High Temperature (UHT) at 275 to 302 degrees Fahrenheit for 4 to 15 seconds, making it aseptic and capable of being stored at room temperature for up to six months. Both processes have grown out of public health measures which have transformed food safety over the past 150 years, a period when such oversight was especially needed as increasing urbanization meant fewer people retained direct connections to the production of their food.
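For readers who think in Celsius, the conversion is C = (F − 32) × 5/9. The short sketch below converts the pasteurization figures mentioned above; it is purely illustrative and not part of any dairy standard.

```python
# Convert the pasteurization temperatures mentioned above from Fahrenheit to Celsius,
# using the standard formula C = (F - 32) * 5/9. Purely illustrative.

def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

for label, temp_f in [("HTST", 161), ("UHT, low end", 275), ("UHT, high end", 302)]:
    print(f"{label}: {temp_f} F is about {fahrenheit_to_celsius(temp_f):.0f} C")

# Prints roughly: HTST 161 F ~ 72 C; UHT 275-302 F ~ 135-150 C
```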

 

Visit to the Farm, painting by Emile Charles Dameron (1848-1908).

The body temperature of a cow is 101.5 degrees Fahrenheit. That is well below the 145 degrees Fahrenheit which is the absolute minimum for any kind of effective pasteurization. Drinking raw milk can be safe for only a short time before any pathogens present in it start to proliferate. Calves have understood this for millennia, which is why they have never bothered with any storage measures. They drink mama’s milk straight from her udder, and that’s been good enough for them. Things are different and more complicated with humans, as they always are. To begin with, it’s strange for one species to be drinking the milk of another at all. Be that as it may, people have decided they enjoy drinking cow’s milk, and apparently have done so for millennia, though not as long as the calves the milk was meant to succor.

In the ensuing thousands of years, and especially in the past 150, people have moved off farms and into cities in such great numbers that the majority of them now do not come any closer to cows than a hundred yards or more on a drive through the countryside. This means they have little idea of the conditions those cows live under and are milked under, and also rarely have the opportunity to drink that milk as it was meant to be drunk, at 101.5 degrees Fahrenheit within seconds of milking. So to get around all that we cook the milk. Raw food can be a great, healthy addition to anyone’s diet, but for the sake of safety we cook a lot of our food, and particularly food we haven’t grown or raised ourselves and aren’t prepared to consume immediately. Cooked food must be an acquired taste, because obviously in nature food is seldom cooked. Our ancestors must have discovered good reasons to cook food, probably through much trial and error involving unfortunate intestinal distress or even death. Louis Pasteur observed under the microscope the reasons for our ancestors’ common sense use of fire to cook food. Cooking food may not be the most natural thing, but it’s better than guessing and throwing the dice. Fire is good.
— Techly

A scene from Mel Brooks’s 1974 film Young Frankenstein, with Peter Boyle as The Monster, and Gene Hackman as Blindman.