Everyone Has an Opinion

 

The measles outbreak in Clark County, in the southwestern corner of Washington State across the Columbia River from Portland, Oregon, has brought national attention to people who skip some or all vaccinations for one reason or another, because vaccination rates in Clark County are far below the national average. The label many have come to apply to these people is “anti-vaxxer,” though it unfairly lumps everyone together, including people who are less against vaccines than they are for personal liberty, or who object on religious grounds. Since vaccination is a public health issue, however, the reasons for not getting vaccinated matter less than the effects.

 

The history of the differing reasons for vaccine opposition goes back to the introduction of the smallpox vaccine, primarily by Edward Jenner, in England in the late eighteenth and early nineteenth centuries. The idea that to combat a disease a person should voluntarily introduce a weakened form of it into his or her body ran counter to intuition. Vaccination methods of the time were far cruder than today’s, and since sterilization of wounds and bandages was little understood, infection often followed vaccination. The alternative was death or disfigurement from a full-blown smallpox infection, and some religious folks actually expressed a preference for that outcome because it was “God’s will.”

Bracing for a short, sharp jab
In Merawi, Ethiopia, a mother holds her nine-month-old child in preparation for a measles vaccination. One in ten children across Ethiopia does not live to see a fifth birthday, with many dying of preventable diseases like measles, pneumonia, malaria, and diarrhea. British aid has helped double immunization rates across Ethiopia in recent years by funding medicines, equipment, and training for doctors and nurses. Photo by Pete Lewis for the United Kingdom Department for International Development (DFID).

World Health Organization (WHO) 2012 estimated deaths due to measles per million persons, with bright yellow at 0, dark red at 74-850, and shades of ocher from light to dark ranging from 1-73. Gray areas indicate statistics not available. Map by Chris55.

Those who didn’t object to vaccination on the grounds of cutting into a healthy body and introducing a light case of the disease (or a bad case of infection), or of meddling in God’s will, objected to the perceived unnaturalness of the procedure, since the vaccine ultimately came from cows infected with cowpox. To those people, introducing something from an animal into the human body was unwholesome, even dangerous. Never mind that people do much the same thing all the time when they eat meat, presumably from animals and not from other people, without the ill effects these folks foresaw, such as taking on the traits of the animal whose parts were introduced directly into human flesh. On the other hand, perhaps they were taking the dictum “you are what you eat” to a logical extreme somehow unimpeded by the process of digestion.


It is probably best not to overload these viewpoints with the rigors of logic. People have their opinions, and they often do not bother to distinguish between opinions and facts. The fact is that through vaccination programs, smallpox was declared eradicated worldwide in 1980, roughly 180 years after the introduction of the vaccine. Similarly, measles was declared eliminated in the United States around the turn of this century, after nearly 40 years of vaccinations. About the time measles was going away in this country, in 1998 a doctor in England, Andrew Wakefield, published a report in the medical journal The Lancet linking the MMR (Measles, Mumps, and Rubella) vaccine to autism and bowel disorders. Though the report was soon repudiated by the great majority of other medical professionals and eventually retracted by The Lancet, some anti-vaxxers latched onto the supposed link with autism and have been running with it ever since, despite the lack of evidence to support it.

 

Centers for Disease Control and Prevention (CDC) statistics on U.S. measles cases (not deaths). Chart by 2over0.

The problem with anti-vaxxers of one stripe or another mistaking their opinions for facts is that vaccinations are needed most by vulnerable populations such as the very young, the very old, and people with suppressed immune systems. Infants under a year old are generally too young to be vaccinated against measles at all. Many of these vulnerable people are in the position of having decisions made for them by responsible adults. In the case of children, that would be their parents, who of course have the best interests of their children at heart. The difficult point to get across to those parents is that in a public health issue involving communicable diseases, their decision not to vaccinate their children affects not only their children, but the other most vulnerable members of the greater society as well. Public health is a commons, shared by all, like clean water and clean air, and the tragedy of the commons is that a relatively few people making selfish decisions based on ill-informed opinions can have a ripple effect on everyone else. Personal liberty is a fine and noble ideal, but when it leads to poisoning of the commons, then quarantine, either self-imposed or involuntary, is the only option.
— Vita

 


Too Hot for School

 

There never was any truth to the notion that schools closed in the summer so that farm children could help out with chores at home. The real reason had to do with urban schools having low attendance in the summer, and with teachers and administrators wanting a summer break to escape city heat in the days before air conditioning, as well as to pursue avocations or take temporary jobs during the extended break. Farm children were needed at home in the spring for planting, and again in the fall for the harvest. While it’s true farm work never slacks off entirely, particularly when animal husbandry is involved, there were still lulls in the summer and in the winter when children could attend school. Through most of the nineteenth century, a short school year was sufficient for farm children who had no ambitions in learning beyond the sixth or eighth grade. Farm children with greater ambitions supplemented their learning on their own when they could, much as Abraham Lincoln did in teaching himself to become a lawyer.

 

The modern summer break came about instead from the needs of urban school administrators and of the upper and middle class students and families who supported many of the schools. The needs of poor students and their families, as always, hardly entered into the concerns of the rest of society. Before school attendance became compulsory in the late nineteenth century, urban schools were open year round, but often were only half full, and even less than that in the summer. School administrators eventually came around to following the model of colleges by closing for the summer so that students and teachers could pursue other interests outside the baking cities, leaving behind only enough staff to help students who needed extra coursework during the break. Public health officials added their approval to emptying out the schools in summer because they deemed the hothouse conditions unhealthful in general, and not conducive to learning in particular. By the early twentieth century school administrators had generally adopted the summer break, which started in late May or early June and ended in late August or early September.

A 1940 Works Progress Administration (WPA) poster promoting reading and library use upon returning to school in September after the summer break.

The system appeared to work well for most of the twentieth century. Rural schools synchronized their schedules with those of their urban counterparts so as not to be left behind as it became increasingly clear that a high school diploma was the minimum academic achievement necessary in modern society. The tourism industry could count on a steady source of both customers and labor during the two to three month summer break. The American public school system ranked highly among those of other industrialized nations, even with its extended summer break. Then in the late twentieth century alarm bells started sounding about the supposed failings of that highly successful public school system, the details of which are beyond the scope of this article, and so in an effort to increase academic rigor, or at least appear to do so, school boards have been eroding the summer break, largely on the back end.

A 1913 “Back to School” cartoon by Bob Satterfield (1875-1958) that captures how most children have always viewed the occasion.

 

In many school districts, fall semester classes now start in the first weeks of August. School may have ended only in mid-June, leaving less than two months for the summer break. And yet academic achievement still appears to be falling, at least among the middle and lower economic classes. That, too, is another article for another day. For today it is sufficient to point out that the public school system does not exist in isolation from the greater society, and lackluster academic achievement cannot be remedied merely by making students sit at their desks for more days every year.

The problem is one of quality, not quantity. Society as a whole is fracturing, and the problems with poor learning begin and end in the home. The long summer break adopted by the twentieth century public school system was an excellent compromise that worked well for nearly everyone except families in which both spouses worked outside the home. That presents a difficulty today, too, but the answer is not in charging the public schools with child daycare duties and calling that increased academic rigor. It’s not. August is too hot for school, air conditioned facilities or not. August is for causing students anxiety about the imminence of schools reopening when they start seeing “Back to School” sale advertisements, which now also draw the attention of their teachers, who too often feel pressed to use their own money to buy supplies for their students. July is too early for a return of that unique schooldays anxiety, especially when schools closed only a few weeks before, in June.
— Vita

 


Change at the Grass Roots

 

It may seem like hyperbole to compare growing a lawn with smoking (not combining the two, as in smoking grass), but when weighing the environmental and health effects of these two rather useless activities, they may not be all that dissimilar. A lawn is purely ornamental and serves no practical purpose when it is not used as pasture for grazing animals. Deer may come out of the woods to clip parts of a suburban lawn, but for the most part keeping a lawn within the height limits deemed proper by neighbors is left up to the homeowner. Anything higher than about six inches meets with disapproval from the neighbors and, where homeowners association rules apply, may merit a written slap on the wrist.

 

There was a time not long ago when most people smoked, and smoked everywhere. Movies set in their own era during the 1940s and 1950s showed actors portraying their characters as human chimneys. Few people thought much of it until 1964, when the Surgeon General issued a report on the dangers of smoking. Even then, it took another generation for the momentum of social disapproval of smoking to build to a tipping point, largely because of the obstructive practices of the tobacco industry. In the matter of lawn growing, the balance is still tipped in favor of the people who dump fertilizers and broad leaf herbicides on their lawns to achieve an ideal of carpeted green perfection, and then burn up fossil fuels in order to keep that exuberant growth clipped to a manicured standard.

Sheep, goats, and a shepherd near Lake Vistonida in Thrace, Greece. Photo by Ggia.

Grass, with buttercups. Photo by Steffen Flor.

Given the information available about the toxic effects of fertilizer and herbicide runoff, and the deleterious effects on the climate of continued burning of fossil fuels, it seems insane to idealize the perfect lawn and what it can take to achieve that perfection. Yet as things stand now, the people with model lawns are the ones who look down on everyone else and appoint themselves as standard bearers. Perhaps if more people understood the destructive effects of all their lawn fussing on their own health and on the environment, the balance would start to tip the other way, toward saner practices.

When homeowners apply fertilizers and herbicides to their lawns, there is no telltale puff of smoke to notify everyone else of the activity. The practice is not as visible as smoking, and therefore general social disapproval will take a long time to build, and may never reach the tipping point it did with smoking. Education will probably be the main factor in changing people’s behavior. There are state laws requiring commercial herbicide and pesticide applicators to post signs on lawns they have treated. Those are the four-inch cards on sticks stuck into lawns, and to the extent that passersby and neighbors give them any attention at all, they can easily mistake them for advertisements for the lawn care company.

The opening scene of Blue Velvet, a darkly satirical 1986 film directed by David Lynch. Besides demanding large amounts of fertilizers and herbicides to look their best, lawns gulp huge amounts of water in order to stay green throughout the warmest months.

Most people are away at work when lawn care companies do their treatments, and so they aren’t around to catch a whiff of the cabbage smell of the typical broad leaf herbicide as it drifts around the neighborhood. And of course, the homeowner who does his or her own applications, usually on the weekends when neighbors are also home, does not bother with any formal notifications at all. A neighbor might ask such a homeowner “What’s that smell?” To which the enterprising amateur lawn care enthusiast might reply, without apparent knowledge of or concern about the collateral damage of his or her efforts, “That’s the smell of the green, green grass of home!”
— Izzy

 


Fahrenheit 161

 

There are several time and temperature combinations for pasteurizing milk, but one of the most common involves heating it to 161 degrees Fahrenheit for 15 seconds, known as High Temperature Short Time (HTST). The milk still needs refrigeration afterward to slow the growth of microorganisms that may remain in it, since pasteurization kills most of them but does not completely eliminate them from the milk. In Europe, milk is most often treated with Ultra High Temperature (UHT) at 275 to 302 degrees Fahrenheit for 4 to 15 seconds, making it aseptic and capable of being stored at room temperature for up to six months. Both processes have grown out of public health measures which have transformed food safety over the past 150 years, a period when such oversight was especially needed as increasing urbanization meant fewer people retained direct connections to the production of their food.
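For readers who think in Celsius rather than Fahrenheit, the time and temperature combinations above convert with simple arithmetic. A quick sketch (the process labels are just shorthand for the treatments named above):

```python
def f_to_c(deg_f):
    """Convert a temperature from degrees Fahrenheit to Celsius."""
    return (deg_f - 32) * 5 / 9

# Time and temperature combinations mentioned above
processes = [
    ("HTST pasteurization", 161),  # held for 15 seconds
    ("UHT, low end", 275),         # held for 4 to 15 seconds
    ("UHT, high end", 302),
]

for name, deg_f in processes:
    print(f"{name}: {deg_f} deg F = {f_to_c(deg_f):.1f} deg C")
```

So HTST works at about 72 degrees Celsius, and UHT at roughly 135 to 150 degrees Celsius.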

 

Visit to the Farm, painting by Emile Charles Dameron (1848-1908).

The body temperature of a cow is 101.5 degrees Fahrenheit, well below the 145 degrees Fahrenheit that is the absolute minimum for any kind of effective pasteurization. Raw milk is safe to drink for only a short time before any pathogens present in it start to proliferate. Calves have understood this for millennia, which is why they have never bothered with storage measures: they drink mama’s milk straight from her udder, and that has been good enough for them. Things are different and more complicated with humans, as they always are. To begin with, it’s strange for one species to be drinking the milk of another at all. Be that as it may, people have decided they enjoy drinking cow’s milk, and apparently have done so for millennia, though not for as long as the calves the milk was meant to nourish.

In the ensuing thousands of years, and especially in the past 150, people have moved off farms and into cities in such great numbers that most of them now come no closer to cows than a hundred yards or more on a drive through the countryside. This means they have little idea of the conditions those cows live and are milked under, and they rarely have the opportunity to drink milk as it was meant to be drunk, at 101.5 degrees Fahrenheit within seconds of milking. So to get around all that, we cook the milk. Raw food can be a great, healthy addition to anyone’s diet, but for the sake of safety we cook much of our food, particularly food we haven’t grown or raised ourselves and aren’t prepared to consume immediately. Cooked food must be an acquired taste, because in nature food is seldom cooked. Our ancestors must have discovered good reasons to cook it, probably through much trial and error involving unfortunate intestinal distress or even death. Louis Pasteur observed under the microscope the reasons behind our ancestors’ common sense use of fire to cook food. Cooking food may not be the most natural thing, but it’s better than guessing and rolling the dice. Fire is good.
— Techly

A scene from Mel Brooks’s 1974 film Young Frankenstein, with Peter Boyle as The Monster, and Gene Hackman as Blindman.

 
