Hypocognition – a term from psychology and linguistics meaning the inability to discuss or process a concept because of the lack of a word or words for it.
“Phubbing” is a portmanteau made up of “phone” and “snubbing”, and it describes the act of looking at one’s phone (presumably a smartphone) in order to avoid interaction with another person. It’s nearly always a rude action, and it can be dismissive and disrespectful when the phubber employs it to imply that whatever might be displayed on the phone’s screen is more interesting than the person in front of him or her. It’s a term that didn’t exist – and couldn’t have existed – before smartphones became ubiquitous.
People appear to have an ingrained reverence for the immediate demands of technological devices. Before smartphones, extricating oneself from an unwanted interaction in public meant having to invent excuses, such as an urgent appointment. Burying one’s nose in a book has never worked as well in closing off conversation as getting a phone call or even just looking intently at a smartphone’s screen. People will stop everything for someone who is on the phone or, nowadays, merely looking at one.
Detail of The Meeting Place, a 2008 high relief sculpture by Paul Day, on the concourse of St. Pancras train station in London, England. Photo by Patrice78500.
The concept of using one’s smartphone to rudely dismiss another person now has a name, “phubbing”, and therefore no longer falls into the category of hypocognition. Numerous other fuzzy concepts still qualify as hypocognition, at least for some people. The two groups at either extreme in their reactions to the coronavirus may each be exhibiting hypocognition of a different kind: there are the people who refuse to take public health measures seriously, and so endanger everyone; and there are the people so intimidated by their fears that they have imposed unnecessary burdens on the rest of society to assuage them, as if unaware that everything in life carries an element of risk.
And then there is the matter of white privilege. African-Americans understand the concept of white privilege because they have to cope with its consequences throughout their lives. Most Caucasian-Americans do not grasp the concept because they swim in the currents of white privilege every day. It is the medium that envelops them, and they cannot see how it protects them from the dangers and insecurities their African-American neighbors face.
For example, say a white man is out jogging through a largely black neighborhood. This particular neighborhood is undergoing gentrification, by which everyone understands that houses owned or rented mostly by poor blacks are being bought up cheaply by better-off whites, who then move in. The white jogger is new to the neighborhood, part of an influx of people who can afford nice things and whose clothes generally reflect their status. But most folks would give this jogger a pass even if he wore old clothes with holes and tears for his exercise. No one in the neighborhood, black or white, suspects the white jogger is up to anything other than jogging.
Carl Reiner and Mel Brooks discuss the origins of some concepts in this clip from a portion of their ever-changing 2000 Year Old Man improvisational comedy routine. This 1967 appearance is from the television program The Colgate Comedy Hour. Comedian Dick Shawn introduced them. R.I.P., Carl Reiner (1922-2020).
Now take the same circumstances and flip them 180 degrees, with a black man jogging through a largely white neighborhood. The black man lives in the neighborhood, and thus no one considers him a gentrifying influence, whether the neighborhood is working class or upper middle class. The black jogger wears neither very good nor very bad clothes for his exercise. All other factors being neutral, he’s just a black man out for a run through a white neighborhood. Think about what might happen. The black jogger does, all the time. The white jogger in the other neighborhood never has to consider the possibility of something bad happening to him simply because of who he is. That’s white privilege.
The damage state government lockdowns in reaction to the coronavirus have done to the dairy industry could have long-term consequences, tipping the balance abruptly toward greater production of plant-derived milks, butters, and cheeses. Traditional dairy has been losing market share to plant-derived alternatives for decades, with losses accelerating in the past decade. Now the loss of revenue from coronavirus lockdowns of schools and restaurants could mean bankruptcy for many dairy farms and a long-term shift toward lower production as traditional dairy settles into a lesser role.
There will no doubt always be demand for traditional dairy products, but if supermarket shelf space is an indicator of what consumers want, then plant-derived milks have taken the largest chunk of shelf space away from traditional dairy, while plant-derived butters and, particularly, cheeses have been less competitive. The consumption of animal milk products has always been a peculiarly human practice. The desire for milk and associated products is so great that people will go to great lengths to produce and consume ersatz milk derived from nuts and grains. It is beyond the scope of this article to investigate why that is; it is enough merely to point out that consumption of milk fulfills for many people a deep-seated need, a need met for all other mammals in infancy, and then forgotten.
Different brands of oat milk available in a German organic supermarket in September 2015. Photo by Fretdf.
“Milk. n.s. [meelc, Saxon; melck, Dutch.] 1. The liquor with which animals feed their young from the breast. 2. Emulsion made by contusion of seeds.”
— from A Dictionary of the English Language by Samuel Johnson.
It follows then that animal milk production for human consumption is an artificial activity, one consequently involving some pain and suffering for the animals, both mothers and their artificially weaned young. We have done these things for so long, for millennia going back ten thousand years or more to the beginning of agriculture, that we think the activities are natural. They are not. The closest parallel in the rest of the animal kingdom can be seen in how ants tend aphids in order to secure for themselves the aphids’ honeydew secretions. Those secretions are not intended for consumption by the aphids’ young, however, but are merely a byproduct of their ingestion of plant juices. That relationship is closer – though not identical – to our relationship with honey bees than to our relationship with dairy animals. The relationship we have with dairy animals is mere exploitation, closer to that of vampire bats with their prey, or of bloodsucking insects with their victims, or even of a virus with its host.
The First Continental Congress of the American Colonies sent a petition to King George III on October 25, 1774, requesting he redress their grievances against the British Parliament related to the Coercive Acts passed in response to the Boston Tea Party of December 16, 1773. The king ignored the petition, and consequently the colonists’ march toward revolution picked up momentum over the next year, resulting in the beginning of hostilities in the spring of 1775. Petitions were the primary recourse of the American Colonists in dealing with their British rulers across the Atlantic Ocean since they had no official representation in Parliament, hence the slogan “No taxation without representation.”
The nation’s founders regarded the right to petition the government as so essential to a free society that they included it in the First Amendment, adopted in 1791. They made the right explicit despite the reality that citizens of the United States, unlike colonists under the British Empire, had official representation in the government. James Madison, who was largely responsible for drafting the Bill of Rights, understood that while the people had representation in government, their representatives might not be responsive to the wishes of all the people, and that therefore the people required another, independent outlet “for a redress of grievances.”
The unresponsiveness of government representatives to the people has rarely appeared as evident as it does now, when representatives seem responsive mostly to the wishes of corporate contributors to their election campaigns. Polls do not necessarily give lawmakers an accurate idea of how their constituents feel about issues, because answering a pollster’s sometimes tailored questions is a passive act. Poll sample sizes are also often ludicrously small on account of the expense and difficulty of polling. Pollsters claim they conduct their surveys based on well-researched principles in order to achieve accurate representation from small sample sizes, but there are plenty of examples demonstrating that polling is as much art as science, and far from infallible. For one example, look at how inaccurate the polling was in several key Rust Belt states in the weeks before the November 2016 presidential election.
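To get a feel for the sampling side of that uncertainty, the standard worst-case margin of error for a poll can be computed directly. The sketch below is a back-of-the-envelope illustration only: it assumes a simple random sample and a 95 percent confidence level, while real polls use weighting and stratification, and it says nothing about the non-sampling errors – tailored questions, unreachable voters – that did much of the damage in 2016.

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Worst-case margin of error at 95% confidence for a simple random sample."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (400, 1000, 2500):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")

# n =  400: +/- 4.9 points
# n = 1000: +/- 3.1 points
# n = 2500: +/- 2.0 points
```

Even a scrupulously conducted 1,000-person poll, in other words, cannot by itself distinguish candidates separated by a couple of points, which is exactly the range several of those Rust Belt races fell within.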
Emmeline Pankhurst, a leader of the suffragette movement in England, arrested outside Buckingham Palace in London while trying to present a petition to King George V in May 1914. Photo from the Imperial War Museum.
Signing a petition is an active measure taken by citizens numbering in the thousands or millions, as opposed to a select few hundreds or thousands responding passively to a pollster. Citizens mostly seek out petitions on their own initiative, or are made aware of them by friends or family, or by reading the news. The relative ease of signing a petition online, compared to signing one circulated door to door, does not change the fact that people are participating in the political process instead of waiting for someone to ask their opinion. The distinction is not a small one. Yes, physical participation in a protest weighs far more than signing an online petition in getting the attention of government leaders and society at large, but an online petition nonetheless demonstrates that the people signing it are paying attention. Numbers have always given weight to petitions, and in the internet age it is possible for millions of people to make their wishes known to their representatives within days of a petition’s first appearance.
The petitions currently circulating urging United States House of Representatives legislators to impeach the occupant of the Oval Office are an excellent demonstration of the need of the people for an outlet to make their wishes known to their government. To anyone paying attention honestly to developments originating from the White House since January 2017, it has long been obvious that impeachment and conviction of the current president would be necessary sooner or later to uphold the rule of law. The nation’s legislators, however, always conscious of political calculations and of the interests of their big money donors, have been dragging their feet to avoid having to put themselves on the line in upholding the oath they took to preserve and defend the Constitution.
Captain Queeg, the character played by Humphrey Bogart in the 1954 film The Caine Mutiny, was obviously unstable, but nonetheless discharging him from his command was quite difficult because the captain of a vessel at sea is by necessity an autocrat whose authority is fully backed by a nation’s institutions. For all that, Captain Queeg was not a corrupt grifter with contempt for democratic institutions and a sneering disregard for the norms of civil discourse, and in comparison to the offenses of the current president, Queeg’s official transgressions were minor.
In other words, members of Congress have a constitutional duty to impeach this president for high crimes and misdemeanors he has engaged in too obviously for them to ignore any longer. Whether he will be convicted in the Republican-controlled Senate is anyone’s guess at this point. It probably depends on whether political calculations indicate to at least a few key Republican senators that the time has come at last to throw the president over the side, at which point many of the rest will scramble to get on board.
If millions of American people had waited politely for a pollster to ask them whether impeachment was necessary, instead of taking matters into their own hands and petitioning their representatives, Congress might still be dithering, possibly all the way up to Election Day 2020. The current president may not be convicted in the Senate and removed from office before then, but it’s important that public hearings in Congress shine a light long enough and brightly enough on the corrupt and unethical practices of his administration that even the most disengaged voters will have to listen. A brick wall, no matter who constructed it, can keep people from hearing their government at work just as well as it can keep government leaders from hearing the people. Now that representatives have finally listened to people engaged enough to petition them, it’s important that the rest of the populace listen honestly to the arguments for impeachment, and honest engagement requires more than checking an often lopsided Facebook news feed, a far sloppier way of exercising one’s civic duty than signing an online petition. — Vita
Where were you when the Allies stormed the beaches of Normandy on June 6, 1944? Were you only a glimmering in your parents’ brains?
Where were you when the Battle of Khe Sanh began on January 21, 1968? Were you nursing the bone spurs in your heels that would eventually earn you a medical deferment from the draft? Or were you awaiting a pilot’s commission in the Texas Air National Guard?
A drawing made by a refugee child, formerly resident in Pristina, Kosovo, depicting his horrific experiences in the Kosovo War in 1999. The drawing was taped to a wall in the Brazda refugee center in Macedonia. Photo from the U.S. Department of State and NATO.
Where were you when the United States and its allies launched the invasion of Iraq on March 20, 2003, beginning an unnecessary war that would spiral the entire region into chaos? Were you looking under furniture for weapons of mass destruction, something you would joke about later?
Where were you when the world learned in April 2004 that American soldiers had been torturing prisoners at Abu Ghraib prison? Were you throwing a few “bad apples” under the bus, rather than acknowledging a culture of cruelty encouraged from the top down in the chain of command? Or were you busy making the first year of your daytime television talk show a success? Or were you occupied with creating an illusion of yourself as a successful and hard-nosed, but fair, businessman on the first year of your television reality show that was more fiction than reality?
Dire Straits performs “The Man’s Too Strong” in concert at Wembley Arena in London, England in June 1985 during their Brothers in Arms tour.
Where were you in 2008 after conservatives had used the wedge issue of gay marriage four years earlier to whip up the ire of homophobic reactionaries and send them to the polls in just enough numbers to make it possible for the Republican candidate to steal another presidential election? Were you getting married? What does your friend, the Republican presidential candidate, have to say about that now? Is he against gay marriage only when it suits political expediency?
Where were you in August 2016 when the Turks made their first incursion into the Kurdish zone of Syria, where the Kurds had been America’s ally in the fight against the Islamic State of Iraq and Syria (ISIS)? Were you listening to what the Russians had to say about your Democratic opponent in the presidential election, a practice you appear to have made into a habit since then as you extort other countries to get them to investigate your political rivals?
And where were all three of you when the brains were being passed out? It’s nice for people to have friends, but some friends are not worth having, such as a narcissistic sociopath or a war criminal, both of whom have proven time and again that they look out only for themselves, and maybe their cronies as well. And a crony, in the sense of cronyism, is not a true friend. A friend may be a “sweet man” in private, but that shouldn’t blot out all the harm he’s caused in the world. Millions of Iraqis and Kurds may reflect on the old saying that “with friends like these, who needs enemies?”
David Gilmour, best known as the lead guitarist for Pink Floyd, performs the Pink Floyd song “Coming Back to Life” with a new band backing him in a concert at Pompeii, Italy in July 2016.
To maintain the integrity of a supplied drawing, people usually color as much as they can within the lines. Some people use crayons, while others use markers or pens. When it comes to using the electromagnetic spectrum in the United States, the National Telecommunications and Information Administration (NTIA) is in charge of allocating bands within the spectrum and making sure everyone stays within their specified lines. The NTIA does its work within the Department of Commerce.
The Department of Commerce also oversees the National Oceanic and Atmospheric Administration (NOAA), which in turn oversees the National Weather Service (NWS). Independent of all these Department of Commerce agencies is the Federal Communications Commission (FCC), which regulates the parts of the spectrum allocated for its oversight by the NTIA, such as radio, television, and cellular phone frequencies. Beginning late last year, the FCC has been auctioning spectrum to mobile phone companies for use in their 5G networks. When the FCC auctioned off spectrum in the 24GHz (gigahertz) band, it raised alarms within NOAA, since that agency uses the 23.8GHz band in its weather satellites to measure water vapor in the atmosphere, a key component in its ability to forecast the weather.
This image of an outdated January 2016 Spectrum Wall Chart from the NTIA is useful only as an overview of just how tightly packed spectrum allocations are in parts of the spectrum, as the jumble of colors attests. For a better view, download a PDF (Portable Document Format) of the chart from the NTIA website, though even then it can be a strain on the eyes without higher magnification.
Now anyone who has ever manually tuned a radio receiver with a dial knows that stations do not stay exactly within their spectrum lines at all times. Depending on the power of their transmitters, atmospheric conditions, and the varying state of the ionosphere, some stations can occasionally push into the territory of other stations. That is what worries NOAA administrators about the 24GHz band proposed for 5G use by mobile phone companies and their man in the FCC, Chairman Ajit Pai. NOAA administrators believe 24GHz is too close for comfort and may occasionally interfere with the agency’s use of 23.8GHz, a frequency it cannot change because it is fixed by the physics of water vapor, which radiates a faint signal at exactly that frequency. They believe the interference could cause as much as a 30 percent drop in forecast accuracy, akin to stepping back in time to 1980.
This interagency squabble isn’t even necessary, it turns out, because if the FCC and American mobile phone companies followed the European model for ensuring minimal interference with weather satellites, they would simply put greater restrictions on the transmitting power of 5G antennas in the higher bands and rely more extensively on mid-range bands that are not only better for 5G transmission, but also safely removed from the vicinity of crucial weather data transmissions.
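Those power restrictions are usually quoted in decibel-watts (dBW), a logarithmic unit in which every 10 dB step is a factor of ten in power, so seemingly small differences between proposed limits hide enormous differences in allowed stray emissions. Here is a minimal sketch of the conversion; the two limit values in it are hypothetical placeholders chosen for illustration, not the actual figures from the FCC or the European proposals.

```python
def dbw_to_watts(dbw):
    """Convert a power level from decibel-watts to watts."""
    return 10 ** (dbw / 10)

# Hypothetical out-of-band emission limits, for illustration only --
# not the real FCC or European numbers.
limits_dbw = {"looser limit": -20, "stricter limit": -40}

for label, dbw in limits_dbw.items():
    print(f"{label}: {dbw} dBW = {dbw_to_watts(dbw) * 1000:g} mW")

# looser limit: -20 dBW = 10 mW
# stricter limit: -40 dBW = 0.1 mW
```

A 20 dB gap between two limits is a factor of 100 in permitted stray power, which is why the choice of number matters so much to forecasters.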
A May 2019 news report from Sky News in London, England.
There will be a World Radiocommunication Conference in Egypt in October and November, where attendees will set international standards for 5G. Considering the attitudes and policies of the current presidential administration, the American delegation will probably resist the European model and go its own incautious way in order to serve the interests of the major telecommunications companies. It’s possible the American model may turn out fine eventually, but considering the drawbacks of being wrong, wouldn’t it be prudent to heed the concerns of weather forecasters, at least until more field testing proves without a doubt the safety of using the 24GHz band of the spectrum? Is it worth having a hurricane drop in on us unexpectedly just to satisfy the greed of telecommunications executives and the desire of some smartphone users for faster-loading Facebook feeds? A real hurricane, that is, not one drawn with crayons, however neatly.
“It ill becomes us to invoke in our daily prayers the blessings of God, the Compassionate, if we in turn will not practice elementary compassion towards our fellow creatures.”
— Mahatma Gandhi (1869-1948)
Anyone who has ever been a vegetarian or vegan even for a short time has probably at some point encountered hostility from a meat eater, perhaps on several occasions from many different people. The experience can be baffling, particularly if the vegetarian or vegan does not make a big show of their practices. Self-righteous and preachy behavior can be annoying, certainly, but even when a vegetarian or vegan abstains from being a smug boor, some meat eaters will attack them as if they had been. A couple of recent news items help illustrate the innate hostility some people harbor for those who don’t adhere to mainstream dietary practices, even though it’s no one’s business but their own and the majority of them do not go out of their way to bother anyone.
Arby’s, an American fast food chain specializing in roast beef sandwiches, has come out with turkey meat processed to look like a bloated carrot, and in London two men have been found guilty of disorderly behavior after they ate raw squirrels in front of a vegan food stand. The actions of both Arby’s and the London squirrel eaters are obvious attempts to troll vegetarians and vegans, and their reasons for doing so say more about their own stunted mentality than anything else. Arby’s has for some time used an advertising slogan which proudly declares their enthusiasm for meat, and plenty of it. It is a fair guess that even if the political culture of Arby’s management is not necessarily right wing, they do assess their customer base as right wing, and trolling the perceived political correctness of their fast food competitors who have lately been offering vegetarian menu options is a good way to appeal to them.
Marzipan carrots for carrot cake. Marzipan consists primarily of almond paste and sugar or honey, and vegetarians would partake of it, though if honey were in it, vegans would not. Photo by SKopp.
Like everything else in our society, there is a political division in people’s dietary choices. Vegetarians and vegans are mostly liberals. Liberals who are meat eaters are more likely to react to alternative diets with indifference or polite curiosity. At any rate, most of them do not perceive vegetarians and vegans as threats. Not so political conservatives, particularly those with authoritarian leanings. The difference is so striking that it can almost be used as a reliable indicator of political beliefs: hostility to diets at variance with the mainstream is a good clue that a person might be right wing. Often these people will appoint themselves to keep an eye on vegetarians and vegans for backsliding, no matter how quietly their targets mind their own business and avoid posing any active threat to them. If threats are not real, they will be imagined! We have met the enemy, and it is Them, the Others!
Nothing delights these self-appointed guardians of imagined societal standards more than catching a vegetarian or, worse, a vegan (and therefore probably a liberal!) in an act of perceived hypocrisy, because then they can denounce the entire belief system and not be bothered anymore by any of its implications, such as cruelty to animals or environmental degradation. A problem ignored is a problem solved! Meat eaters who worry about the perceived sanctimonious behavior of non-meat eaters occasionally like to bring up the supposed fact of Adolf Hitler’s vegetarianism, as if the actions and beliefs of one ogre tarnish all vegetarians. That is like suggesting the beliefs and actions of all Christians are suspect simply because some white evangelical Christian leaders are terrible human beings.
In this Merrie Melodies cartoon from 1947, Bugs Bunny and Elmer Fudd are at odds with each other as always, and the cartoon finishes with action that for its time was considered normal.
It is interesting to note that in dealing with hostility from some meat eaters, non-meat eaters discover they can assuage the unease of interrogators who ask about the reasons for their choice by stressing the health benefits over the other issues. That approach is not entirely dishonest, since there are real benefits for human health in foregoing or at least restricting meat eating. The American diet of meat with nearly every meal is not the most healthful, nor is it the historical norm. Most Americans could stand to reduce their consumption of meat, and in doing so they would benefit their own health as well as the health of the environment and the quality of life for billions of animals. It is interesting and sad to note that of those three primary benefits of an alternative diet, only the first sits well with right wing authoritarians, and only on account of selfish reasoning.
“And the Lord said unto Cain, Where is Abel thy brother? And he said, I know not: Am I my brother’s keeper?” — Genesis 4:9, from the King James Version of the Bible.
Imagine a television game show in which the announcer calls four contestants from the studio audience to the foot of the stage, a stage that is a mock-up of a pharmacy, with a counter behind which stands the host, looking like a pharmacist in a white smock. The host directs everyone’s attention to one side of the stage, where an assistant – also in a white smock – presents a year’s worth of a popular prescription drug, let’s say insulin. The host then asks the four contestants to guess the price of the insulin without going over the amount, and the contestant with the closest guess gets to come up on stage for the opportunity to win prizes.
A lithograph promoting James Morison’s alternative medicines, showing a skeletal figure surveying three doctors around a cauldron in a parody of Macbeth and the three witches. From the Wellcome Collection gallery, London, England.
The remaining contestants try again in the next round, and some of them go home empty-handed. Almost all of the studio audience in attendance go home without even having been invited to participate. The few winners in each episode get to take home expensive prizes such as the year’s worth of insulin, valued at thousands of dollars, but the majority of those attending a taping of the game show go home with nothing or with a cheap consolation prize, such as a bottle of gummy vitamins. This game show analogy is not far off from how Americans seem to prefer having their health care system operate, particularly drug pricing.
If you’re lucky, if you win the lottery (another gamed system Americans seem to prefer over taxing the rich), then you’re good as gold. The majority, however, may run into problems and tough choices, such as paying the rent or buying insulin; paying utility bills or buying any number of the life-preserving medications people depend on, particularly as they get older. Prescription drugs are every bit as crucial to survival for some people as food and shelter, and yet Americans seem to prefer to let the drug industry operate like any other capitalist endeavor. Profits for drug manufacturers are more important than a decent life for an unfortunate number of citizens who can’t afford the high prices those manufacturers demand simply because they can, and the people principally to blame for this awful situation are some cretins in Congress.
“Someday Never Comes”, by Creedence Clearwater Revival from their 1972 album Mardi Gras.
Who are the people responsible for putting those cretins in Congress and in positions where they can run cover for the drug companies? Why, they are in large part the same people who struggle to buy overpriced prescription drugs. Why do they do this to themselves? Ah, that is the question bedeviling America today. The unfortunate part is that while some voters are caught up in Congressional posturing and not paying attention to substance, there are many voters who don’t share their ignorant love of machismo and capitalist lotteries and yet are forced to share the results of the bad policies ensuing from all that greed and childishness. They have to scramble for the scraps left over from the game, while a few wealthy grifters laugh at how they have duped enough voters to go along with their rigged game to keep it going, dangling prizes before the willing saps. Instead of gambling on a rigged capitalist lottery, sensible adults take measures, even – horrors! – socialist measures, to ensure decent results every day for everybody when it comes to matters of survival like food, shelter, education, and medical care, including drugs.
The measles outbreak in Clark County, in the southwestern corner of the state of Washington, across the Columbia River from Portland, Oregon, has brought national attention to the beliefs of people who do not get some or all vaccinations for one reason or another, because vaccination rates in Clark County are far below the national average. The term many have come to apply to these people is “anti-vaxxer”, though it unfairly lumps everyone together, including people who are less against vaccines than they are for personal liberty, or who object on religious grounds. Since vaccination is a public health issue, however, the reasons for not getting vaccinated do not matter as much as the effects.
The history of the differing reasons for vaccine opposition goes back to the introduction of the smallpox vaccine, primarily by Edward Jenner, in the late eighteenth and early nineteenth centuries in England. The idea that to combat a disease a person should voluntarily introduce a weakened form of it into his or her body ran counter to reason. Vaccination methods of the time were far cruder than today’s, and since the sterilization of wounds and bandages was little understood, infection often followed upon vaccination. The alternative was death or disfigurement from a full-force smallpox infection, and some religious folks actually expressed a preference for that because it was “God’s will.”
In Merawi, Ethiopia, a mother holds her nine month old child in preparation for a measles vaccination. One in ten children across Ethiopia do not live to see their fifth birthday, with many dying of preventable diseases like measles, pneumonia, malaria, and diarrhea. British aid has helped double immunization rates across Ethiopia in recent years by funding medicines, equipment, and training for doctors and nurses. Photo by Pete Lewis for the United Kingdom Department for International Development (DFID).
World Health Organization (WHO) 2012 estimated deaths due to measles per million persons, with bright yellow at 0, dark red at 74-850, and shades of ocher from light to dark ranging from 1-73. Gray areas indicate statistics not available. Map by Chris55.
Those who didn’t object to vaccination on grounds of cutting into a healthy body and introducing a light case of the disease or a bad case of infection, or of meddling in God’s will, objected to the perceived unnaturalness of the procedure, since the vaccine ultimately came from cows infected with cowpox. To those people, introduction into the human body of something from an animal was unwholesome, even dangerous. Never mind that people do the same thing all the time when they eat meat, presumably from animals and not from other people, without the ill effects these folks foresaw, such as taking on the traits of the animal whose parts were introduced directly into human flesh. On the other hand, perhaps they were taking the dictum “you are what you eat” to a logical extreme somehow unimpeded by the process of digestion.
It is probably best not to overload these viewpoints with the rigors of logic. People have their opinions, and they often do not bother to make the distinction between opinions and facts. The fact is that through vaccination programs, smallpox has been eradicated worldwide – the World Health Organization certified its eradication in 1980, nearly two centuries after the introduction of the vaccine. Similarly, measles was declared eliminated in the United States in 2000, after nearly 40 years of vaccinations. About the time measles was going away in this country, in 1998 a doctor in England, Andrew Wakefield, published a report in the English medical journal The Lancet linking the MMR (Measles, Mumps, and Rubella) vaccine to autism and bowel disorders, and though the findings in the report and Dr. Wakefield himself were soon repudiated by the majority of other medical professionals, some anti-vaxxers latched onto the link with autism and have been running with it ever since, regardless of the lack of evidence to support the link.
Centers for Disease Control and Prevention (CDC) statistics on U.S. measles cases (not deaths). Chart by 2over0.
The problem with anti-vaxxers of one stripe or another running with opinions mistaken for facts is that vaccinations are needed most by vulnerable populations such as the very young, the very old, and people with suppressed immune systems. Infants cannot be vaccinated against measles at all. Many of these vulnerable people are in the position of having decisions made for them by responsible adults. In the case of children, that would be their parents, who of course have the best interest of their children at heart. The difficult point to get across to those parents is that in a public health issue involving communicable diseases, their decision not to vaccinate their children affects not only their children, but those other most vulnerable members of the greater society as well. Public health is a commons, shared by all, like clean water and clean air, and the tragedy of the commons is that relatively few people making selfish decisions based on ill-informed opinions can have a ripple effect on everyone else. Personal liberty is a fine and noble ideal, but when it leads to poisoning of the commons then quarantine is the only option, either self-imposed or involuntary.
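There is simple arithmetic behind that ripple effect. Epidemiologists summarize how contagious a disease is with its basic reproduction number, R0, the average number of people one infected person will infect in a fully susceptible population; sustained spread stops only when roughly 1 - 1/R0 of the population is immune. The sketch below uses commonly cited ballpark R0 estimates – they vary by study and setting, so treat the outputs as illustrations, not precise targets.

```python
# Rough herd immunity thresholds, using the standard approximation
# threshold = 1 - 1/R0. The R0 values are ballpark estimates only.

def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune to halt sustained spread."""
    return 1 - 1 / r0

for disease, r0 in [("seasonal influenza", 1.5), ("smallpox", 5), ("measles", 15)]:
    print(f"{disease} (R0 ~ {r0}): ~{herd_immunity_threshold(r0):.0%} immunity needed")

# seasonal influenza (R0 ~ 1.5): ~33% immunity needed
# smallpox (R0 ~ 5): ~80% immunity needed
# measles (R0 ~ 15): ~93% immunity needed
```

Measles, among the most contagious diseases known, sits near the top of that scale, which is why a county whose vaccination rates drift even a few points below that low-90s threshold can suddenly find itself hosting an outbreak.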
Snow has always been more problematic for movie sets than rain, but when filmmakers and their special effects people do it well, it creates an atmosphere that suspends viewers’ disbelief to the point of not noticing smaller details, like how the snow fallen on performers doesn’t appear to melt quickly when they go indoors, where it is presumably warmer than outside. All sorts of obstacles dictate the use of fake snow for movies rather than the real stuff, from warm weather outdoors to shooting scenes indoors on sound stages. Real snow also compacts underfoot, making it impracticable for filmmakers to get more than one or two takes in one spot outdoors even when they go to the trouble of brushing over footsteps to make the snow appear fresh for retakes. As expensive as it is to make a movie, it makes sense to use fake snow.
In the early twentieth century, filmmakers created fake snow with bleached cornflakes, salt, flour, cotton wadding, asbestos, or combinations of those materials as well as others. All posed problems of realism or of health and safety. Cornflakes crunched underfoot and were difficult to use once sound came into movies; salt was corrosive; flour congealed on exposure to moisture; cotton was a fire hazard; and its replacement, asbestos, was a health hazard. Filmmakers experimented with many materials, but it wasn’t until Frank Capra’s 1946 film It’s a Wonderful Life that they came upon a winning formula that was both realistic and safe.
Fake snow attracts visitors to Chamberlain Square in Birmingham, England, in August 2011 as part of the Six Summer Saturdays festival. The fake snow was supplied by Snow Business, an English firm that has also used the material on many movie sets. Photo by Elliott Brown.
For that film, produced by the studio RKO (Radio-Keith-Orpheum), special effects supervisor Russell Shearman helped create a mix of foamite – a fire extinguisher material – with sugar, water, and soap flakes. Mr. Shearman’s snow effects were so convincing that the Academy of Motion Picture Arts and Sciences gave him and his special effects department a Technical Achievement Award for their work on It’s a Wonderful Life. Watching this Christmas classic over 70 years later, after huge advances in special effects’ duplication of reality, a viewer may notice how at the end of the movie the “snowflakes” on Jimmy Stewart’s shoulders take a long time to melt when he comes indoors to a warm reception from his family, friends, and neighbors. That small flaw should not detract from anyone’s enjoyment of a great cinematic moment or the filmmakers’ expert creation of George Bailey’s (Stewart’s) snowy odyssey one long Christmas Eve in the fictional New York town of Bedford Falls (or its nightmare alternative, Pottersville). Movie magic at its best suspends the viewer in another world for a time, and on the few occasions when the artifice shows through, it’s charitable not to be too picky and to brush it off.
Bing Crosby, Rosemary Clooney, Danny Kaye, and Vera-Ellen in the 1954 film White Christmas, directed by Michael Curtiz, and with songs by Irving Berlin, including “Snow”. The performers take the train from Florida and eventually arrive in Vermont, where snow doesn’t fall until Christmas Eve.
The expansive, pastoral cemeteries we are familiar with today came into being in the nineteenth century, when municipal officials dedicated large amounts of land in the suburbs or just outside cities to landscaped burial grounds. Church graveyards within city limits had become overcrowded, prompting worries about public health and the integrity of nearby building foundations. The rural cemetery movement took its cues from English garden design of the eighteenth century, with vast expanses of mowed grass broken up by trees and shrubbery, and taking advantage of vistas where possible. Since there were no public parks when these new cemeteries were designed and built, they soon functioned as parks at a time when enjoyment of the outdoors was more restrained and dignified than it is today. No one in Victorian times was jogging past the gravestones in shorts and very little else.
Those Victorian era rural cemeteries became part of the suburbs and then eventually were swallowed up by their cities as urban growth expanded to encompass them. They can still be islands for quiet contemplation within a city if they are large enough for visitors to get away from the noise and bustle of surrounding streets. Their park function has been usurped by purpose-built parks that allow a greater range of activities, such as jogging or playing softball. No worries there about disrespect for the dead. Since the fine old garden cemeteries of the nineteenth century have been incorporated within cities, their boundaries are fixed, and now they are either full or nearly full of permanent guests.
A monument at Lowell Cemetery in Lowell, Massachusetts. Photo by Bernie Ongewe.
The serenity evoked by well-tended grounds and beautiful vistas is of course for the living, not the dead, who presumably are beyond caring. The same can be said for the fine caskets and embalming services offered by funeral parlors at a fine price to the lately bereaved. Nothing but the best for the dearly departed, and by extension for the social standing of those paying for it all. Cremation offers one way out of some of the unnecessary expenses and fuss of burial in a recognized cemetery. Even where people have space available at home, a family plot is often outlawed by zoning regulations, and anyway the new custom of moving from one house to another several times throughout life makes it impractical. House buyers are understandably queasy about moving into a place with a stranger’s recently interred relatives just outside the back door.
The Loved One, a 1965 film based on a satirical novel by Evelyn Waugh, was directed by Tony Richardson. In this scene, Robert Morse, affecting an English accent as Dennis Barlow, must see to the burial of his uncle while on a trip to Los Angeles, California. Liberace played Mr. Starker and Anjanette Comer played Miss Thanatogenos, both of the fictional Whispering Glades cemetery and mortuary.
There is another option, one that chucks all the trappings of the funeral industry and the land grabbing of permanent cemeteries, and that is natural burial. The dead are not embalmed, nor are they buried in monstrously expensive containers that prevent or delay decomposition. Instead, the dead are buried in a cloth shroud or a simple wooden coffin, which will decompose readily without contaminating the soil. Grave markers are not permanent reminders such as the headstones found at a conventional cemetery, but low-key natural markers meant to degrade within a generation, or plantings such as a tree, which will eventually outgrow its function as a mere grave marker. The land is conserved as wild space rather than subjected to continuous environmental destruction by modern landscaping practices.
Natural burial is a return to the practices of our ancestors. In some parts of the world, people have never deviated from natural burial practices. Returning to dust is inevitable, and it might as well happen in a way that preserves the economic and environmental resources of the living. Memories of the departed can be kept alive in ways other than the permanent reminders of headstones and the expensive and often environmentally destructive tending of a cemetery landscape designed to appear natural, though upon reflection it is hardly that at all, any more than the neatly clipped lawns in the suburban and city lots surrounding it.
Sting wrote “All This Time” in 1990 about the recent death of his father and about his memories of growing up near the shipyards of Wallsend in Northumberland, England.