Many a Tear Has to Fall

 

“Many a tear has to fall,
But it’s all in the game;
All in the wonderful game
That we know as love.”
— The opening lines of “It’s All in the Game”, with music written by Charles Gates Dawes in 1911 and lyrics by Carl Sigman in 1951.

Charles Gates Dawes was vice president of the United States under Calvin Coolidge from 1925 to 1929, and before that he had a multi-faceted career as a lawyer, banker, soldier, and diplomat. He was also an avid amateur musician who wrote a song in 1911 that he called “Melody in A Major”, a song that Carl Sigman, a qualified lawyer himself, would write lyrics for in 1951 and rename “It’s All in the Game”. The singer Tommy Edwards was one of many performers who recorded “It’s All in the Game” in 1951 and in the years since, but it was his 1958 rendition that reached number one on the record charts and has become the most familiar to listeners. Two other interesting items to note about Mr. Dawes before moving along: he was a descendant of William Dawes, the man who made the midnight ride with Paul Revere in 1775, and he shared the Nobel Peace Prize for 1925 for his work restructuring Germany’s World War I reparations payments, which had been crippling that country’s economy.


Roses, an 1890 painting by Vincent van Gogh (1853-1890).

 

“It’s All in the Game” outlines the ups and downs of courtship, and as such would seem to have no bearing on Father’s Day. When we were growing up, we generally caught mere glimpses of the affection shared between our parents. Some people may have seen frequent displays of fondness, others none at all. Because we saw our fathers as authority figures, probably the last thing that would have popped into our heads was the understanding that these were men who were seen quite differently, at least at one time, by their partners in marriage. For most of us, the idea would have been difficult to reconcile with the fellow we knew. Later in life, having grown up and gotten a more rounded view of things, we might learn to perceive the side of him our mother knew, and thus understand better why she married him, even though he may have been an ogre or a gent, or most likely a little bit of both and a lot in between. Then, if our parents lived long enough while we attained greater maturity, we might get the opportunity to understand them better as people rather than merely as the totems of varying degrees of nurturing and authority we looked up to as children, and realize that the first lines of “It’s All in the Game” embrace us all.
— Vita


Tommy Edwards sings his 1958 rendition of “It’s All in the Game.” The photo is from the set of the 1973 George Lucas film American Graffiti, a story about coming of age in the early 1960s.

 

Reason to Smile

 

“Equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.” — Section 1 of the Equal Rights Amendment.

It’s a fair guess that at some point in their lives most women have had someone, usually a man but sometimes another woman, urge them to smile more, as if it were incumbent upon women to always appear pleasant and non-threatening. No one tells men to smile, except maybe for pictures. This past week, on Wednesday, May 30, Illinois became the 37th state to ratify the Equal Rights Amendment (ERA), leaving the amendment one state short of the approval by three fourths of the states that is required for it to become part of the Constitution. That’s reason to smile. Celebration, however, may still be a long struggle away.
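For anyone who wants to check the arithmetic behind that “one state short” tally, here is a minimal sketch in Python, purely for illustration; it assumes only the figures cited above (50 states, 37 ratifications), and the variable names are made up for the example:

    import math

    TOTAL_STATES = 50   # states in the Union
    RATIFIED = 37       # Illinois became the 37th state to ratify on May 30

    # Article V requires ratification by three fourths of the states,
    # and a fraction of a state rounds up to a whole one.
    needed = math.ceil(TOTAL_STATES * 3 / 4)

    print(needed)             # 38
    print(needed - RATIFIED)  # 1, hence one state short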

 

When the United States Congress approved the ERA in 1972, it sent the amendment on to the states with a seven-year limit for ratification written into the proposal, something that had become common practice ever since the proposal for the 18th Amendment (Prohibition), with the one exception of the 19th Amendment (Women’s Suffrage). After ratification stalled at 35 states in 1977, Congress eventually extended the time limit until 1982. The amendment then remained in limbo until 2017, when Nevada, under pressure from a renewed groundswell in the women’s rights movement prompted by current events both in politics and in the workplace, ratified it and brought the total to 36.

Alice Paul, on the right, leader of the feminist movement in America and vice president of the Woman’s Party, meets with Mildred Bryan, youngest Colorado feminist, in the Garden of the Gods at Colorado Springs, where on September 23rd, 1925, the Party launched its western campaign for an amendment to the Constitution giving equal rights to women. Photo by H.L. Standley.

There is some question whether the amendment will indeed become law with ratification by a 38th state, because of the time limit imposed in its proposal by Congress, and because a handful of state legislatures have rescinded their ratifications since the 1970s. There is nothing explicit in Article V of the Constitution, which deals with the amendment process, stating that Congress may impose a time limit on ratification. In the 1921 case of Dillon v. Gloss, the Supreme Court inferred from Article V that Congress had the power to impose a time limit, settling that argument, though on shaky ground. In 1939, in the case of Coleman v. Miller, the Supreme Court sent the ball back into Congress’s political arena on whether ratification by states after the expiration of a time limit had any validity, and whether states were allowed to rescind ratifications. Those questions have remained unchallenged, and therefore unsettled, ever since.

In an episode of the 1970s television show All in the Family, Archie Bunker argues with his neighbor Irene Lorenzo, played by Carroll O’Connor and Betty Garrett, about equal pay for equal work after Irene starts working at the same place as Archie. Forty-six years after Congress passed the ERA in 1972, the issue remains unsettled.

There has been a development since 1939 that further clouds the entire issue of a time limit on ratification, and that is the full ratification of the 27th Amendment (Congressional Pay Raises) in 1992, after a delay of 203 years since its passing by Congress in 1789. No time limit had been imposed by Congress in 1789, of course, but since it nonetheless became the law of the land after hundreds of years of languishing in the docket, it raises the question of the legality of the decision in Dillon v. Gloss and sets a precedent for proponents of the ERA to follow in seeking to overturn the expiration of its time limit in 1982. If and when a 38th state ratifies the ERA, that state most likely being Virginia, the matter will probably bounce from the courts back to Congress, where it will have to be settled politically, making the upcoming 2018 congressional midterm elections important for yet one more reason. Until then, smile when you feel like smiling, or not at all.
— Vita

 

The Good, the Bad, and the Unpunctuated

 

The Oxford comma, also known as the serial comma, seems to be less in evidence every year. It’s difficult to understand why many people don’t like to use it, and it may be that they simply don’t understand what punctuation is all about. Punctuation is like musical notation, or at least like the parts of notation that show the players where the rests fall and set the rhythm of a piece of music. The players are the readers. If there were no commas or periods in writing, readers would not know where to take a break. Imagine listening to a piece of music played that way. For that matter, imagine listening to someone who runs on and on without a pause!

 

If it’s confusing trying to sort out punctuation marks on the written page, try differentiating all the butterflies named for commas and question marks. This one is an Eastern Comma butterfly, Polygonia comma, from Chippewa County, Wisconsin. Photo by Aaron Carlson.

Take the title of the 1966 Italian movie The Good, the Bad and the Ugly, which in the original is rendered as Il buono, il brutto, il cattivo (the order of the nouns in the original Italian is good, ugly, bad). Never mind the difference in capitalization conventions for titles between English and Italian, or the change in word order from Italian to English; the key point is the inclusion of the serial comma in the original Italian and its absence in the English translation. It’s a simple thing, that comma. Why leave it out? Perhaps the translator was thrown off by the missing conjunction “and” in the Italian, which would have been rendered “e”, as in Il buono, il brutto, e il cattivo. In English, we are used to “and” coming before the last item in a series. It would not sound quite right to our ears if the title were translated as The Good, the Bad, the Ugly. That sounds choppy and abrupt. Throw in “and” before “the Ugly” and we have a rhythm that sounds right to the ears of English speakers. Except for one little thing.


The Danish National Symphony Orchestra, conducted by Sarah Hicks, performs a suite of “The Good, the Bad and the Ugly (main title)” and “The Ecstasy of Gold”, a piece from near the end of the film.

 

What happened to the last comma? Without it, not only the rhythm, but also the sense of the film title is off. Are we to rush through when we speak the last part of it? Instead of saying “The Good [pause] the Bad [pause] and the Ugly [full stop]”, are we meant to say “The Good [pause] the Bad and the Ugly [full stop]”? No one talks in the rhythm given in the second example. Does the phrase “the Bad and the Ugly” refer to one person only, in the same way that “the Good” refers to one person? Is that person both bad and ugly? Absolutely not, as is clear from the original Italian title and from the movie itself. There are three separate characters referenced in the movie’s title, and each is named by his outstanding characteristic.

In another rendition of the same suite, the composer himself, the great Ennio Morricone, conducts the Munich Radio Orchestra. The soprano soloist is Susanna Rigacci. The musical notes are the same in both renditions, but it’s interesting to hear the differences in their presentation.

It must be the “and” that throws people off when they write out a series. They must think “and” stands in for the serial comma, making it unnecessary. But it doesn’t. Listen to the music: TheGoodtheBadandtheUgly slowed down a bit is The Good the Bad and the Ugly, and slowed down a bit more in the right places, rendered in the way we actually speak, becomes The Good, the Bad, and the Ugly. That wasn’t very hard, was it? We speak in words, and the words are like music, with rhythm and tempo. When we write down the words we speak, we need a way to convey to readers, to listeners, that rhythm and tempo, and that’s where punctuation comes in. That’s all it is. There’s nothing greatly mysterious about it, though semicolons befuddle many, and the novelist and essayist Kurt Vonnegut disdained their use, remarking of them, “All they do is show you’ve been to college.” Homer, who of course spoke his poetry for listeners and never wrote it down himself, would probably have agreed.
— Vita


In this scene from Sergio Leone’s film The Good, the Bad and the Ugly, Eli Wallach’s character, Tuco, encounters an adversary and ends up succinctly admonishing him that it takes too long to speak, shoot, and leave.

 

Obsessed with Bugaboos

 

“America does not go abroad in search of monsters to destroy. She is the well-wisher to freedom and independence of all. She is the champion and vindicator only of her own.” — John Quincy Adams (1767-1848)

There has always been a strong strain of paranoia in American political life, and it erupts occasionally in official policy, from the Alien and Sedition Acts signed into law in 1798 by John Quincy Adams’s father, President John Adams, to the Patriot Act of 2001, signed by George H.W. Bush’s son, President George W. Bush. In the early days of the republic, when the Adams family was prominent in national politics, there was of course no social media or Fox News to whip up hysteria about The Other, though there were plenty of locally circulated broadsheets that made little effort at objectivity.

 

Now the media landscape is far different from what it was in 1798, and people who feel threatened by cultural changes which erode the power and influence of white conservatives have platforms like Fox News, Twitter, and Facebook that reach far and wide. As the self-styled Silent Majority slips to Ranting Minority status, their paranoid hysteria ratchets up in intensity to the point that Fox News is not a strong enough salve for their imagined wounds, and they turn to fear-mongering websites like InfoWars. The information available in the bubble in which angry old white conservatives live can’t exactly be called news; it is more a drug that reinforces feelings, thoughts, and ideas they already dwell on resentfully, nursing their grievances like mean drunks wallowing in self-pity.

Here Comes the Bogey-Man, an aquatint print from the 1799 set of 80 known as Los Caprichos, by Francisco Goya (1746-1828).


Pooh Bear has a bad dream.

They tend to lash out angrily, these constant consumers of spoon-fed rage, and because they are more conscientious about voting than other groups in American society, their views make it into government policy more than the views of less paranoid people, at least when those views coincide with the interests of corporate and political leaders. Support for those policies among the general populace then becomes tied to patriotism in the minds of these people, because they have bestowed on themselves the mantle of True Americanism. True Americans want to build a wall along the Mexican border. True Americans don’t want gays marrying each other. True Americans believe climate change is a liberal hoax, and therefore no steps need be taken to restrain the fossil fuel industry. That particular list goes on and on. There is another list about what True Americans believe and want, and it starts with changing the definition of who they are to include everyone who lives here, not just one group raging and warring against all The Others.
— Vita

 

The Pause That Refreshes

 

Editor’s note: There was no post on this website last Friday, April 27, because it is healthy to take a break and go fishing once in a while.

 

“The pause that refreshes” was a slogan coined in 1929 by Coca-Cola marketers, and nearly a century later it remains one of the most memorable advertising slogans for Coke, or for any other product. It was also in the 1920s that Henry Ford instituted a new policy at his automobile manufacturing plant, shortening workers’ shifts to eight hours and their work week to 40 hours, a model that soon became the standard throughout American industry. In 1938, the federal government established, with the Fair Labor Standards Act, a minimum wage and rules entitling most workers to time-and-a-half pay for hours worked over 40 in a week.


An Afternoon’s Rest, an 1885 painting by Niels Frederik Schiøttz-Jensen (1855-1941).

It’s still up to the states to regulate breaks and lunch time off for workers, and many do so in a minimal way, if at all. It may come as a surprise to some workers that their breaks often come solely at the discretion of their employer or, if they are with a union, because breaks are written into the contract between the union and management. Even bathroom breaks can be a source of contention between labor and management. It is a wonder then to consider how much conditions for workers have generally improved since the early years of the industrial revolution in the eighteenth and nineteenth centuries, when 12- and 16-hour days were not uncommon and workers’ welfare and safety were entirely their own lookout.

Things changed when workers started to organize and bargain collectively in the late nineteenth century. It is a misconception to think the worker holiday of May Day started in communist countries; it actually began in the United States, and has come to commemorate the Haymarket affair in Chicago, Illinois, in May of 1886, when workers striking and demonstrating for an eight-hour workday ended up in deadly confrontations with the police over the course of two days. Unionization continued wringing concessions from management through the first half of the twentieth century, and between 1945 and 1975 the percentage of the non-farm workforce belonging to a union peaked at over 30 percent. In the years since, union membership has declined to less than half that, and the remaining unions, many of them organizations formed for the benefit of state employees such as teachers, are under attack from Republican-controlled state governments.

A discussion of ways of coping in life from the 1964 film adaptation of The Night of the Iguana, based on the play by Tennessee Williams, directed by John Huston, and starring Ava Gardner, Deborah Kerr, and Richard Burton as the defrocked Reverend Dr. T. Lawrence Shannon.

None of that changes the need of people concentrating on their work to take a break from it every once in a while throughout the day, and for weeks or more at a time throughout the year. Robots have no need of breaks, but for the time being there are still jobs robots cannot do, and those jobs will require the talents of fallible, sometimes frail humans. Enlightened management can choose to view breaks for workers as beneficial to both parties, since a more rested worker can be more productive in the long run than one who is run ragged. Less enlightened management may regard the burnout of workers as the cost of doing business, believing they are easily replaceable cogs in management’s profit-making machine. That mindset prevailed over a hundred years ago, until Henry Ford, who was by no means enlightened in all areas, nonetheless saw that his workers and people like them were the buyers of his automobiles, and raised their wages and improved their conditions in the interest of maintaining a kind of partnership with them, rather than treating them wholly as chattel, as cogs in the gears of production.
— Vita

 

We Were Here

 

The urge to leave a personal mark on the relatively permanent structures around us is strong enough to prompt some people to break laws against vandalism and trespassing to paint, mark, or scratch into public view an announcement of their existence. Is it graffiti, street art, or defacement? We can see these markings on buildings that are a few thousand years old, but beyond that, as a recently published scientific paper asserts, everything gets ground up, mixed together, compressed, and dispersed, making it hard to determine anything conclusively about human civilization and its discontents as expressed in graffiti. Caves have preserved paintings on their walls for tens of thousands of years, but that artwork tells a very different story of humanity before what we consider civilization.

 

Sometimes graffiti addresses social and political issues, though more often the concerns of the artists are more mundane. It’s an overstatement to call a scatological scribbler a street artist, or even a graffiti marker. It doesn’t take much imagination or skill to scrawl the image of a phallus across a stone wall, whether it was done two thousand years ago or yesterday. The same goes for personal insults, the boorish nature of which has not changed at all over the centuries. The best graffiti is illustrative of a unique frame of mind, an altogether personal view of the world. The same definition can apply to art.

In Kansas City, Missouri, a 2008 rendition of the graffiti made famous everywhere during World War II by American servicemen. Photo by Marshall Astor.


John Cleese as the Centurion and Graham Chapman as Brian in the 1979 satirical film Monty Python’s Life of Brian.

Tagging, the marking or painting of initials, nicknames, or symbols, often to stake out territory, is not a particularly enjoyable or meaningful form of graffiti to anyone but the marker and others who need to interpret the signs. Tags rarely exhibit any wit, and are usually straightforward signs meant for specific groups instead of the larger society, hence their often cryptic appearance to those not in the know. The signs say, among other things, “Keep out”, “This is our territory”, or “I am here”. The humorist Jean Shepherd, in a video essay about roadside features in New Jersey, speculated about the confusion of future archaeologists as they attempt to decipher the graffiti of our times, attaching to it perhaps more importance than it warrants. The entire television special is a treat, featuring Mr. Shepherd musing with philosophical delight about what constitutes art as he observes all the commercial kitsch he finds along a New Jersey highway. All our artifacts and graffiti will be gone in a millennium, of course, crumbled into disconnected bits, but for now they say “I am here”, and “We were here”.
— Vita

 

Not Buying It

 

The departure of advertisers from Laura Ingraham’s show on Fox News, after a boycott of their products and services was proposed by David Hogg, the Parkland, Florida, shooting survivor Ms. Ingraham gratuitously mocked on Twitter, is not censorship, as Fox News executives claim, but the simple economic result of a self-inflicted wound. No one disputes Ms. Ingraham’s First Amendment right to make hateful, idiotic remarks. Furthermore, no one claims that Ms. Ingraham cannot disagree with Mr. Hogg on gun control. As a public figure, however, with a forum that allows her to generate revenue through television viewership ratings that are, as often as not in her case, driven by the outlandishness of her hateful, idiotic remarks and her ad hominem attacks on those she disagrees with, she cannot expect there will be no repercussions. Boycotting her advertisers is simply hitting her where she and Fox News are most vulnerable.

 

Rosa Parks in 1955, with Martin Luther King Jr. in the background. Ms. Parks was instrumental in starting the Montgomery, Alabama, bus boycott when she refused to give up her seat to a white passenger. Photo by the United States Information Agency (USIA).

There’s a world of difference between the cost paid by Ms. Ingraham for her free speech and that paid by someone such as Juli Briskman, the woman who lost her job after flipping off Supreme Leader’s motorcade last October. Ms. Briskman was not a public figure at the time, and she undertook her action on her own account, with no connection made by her between that action and her employer. Still, her employer, a federal contractor, fired her after it became widely known she worked for them. Ms. Briskman had no thought of ginning up popularity and revenue for herself or her employer; far from it. People like Laura Ingraham are well aware their speech will generate controversy, because controversy translates into money. Ms. Ingraham and other public figures like her are the television wrestlers of punditry, throwing metal chairs and bellowing insults while they stomp around the arena doing their best to incite the crowd.

The boycott is a time-honored method for expressing disapproval and trying to effect change in public policy or the behavior of public figures. People on both ends of the political spectrum engage in boycotts, as the Reverend Franklin Graham demonstrated recently when he called for a boycott of Target department stores on account of what he sees as their overly liberal transgender restroom policy. Everyone votes with their dollars, for the simple reason that in our capitalist society it is the easiest and most effective way of getting the attention of the powerful. Whether a boycott is undertaken for frivolous or nasty reasons is in the eye of the beholder, but it has to be respected because it too is a form of free speech. The object of a boycott may weather it with enough counter-support from people who perceive the boycott as unfair. At any rate, the economic effect is often secondary to the real aim of the boycotters, which is to bring a matter to widespread public attention, causing the boycotted company or public figure to explain or justify their actions, policies, or remarks.

Mahatma Gandhi coined the term “satyagraha” to explain his view of the right way to conduct non-violent efforts for change. Satyagraha combines satya, truth, with graha, to grasp or hold onto: holding onto the truth. When people hold what they believe to be the truth, they actively try to get someone or some group who is obstructing their aims to see that truth as well, so that in the end the obstructors will step out of the way without the threat of violence. Of course, we all believe we hold the truth, with the possible exception of media pundits who cynically exploit political arguments for personal gain, in which case it’s hard to say whether they believe their own nonsense or not. It doesn’t really matter.

An assembly of moments from the 1982 Richard Attenborough film Gandhi, with Ben Kingsley, showing some of the Mahatma’s methods and philosophy.

For everyone else, holding their own truths (not their own facts), it is important to treat those who disagree by the light of different truths with respect and consideration during the contest for change. The boycott throughout history has been an instrument of change used by the weak against the strong, and today is no different. It’s unseemly, then, for the strong to veil themselves in the First Amendment in a cynical attempt to elevate the debate into the same arena where Gandhi, Martin Luther King Jr., Rosa Parks, and Cesar Chavez fought for their rights, when the strong brought this public criticism upon themselves by abusing their public forum to spew vitriol in pursuit of dollars.
— Vita

 

Have the Chops

 

Viewers of American television shows from the 1950s, 60s, and 70s might have noticed that the families on shows of that era seemed to have lamb chops for dinner rather often, or certainly more frequently than most Americans eat lamb or mutton now. This doesn’t approach anything like scientific proof of declining consumption of lamb and mutton since the mid-twentieth century, and at that it would only prove a decline among the demographic of the white Anglo-Saxon Protestants who were the main representatives of Americans on television then, but there it is nonetheless. On old shows like Father Knows Best and Leave It to Beaver, the characters were eating lamb chops regularly, but after the 1970s hardly anyone on television ate lamb chops anymore.

A British shepherd with a lamb and his Border Collie in the 1890s. Photo from the National Media Museum of the United Kingdom.

 

Ham has always been more popular in Middle America than lamb, and Easter dinner is no different. It was in immigrant communities in the cities of the east and west coasts that lamb was popular, at Easter or anytime. Nevertheless, through the middle years of the twentieth century lamb and mutton were widely available throughout the country and competitively priced with other meats at supermarkets and butcher shops. Much has been made of the learned distaste for canned mutton among service members returning from overseas duty in World War II as a cause of the eventual decline in popularity of sheep meat in America, but statistics and the anecdotal evidence of popular culture as represented on television programs discount the impact of that one factor.

The increased use of synthetic fabrics over wool contributed to the drop in sheep herding, but that also is overemphasized, considering that synthetic fabrics gained ground in other countries as well, places like Australia and New Zealand where sheep herding remains a large part of the agricultural economy. What separates American sheep-raising culture most from the rest of animal husbandry is the difficulty of conforming it to the needs of large-scale agribusiness. In the generations after World War II, when family farms were swallowed up in large numbers by agribusiness concerns which consolidated the raising of chickens, beef cattle, and pigs into factory farms, the raising of sheep, and particularly lambs, resisted conforming to factory farm standards. As a result, American lamb and mutton became more expensive than comparable weights of chicken, beef, or pork.

American sheep herding declined to a cottage industry, which had the ironic effect of insulating it further from the factory farming practices which had taken over other areas of animal husbandry by the end of the twentieth century. The mutton and lamb available in Middle American supermarkets in the same period were as likely as not imported from Australia or New Zealand. The imported meat was cheaper than American-raised mutton and lamb despite the long shipping distances because of the economies of scale in those countries, where sheep were still raised in the tens of millions. Americans generally did not favor the imported meat over beef, chicken, and pork, however, because of the “gaminess” they noted in it, a product of the types of sheep raised in Australia and New Zealand and the pasture they were raised on. Americans had gotten so used to the blandness of meat produced by grain diets for factory-farmed animals that they started rejecting anything stronger.

From The George Burns and Gracie Allen Show of the 1950s, the two performers reenact one of their vaudeville routines for announcer Harry Von Zell.

As Americans begin to reject factory farming because of both its inhumane nature and the unhealthy food it produces, prospects for sheep herders in this country are improving. Considering the practices most, but certainly not all, of them have adhered to over the last half century through some bad times, it’s not that they ever went anywhere, but that the rest of us did and are now drifting back to them in dribs and drabs. If it weren’t for the support of the immigrant population and their preference for American lamb and mutton, the sheep herders here would not likely have survived the lean times in sufficient numbers to crank up operations again with the promise of supplying more Easter dinners. Of the lambs, the best that can be said is that, unlike many of their unfortunate cousins on the factory farms, their lives, however brief, may be more natural and even peaceful.
— Vita

 

The Lost Sheep

 

The evangelical Christianity we are familiar with today in the United States does not resemble what it was prior to the Civil War, when evangelical Protestants promoted social justice causes such as the abolition of slavery. Slavery was the primary issue that divided some Protestant denominations, the Baptists more than any others because of the strong presence of Baptists in the South. Rancor over the issue within the Baptist denomination eventually led to its division before the Civil War into Northern and Southern sects, a division which has continued to this day.

Le bon pasteur (The Good Shepherd), a painting from between 1886 and 1894 by the French artist James Tissot (1836-1902).

When people think of evangelical Christians active in modern political life, largely in conservative Republican circles, they are primarily thinking of Southern Baptists, because that is the denomination which has dominated politics and culture in the South since the Civil War. It is from the South in the 1970s that arose the major political and cultural movement known first as the Moral Majority, and since then mostly known as the Christian Right. For over a hundred years, the dominance of Southern Baptists over life in the South was as close to a state-sanctioned religion as we have gotten in this country, or at least in one part of it. Other Protestant denominations in the South, such as the Pentecostals, have been a part of modern evangelical Christianity, but the Southern Baptists have always been the major players.

As the de facto state religion of the South in the Jim Crow era and beyond, the Southern Baptist denomination was more interested in preserving white privilege and power than in promoting the kind of social justice Jesus advocated in His teachings. The Southern Baptists chose to ignore many of those ideas from the New Testament, lest they give black folks unsavory and rebellious ideas, and instead focused on the rewards waiting for the saved in the afterlife, where it wouldn’t cost the earthly white leaders anything in money or power. As the South remained rather isolated and more conservative than the rest of the country throughout the first two-thirds of the twentieth century, there were further fractures within Protestant denominations. The more liberal Northern sects were increasingly considered the mainline portions of each denomination, and the Southern sects were more and more lumped together as evangelical Christians, but with the twist that these evangelicals were largely white conservatives more vested in the status quo than in change for social justice.

President Jimmy Carter addresses the Southern Baptist Convention in Atlanta, GA, in June 1978. Evangelical Christians were lukewarm at best regarding Mr. Carter, and in the 1980 election they turned him out in favor of the more conservative Ronald Reagan. Since then, Mr. Carter has devoted himself to humanitarian causes around the world, including Habitat for Humanity, all of which earned him the honor of a Nobel Peace Prize in 2002.

President George W. Bush meets with the leadership of the Southern Baptist Convention in the Oval Office in October 2006. Pictured with the President are Dr. Morris Chapman, left, Dr. Frank Page, and his wife Dayle Page. Mr. Bush the Younger was more to the liking of evangelical Christians than any president of the past 40 years other than Ronald Reagan. White House photo by Paul Morse.

When the Internal Revenue Service (IRS) in the 1970s went after Bob Jones University, a private evangelical school in Greenville, South Carolina, to revoke its tax-exempt status on account of its not adhering to Civil Rights era desegregation laws, Southern Baptists, who by that time had become indistinguishable from evangelicals, were catalyzed into action, forming the Moral Majority in order to take an activist role in national politics. They added abortion later as a rallying cause, and it also served to mask the initial, primary impetus for organizing politically, which was the affront of federal interference in their pocketbooks and their white supremacist fiefdom. From the 1970s until today, evangelical Christians, the Christian Right, have been a force in national politics, and never has their participation been more perverse at first glance than in their unwavering support for the current president, with all his defiantly un-Christian character flaws. With an understanding of their history, though, it begins to make sense, even if that doesn’t make it right.
— Vita

 

Talking Trash

 

“If you can convince the lowest white man he’s better than the best colored man, he won’t notice you’re picking his pocket. Hell, give him somebody to look down on, and he’ll empty his pockets for you.”
— President Lyndon Johnson to staff member Bill Moyers, on observing racial epithets on signs during a visit to Tennessee.

The terms “white trash” and “rednecks” are probably the only remaining instances where derogatory epithets are more or less acceptable in general society. Privately, of course, people of all stripes can and do use epithets of all kinds to describe others they don’t like, and it often matters little how different the beliefs they express in public may be. The reason the labels “white trash” and “rednecks” may still be acceptable has to do with how, now more than ever before, they designate a voluntary lifestyle choice rather than an inborn condition. A hundred years ago there was speculation among scientists and others that the condition had a genetic dimension, but since then that argument has been discredited along with the practical applications of eugenics, such as forced sterilization.


The white working class has attracted renewed scrutiny from politicians, the media, and academics after the 2016 election results were perceived as a resounding announcement from those ignored voters that they wanted their concerns addressed. By no means are white trash or rednecks any more than a minority of the white working class, and their votes comprise an even smaller percentage than that, since most of them do not habitually vote, or even register to vote. It is also untrue that white working class voters were the primary constituency of the Republican candidate elected to the presidency. There were not enough of them to install the Republican in office, any more than ethnic and racial minority voters alone made up enough of Barack Obama’s constituency to install him in office in 2008 and 2012. Nonetheless, politicians, the media, and academics unhappy with the 2016 election results have seen fit to blame the white working class, and by extension white trash and rednecks, for inflicting the current presidential administration of Supreme Leader on the country.

A 1937 photo by Dorothea Lange of two men walking toward Los Angeles, California. Ms. Lange took many photographs in her work for the Farm Security Administration (FSA), a New Deal agency.

There is no backlash to denigrating white working class people. Across the culture at the moment, it is a safe bet for people like academics, who must otherwise be extremely careful in navigating the cultural minefield of identity politics, lest they destroy their careers in the bureaucracy. Certainly there are some people who deserve criticism, and perhaps, as suggested earlier, that would include people who have made a lifestyle choice to be vulgar and offensive. Making such a lifestyle choice now, when people have greater access to information than ever before, can be considered more than ever a conscious decision rather than a cultural or genetic backwater that a person cannot escape. But the information they seem to prefer is fake news over real news, and it bolsters their apparent preference for ignorance over knowledge, bigotry over acceptance, and reality television over reality.

Near the end of A Face in the Crowd, a 1957 film directed by Elia Kazan and starring Andy Griffith and Patricia Neal, the public gets a peek behind the mask of the demagogue “Lonesome” Rhodes. There are many similarities between this film and today’s political and cultural environment, but one major difference lies in the public’s ability to register shock and disapproval at abysmal character flaws in its leaders. Some of the baser elements in today’s society would not only not be shocked by Rhodes’s revealing of his true character, but would approve of his remarks as a middle finger thrust upward on their behalf in defiance of elites.

 

Just about everyone seems to look down on someone else, to the point that it can be considered a universal human need. Elites are certainly not free from the need to look down on some other group, but in practice they have learned it is in their own interest to be circumspect about expressing their disdain, at least in public. Sneering at the white working class generally, without first splitting off the subset of white trash and rednecks, is a bad idea that highlights the disconnected and arrogant nature of elites, and it is behavior that will push white working class voters, once the foundation of the Democratic Party along with black working class voters, farther away from Democrats and more securely into the arms of Republicans, where they are given rhetoric they want to hear, but nothing of substance. Listening to people is the first step toward working with them. Loudly condemning them all as racist, misogynist white trash might demonstrate your purity to everyone for the satisfaction of your own smug self-righteousness, but it is hardly the way to win friends and influence people, a vocation otherwise known as politics.
— Vita

 
