The finale of the National Football League season comes next Sunday with the Super Bowl between the Tampa Bay Buccaneers and the Kansas City Chiefs at Raymond James Stadium in Tampa, Florida, and it marks the end of a season when relatively few fans attended games in person due to coronavirus social distancing restrictions observed by the league’s teams. Attendance at this Super Bowl will be limited to 22,000 fans in a stadium that can seat over 65,000. For hard-core fans used to watching games in person rather than on television, it must have been a peculiar season.
Crowd in the Polo Grounds grandstand for the final game of the 1908 baseball season, watching the visiting Chicago Cubs play the New York Giants. Library of Congress photo from Bain News Service.
To be at a stadium or ballpark for a game is to experience something beyond the game alone, which really can be viewed more intelligibly on a television screen or computer monitor from the comfort of home. The sports fanatic can spend hundreds of dollars for the experience, counting ticket price, parking, concessions, and other sundry expenses, and still prefers bearing those costs to staying home and watching the game for free or at very little cost. One hundred years ago, there were no such contrasting choices.
At the beginning of 1921, there was no broadcast medium at all for bringing sporting events to the masses. In the United States, radio broadcasts of sports began later that year, with the airing of a boxing match on April 11 in Pittsburgh, Pennsylvania, and a baseball game on August 5 from Forbes Field in Pittsburgh. In both cases, the broadcasting station was KDKA in Pittsburgh. On October 8, KDKA broadcast a college football game. The first American television broadcast of sports didn’t occur until May 17, 1939, when NBC covered a college baseball game in New York City. Hard-core sports fans didn’t get to listen to a sports talk radio show until New York’s WNBC started airing one in March 1964.
One hundred years ago, people either bought tickets to see sporting events or read about them the next day in a newspaper. Talking about sports was a firsthand endeavor limited to friends, family, neighbors, and co-workers. Now there are several options besides buying tickets for vicariously experiencing athletic contests, and with sports talk radio and television shows and social media, sports fans have many ways to gab on and on about their obsessions to familiars and strangers alike, both near and far.
One of the subplots from a 1995 episode of Seinfeld involving Patrick Warburton as David Puddy, the boyfriend of Elaine Benes, played by Julia Louis-Dreyfus. Michael Richards played Cosmo Kramer, and Jerry Seinfeld played a fictional version of himself.
Social norms of public appearance and behavior loosened after World War II, and particularly in the ’60s and ’70s. Over the 50 years from the ’30s to the ’80s, sports fans changed from men (they were overwhelmingly men in the stands) who attended games largely in suits and ties to people who wore casual clothing, often composed of merchandised parts of their favorite team’s uniform. Some went shirtless and painted themselves in their team’s colors. In the 1950s, only little boys and some working class adults wore baseball caps regularly. Now almost everyone wears one at least occasionally, and many of the caps bear team logos at a price. No one has to grow up anymore (or wants to), and sports merchandisers, who had very little business at all before the 1970s, are now counting money in the billions each year, even without sports fans filling the stands.
— Vita
The lack of capacity for critical thinking among some of the American electorate is nothing new. The French traveler Alexis de Tocqueville famously noted it in the two volumes of his book Democracy in America, published in 1835 and 1840. Almost 130 years later, the American historian Richard Hofstadter remarked upon it in two of his works from the early 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics. Both of Hofstadter’s works still apply today in complementary fashion as the 2020 election nears and Clueless Leader and his cult followers on the far right drop in greater and greater numbers off the edge of reality into the realm of psychotic fever dreams.
Mike Caulfield’s “Four Moves and a Habit” method for the detection of Fake News online. Infographic created by Shonnmharen.
The great difference between now and the early 1960s, when Hofstadter wrote about the propensity of some people for ignoring facts that didn’t fit their world view, is that those people now have access to the internet and to social media, where they can spread their diseased notions like a contagion. Rumors that once took days or weeks to spread, and in the process may have fizzled out when confronted by facts, now spread in minutes and hours in a continuous onslaught that drowns out facts. For those too intellectually lazy to engage in critical thinking, there has never been a better time for finding spurious rumors to prop up their dangerously bonkers ideas.
Most of the rumor mongering conspiracy theorists with dangerously bonkers ideas are now, and have always been, on the far right of the American political spectrum. It’s an ongoing feature of American life that the ruling class demonizes the far left, rightly suspecting that the far left would overturn its cushy lifestyle if it could, and in turning the focus of hatred and suspicion upon the far left, the ruling class has always had willing allies, or rather dupes, among the far right. Useful idiots.
These are the kind of people who adhere to QAnon conspiracy theories about the evil character of Democrats and Antifa partly out of credulity since they want to believe the stories, and partly because it titillates them that most reasonable adults, and particularly liberals, are outraged and appalled by the stories. For a third of the American electorate, an incapacity for critical thinking is displayed as a badge of honor, not of shame.
Too many right wing delusionists are willing, even eager, to use violence when they don’t get their way. In this they are aided and abetted by the ruling class, who use them as a cudgel against the far left and anyone else who questions the established capitalist order. Terrorism in this country has almost always come from the far right, not the far left, and for nearly four years now the current president has winked and nodded at right wing terrorists in this country. He has filled a powder keg with dangerous fantasies and then recently lit the fuse with his call out to right wing terrorists ahead of the election.
A sketch from a January 1979 episode of Second City Television, starring Andrea Martin, Catherine O’Hara, and Dave Thomas. In 2016, 52% of white women voted for the Republican presidential candidate, someone who would be incapable of understanding this SCTV sketch as satire.
In Richard Hofstadter’s examination of American history, there have been breakdowns in what may be considered the consensus of political views reconciling economic and cultural differences (though he himself chafed at being lumped with the post-World War II era “consensus” historians), but only one failure of consensus, and that led to the Civil War. Perhaps hope can be found in the realization that the truly dangerous right wing terrorists in this country are fewer in number than they would have everyone else believe. If the current president somehow gains four more years in power, however, that glimmer of hope may go dark, because violent reactionaries will become ever more emboldened, growing their numbers to become a visceral threat, sinister and close.
— Vita
Dan Robbins, inventor of paint by numbers kits, died recently at the age of 93. The kits Mr. Robbins invented became wildly popular for children and adults in the decades after World War II, and while sales faded in the last decades of the twentieth century, the kits never disappeared altogether. Now with the aid of the internet, paint by numbers kits have regained some popularity, and reproductions of works by famous artists such as Vincent van Gogh are available, widening the scope of painting kits available to hobbyists.
Portraits in the Countryside, an 1876 painting by Gustave Caillebotte (1848-1894), depicting the artist’s mother, aunt, cousin, and a family friend. With its large areas of uniform color and uncomplicated design, this painting would lend itself well to the paint by numbers treatment.
It hardly matters whether people consider themselves hobbyists or crafters when they pick up paint by numbers kits, or ship modeling, or knitting and crocheting, because the main thing is they are occupying themselves with a satisfying activity that often results in a useful or decorative object. The result doesn’t have to be art, nor does it have to be perfectly made. The value is as much to the maker as it is to the thing made, and possibly more. People engaged in a hobby or craft give themselves the gift of peaceful hours during which their minds and emotions can heal.
People who sneer at the dubious artistic value of a paint by numbers painting or a Bob Ross painting miss the point of those works. Hobby painters are as interested in the process as they are in the result, which often feels ancillary and even something of a letdown because it means the end of the process. If they somehow produce great art, then that’s a bonus. Most of what they produce will be schmaltz, but so be it. They are helping themselves and not hurting anyone.
Take a few moments to relax as Bob Ross paints an imagined landscape and imparts his views on life.
Mr. Robbins did a great service providing at least momentary joy and well-being for millions of people over the years with his paint by numbers kits, one of many hobbies and crafts contributing therapeutic benefits to those who took them up, and the gooey sentimentality of some of the subjects of those paintings hardly matters in the grander view. Losing oneself in a hobby or craft is better than watching television, or any of the numerous other electronic screens commanding everyone’s attention in today’s world.
— Vita
“Where wast thou when I laid the foundations of the earth? declare, if thou hast understanding.”
— Job 38:4, from the King James Version of the Bible.
Today is the 50th anniversary of the publication of Kurt Vonnegut’s novel Slaughterhouse-Five, or The Children’s Crusade: A Duty-Dance with Death. Many Americans are probably familiar with it because it has been assigned reading in high schools when it hasn’t been banned or burned by the outraged and the self-righteous. Being assigned reading tends to sap some of the enjoyment of reading, and in that case it might be a good idea to read the book again voluntarily, as an adult.
Mr. Vonnegut was most of all a Humanist, as he himself proclaimed, and the last thing any Humanist would claim is to also be a Saint. Looking back at Vonnegut’s work, the one feature that stands out as discordant from our modern perspective is his treatment of female characters, whom he usually portrayed without much depth, and sometimes unsympathetically for no good reason. That again is viewed from our perch 50 years in the future. Mr. Vonnegut was not out of step with his times in regard to men’s views about women, sad and embarrassing as that may seem to us now. Fifty years from now, who can say how people will view us for opinions and attitudes we hold in keeping with our own time?
An anonymous painting, possibly by Christian Wilhelm Ernst Dietrich (1712-1774), of a fire at Dresden Castle.
We must remember that until Slaughterhouse-Five came out in 1969, nearly every book and movie in Western culture depicted the Allies in World War II as the good guys, and the Axis as the bad guys, with little shading of gray to add any moral nuance. The Humanist in Mr. Vonnegut could not abide that state of affairs, particularly since he had been present as a prisoner of war at the Allied fire bombing of the German city of Dresden, a target which had virtually no military or political value. The primary reason Allied command ordered the fire bombing was to terrorize the civilian population. In doing so, the Allies sought to deal out righteous retribution for German bombing of English cities earlier in the war. Atrocities, in other words, were perpetrated to one degree or another by both sides, and that is the nature of war and part of human nature and cannot be avoided, no matter how much books and movies gloss it over and glamorize one side over the other. And so it goes – to borrow a phrase from Mr. Vonnegut.
Slaughterhouse-Five was not revisionist history, but a necessary corrective to over two decades of mostly superficial accounts of World War II, at least in the popular media. It joined John Hersey’s 1946 non-fiction book Hiroshima in telling of war’s cost in suffering and the capacity for cruelty, alongside acts of kindness. In 1970, a non-fiction book written by Dee Brown, Bury My Heart at Wounded Knee: An Indian History of the American West, was published and changed the national discourse about relations with Native Americans, a discourse which had been dominated for over a century by white people of European descent demonizing them.
American prisoners caught in the Battle of the Bulge in December 1944 march to their quarters in Dresden, Germany. In February 1945, Allied air forces fire bombed the city, killing as many as 25,000 Germans, mostly women and children. The 1972 film adaptation of Slaughterhouse-Five, directed by George Roy Hill, starred Michael Sacks as Billy Pilgrim, the character based on Kurt Vonnegut, and Eugene Roche as his friend Edgar Derby, the ranking soldier among the prisoners.
Important works by great writers and historians come along infrequently and, while nothing and no one is ever perfect, their overall worth to humanity becomes even more apparent over time than at initial publication. Mark Twain’s 1885 novel Adventures of Huckleberry Finn, another great work that has stood the test of time, has also been subjected to periodic bouts of righteous indignation and banishment by different groups for divergent reasons over the years. Certainly we cringe today at some of its language and at the attitudes Mr. Twain portrayed, but many readers, perhaps most, understand that at the heart of the novel is the growing respect and friendship between a white boy and a black man, which in its day was a radical idea that undermined social conventions. We are all prisoners of our time and cannot, like Billy Pilgrim, the central character of Slaughterhouse-Five, become unstuck in time. But we can be charitable and preserve and cherish the greater Humanist vision given us by Kurt Vonnegut and other writers whose works have stood outside of time, imperfect as the writers and their works, like we and our works, will always be.
— Vita
Customs and Border Protection (CBP) employees have been detaining journalists and immigration lawyers at checkpoints in Arizona and Texas and questioning them about their political beliefs. These are nothing more than intimidation tactics by government employees who don’t appear overly concerned that they work for all citizens of the United States, not merely the current presidential administration and its far right supporters.
CBP has long had overly broad authority, particularly after World War II, when Congress passed laws giving the agency the ability to regularly trespass on citizens’ rights under the Fourth Amendment to the Constitution. In 1953, without public review, the Justice Department set the zone within which CBP could play fast and loose with the Constitution at 100 air miles from the United States border. That’s 100 miles within the United States, all around the perimeter, an area encompassing nearly two thirds of the populace.
A sign at the January 2018 Women’s March in Seneca Falls, New York. Photo by Marc Nozell.
It’s incredible these laws and rules have stayed on the books as long as they have and have withstood review by the Supreme Court. The Supreme Court has often interpreted the Constitution with an eye toward sustaining the power of the government over the citizen, however, despite the recent miraculous lapse in its ruling on Timbs v. Indiana, which curtailed civil asset forfeiture, also known as cops’ legalized stealing of citizens’ property. That ruling can best be considered an anomaly, at least from the Court’s five conservative justices, who with an even more recent ruling, in Nielsen v. Preap, are back to their usual shoring up of police state encroachments on the Constitution.
George Carlin performing in 2008 in Santa Rosa, California, just months before he died. “You Have No Rights” is the closing bit, and for the album made from this Home Box Office (HBO) special, It’s Bad for Ya, he was awarded a posthumous Grammy. Warning: foul language.
Supposedly these laws are meant to be enforced against illegal immigrants, who after all are not citizens. In practice, the laws’ overly broad grant of authority allows enough room for CBP employees with a political agenda to harass and intimidate anyone they care to, citizens and non-citizens alike. The CBP employees can always claim some legal rationale for their capricious actions, and even after offering the flimsiest excuses, they know legal redress of their abuse of power will take years, if it comes at all. This is what happens when fear guides the writing of laws, giving too much authority to law enforcement agencies, and then a lawless presidential administration grasps the reins of all that power. Meanwhile the nation’s courts have too often upheld police prerogatives over citizens’ rights, eroding the meaning of those rights and mocking their supposed inviolability.
— Vita
“Only one in four households that is income-eligible for federal housing assistance receives any. The annual cost to taxpayers of the federal income tax deductions for home mortgage interest and property taxes, which mainly benefit relatively affluent households, is double what the government spends on all lower-income housing programs combined.”
— Stockton Williams, executive director of the Terwilliger Center for Housing at the Urban Land Institute.
On February 28, Oregon Governor Kate Brown signed into law a statewide rent control bill, the first of its kind in the nation. The provisions of the bill put a cap on yearly rent price increases at a percentage above inflation, and do not apply to all rental units. Tenants’ rights groups believe the bill is better than nothing and puts an end to price gouging in a tight housing market, and they will continue to push for a more comprehensive bill in the future.
Political cartoon from the July 7, 1894 issue of the Chicago Labor newspaper, showing the condition of the laboring man at the Pullman Company. The 1894 Pullman Strike was a pivotal event in redressing the imbalance between labor and capital during the first Gilded Age. In the current second Gilded Age, weakened labor unions have had difficulty increasing wages for members, and the ad hoc affiliation Fight for $15 has achieved piecemeal success.
Arguments over whether rent control laws really work in favor of tenants go back and forth between the usual advocates for the free market on one side and advocates for at least limited government intervention on the other. “Supply and demand” is the linchpin of the argument. Points less often noted are low wages and income inequality: too many people have too little money while the rich continue accumulating more for themselves. And with more money comes more power in equal measure.
Free market arguments ignore how over time the rich, with help from their friends in government, put their thumbs on the scales of capitalism, creating an ever more favorable environment for themselves. To conceal from the lower classes how they are being preyed upon, the rich and their enablers in academia and government concoct formulas such as “a rising tide lifts all boats” and “trickle-down economics”. The Earth is not an infinite place with infinite resources, however, and even if it were, the rich in their greed would still grab for themselves with one hand while swatting the lower orders with the other hand. In their pathology, it’s just as important that others haven’t enough as it is that they have too much.
The same Wall Street financiers and speculators who created the housing bubble and consequent financial crisis in 2008 are responsible for skyrocketing rental prices around the country. None of them went to jail or were even indicted and prosecuted, and they were free to take advantage of the mess they had created by using their wealth to buy up property at rock bottom prices, helping themselves to favorable government regulations they themselves had largely written. That is more than just putting a thumb on the scale; it is sitting on it like a fat cat. It’s not unusual for the rich to profit off an economic downturn because they have the money to buy when everyone else needs to sell to have any money at all. This latest example of the rich getting richer has simply been more blatant and egregious than in previous financial crises.
A World War II era sign declaring rent control rules in some localities, a program administered nationwide by the Office for Emergency Management during the war and for several years afterward to prevent price gouging.
Conservative pundits are likely to denigrate rent control laws as socialism, while praising the free market ideal of supply and demand in the housing market for setting rental prices. The problem they choose to ignore, or are possibly even ignorant of, is that the free market ideal has been a dead letter for a long time in America, if it ever actually existed outside of economics textbooks in the first place. What we have now is a crony capitalist system run by corporate and financial oligarchs who bend government regulations in their favor. They write the rules to benefit themselves. They ran the housing market into the ground, and then scooped up everything at bargain prices and started charging sky high rents. If renters balked at the high prices, it didn’t matter, because they had no other options. Meanwhile, the building industry limped along, maintaining the housing shortage that keeps rents high. Supply and demand economics of, by, and for the fat cats.
— Ed.
At Christmas time, the imperative phrase “charge it!” can mean one of two things: either buying a gift on credit, or making sure a battery powered gift is ready to go once the recipient unwraps it. Buying on credit has never been the best idea and can be a sign of financial distress, while using batteries to power toys and electronic devices of all sorts has gotten better over the years, with battery technology currently poised for another great leap forward.
Flintstones Bedrock City in Williams, Arizona, in September 2018. Photo by Don McCulley. That appears to be a stripped down version of the Flintstones’ human – or cartoon character – powered vehicle under the sign.
The need for batteries on Christmas morning made itself known in earnest after World War II, when the first battery powered toys arrived on the market. Those batteries were not rechargeable and lasted only a few hours at most before depleting and then becoming trash. No recharging, no recycling. The batteries themselves might have been relatively inexpensive, but replacing them time and again was not.
Now batteries are mostly rechargeable and mostly recyclable, and more importantly they have become vital to powering far more devices than toys, from communication devices almost everyone uses throughout each day of their lives to personal transportation that is moving toward similar ubiquity. And batteries play a big part in storing electricity generated by renewable sources such as wind and solar, and that electricity can in turn be used to recharge the batteries people use every day.
A cartoonish look at the works of an electric car. Illustration by Welleman.
All that burgeoning interest has attracted research and development dollars, the incentive being the production of batteries that run longer on a charge, are made of less toxic materials, are cheaper for consumers, and are lighter in weight and in environmental footprint. The race is on, and with many more things in everyone’s daily lives being powered by batteries than there were 70 years ago, the stakes are bigger than simply making toy cars go faster on Christmas morning.
— Techly
Since 1968, when the New England Journal of Medicine editors precipitously and unfairly saddled some people’s adverse reactions to monosodium glutamate (MSG) with the name Chinese Restaurant Syndrome, MSG has been stigmatized as a food additive that is apart from and somehow unhealthier than other food additives. The first person to report symptoms to the Journal was a Chinese-American doctor, Robert Ho Man Kwok, who complained of numbness at the back of his neck, general weakness, and heart palpitations after eating at a Chinese restaurant. On this slim testimony and that of several others, the Journal coined the phrase Chinese Restaurant Syndrome.
A type of kelp known as Dasima in Korea, and Kombu in Japan, is a key ingredient in Dashi, a broth from which Japanese professor Kikunae Ikeda identified the quality of umami in 1908 that led him to the discovery and production of MSG. Photo by freddy an.
Use of MSG is not limited to Chinese cookery, however, and it can be found in many processed American foods such as Doritos, which millions of Americans appear to consume regularly without complaint. It would be interesting to see if more people would attest to adverse reactions to eating Doritos if they were made aware the product contained MSG. It is listed among the ingredients on the package, and under its most recognizable name, too, rather than one of the many names that can hide its presence, such as autolyzed yeast.
This is not to say no one can have a real adverse or allergic reaction to MSG. But for just about any ingredient in food there are some people who react badly to ingesting it. The main thing to remember is that in scientific studies of MSG, as opposed to the purely anecdotal stories that appeared to satisfy the editors of the New England Journal of Medicine in 1968, no one has found that MSG is any more dangerous than any of a multitude of other food additives. If it were as dangerous as some people appear to believe it is, not only would the Food and Drug Administration (FDA) likely take it off its generally recognized as safe (GRAS) list, but thousands or even millions of Asians and Asian-Americans would be suffering every day from its effects.
Yet Asian chefs and home cooks continue to add MSG to their meals. They are either perverse in their determination to eat the possibly unwholesome ingredient, or they are unconvinced by the nearly hysterical denunciations of it coming from some people in North America and Europe. Given the ubiquity of MSG in highly processed foods that Americans eat and enjoy every day, any real or imagined adverse reaction to it could just as easily be called American Junk Food Syndrome. There is already one name for that, which is Obesity. American food processors discovered around the time of World War II that MSG was a useful flavor booster for otherwise bland or even flavorless foods like canned vegetables and corn snacks. MSG by itself does not encourage obesity, but its overuse in helping to make some rather unpalatable and non-nutritious foods delicious does contribute to obesity.
Shavings of Katsuobushi, a preserved and fermented skipjack tuna used in Dashi, the umami broth from which Professor Ikeda first isolated MSG. Photo by Sakurai Midori.
At the same time as food scientists and agribusinesses were discovering how to make cheaply made, highly profitable junk food flavorful, they were also inadvertently taking the flavor out of healthful foods by manipulating them to improve qualities like pest resistance, durability in shipping, or tolerance of confinement on factory farms, all at the expense of flavor and nutrition. Those practices yielded bland, watery supermarket produce, and meats needing seasoning and breading and all sorts of treatments in order to taste like much of anything. It’s not all that mysterious why shoppers, particularly poor ones who can’t afford to seek out higher quality ingredients, turn to highly processed, highly flavorful foods, even at the cost of poor nutrition and cumulative destructive effects on their health.
In this country, people like to blame the victim. After all, free enterprise and free choice means people don’t have to eat junk, doesn’t it? It’s also useful to have an Other to blame, as in Chinese Restaurant Syndrome. The sensible thing would be to teach children in schools about moderation in all things, including sprinkling additives on their food. A little bit of MSG on already healthful food gives an umami flavor boost and has not been shown to do harm to the great majority of people who eat it that way. MSG put on every unwholesome, processed food cannot be healthy since the bad effects of poor quality food combine with excessive amounts of this otherwise relatively harmless additive. Enormous amounts of any additive are probably not healthy, not just MSG. School administrators could stress in the curriculum healthful eating instead of allowing vending machines full of snacks, sodas, and sugary fruit drinks in the hallways. In the case of young people at least, free enterprise and free choice should take a back seat to learning healthy habits.
— Izzy
The title of this post is taken from the Ogden Nash poem “Reflections on Ice-Breaking”. Mr. Nash was writing about parties for adults, and he concluded the poem with the lines “But liquor/Is quicker.” For children at Halloween, the first lines about candy are the only ones that matter. After a bout of shepherding their children around the neighborhood for trick or treating, parents are more apt to grasp the last lines.
The association of candy with Halloween is relatively recent, and is wholly artificial in origin. Candy makers and sellers gravitated toward the holiday in the early twentieth century as a way to increase sales in the long lull between Easter and Christmas, and by the post World War II era modern Halloween traditions were pretty firmly in place. Children still might get some healthier treats like fruit in their treat bags or baskets, but after numerous tampering scares factory wrapped candies became the predominant treat, much to the relief of children everywhere.
Friandises, the French term for goodies. Photo from French Wikipedia by Ficelle.
Because of ongoing fears about tampering and because of related worries over children interacting with strangers, the Halloween trick or treat excursions of mid and late twentieth century America have most likely peaked and are now on the wane, probably to the relief of parents everywhere. People don’t know their neighbors as well as they used to, and the whole ritual has become an unnecessary risk. Children still enjoy dressing up in costumes, and of course the idea of an enormous haul of candy still exerts a siren’s pull on them. Like Christmas and its excesses, Halloween has become a time for parents to set and teach limits, if they are so inclined. It’s difficult to do when the greater society constantly tugs in the other direction.
While the children are celebrating with restraint, willingly or not, their parents might take a moment to remember the candies they enjoyed as children. One candy, Necco wafers, may be no more after this year. The company that made them and other treats shut down earlier in the year, and it’s unclear whether the new owners will continue production of any of them. The New England Confectionery Company (NECCO) had been in business since 1901, after consolidating the operations of several other candy companies. Its flagship candy, Necco wafers, had been in production since before the Civil War.
That’s Charlie Brown in the sheet with too many holes, and Lucy in the witch costume, in the 1966 television special It’s the Great Pumpkin, Charlie Brown.
Like a lot of candies, candy corn for instance, Necco wafers arouse strong feelings either for or against them. Perhaps it’s because our opinions about candies are mostly formed in childhood that we develop lifelong allegiances or antipathies to them. The parents who affectionately remember candies their children find revolting may be likewise turned off by all the gummies and other newly popular treats. Though he likely didn’t have Halloween in mind when he wrote “Reflections on Ice-Breaking”, Ogden Nash managed to put it all in perspective anyway, and in only seven words.
— Izzy
The urge to leave a personal mark on the relatively permanent structures around us is strong enough to prompt some people to break laws against vandalism and trespassing and paint, mark, or scratch into public view an announcement of their existence. Is it graffiti, street art, or defacement? We can see these markings on buildings that are a few thousand years old, but beyond that, as a recently published scientific paper asserts, everything gets ground up, mixed together, compressed, and dispersed, making it hard to determine anything conclusively about human civilization and its discontents as expressed in graffiti. Caves have preserved paintings on their walls for tens of thousands of years, but that artwork tells a very different story of humanity before what we consider civilization.
Sometimes graffiti addresses social and political issues, though more often the concerns of the artists are more mundane. It’s overstating to call a scatological scribbler a street artist, or even a graffiti marker. It doesn’t take much imagination or skill to scrawl the image of a phallus across a stone wall, whether it was done two thousand years ago or yesterday. Similarly with personal insults, the boorish nature of which has not changed at all over the centuries. The best graffiti is illustrative of a unique frame of mind, an altogether personal view of the world. The same definition can apply to art.
In Kansas City, Missouri, a 2008 rendition of the graffiti made famous everywhere during World War II by American servicemen. Photo by Marshall Astor.
John Cleese as the Centurion and Graham Chapman as Brian in the 1979 satirical film Monty Python’s Life of Brian.
Tagging, which is marking or painting of initials, nicknames, or symbols, often to mark territory, is not a particularly enjoyable or meaningful form of graffiti to anyone but the marker and others who need to interpret the signs. Tags rarely exhibit any wit, and are usually straightforward signs meant for specific groups rather than the larger society, hence their often cryptic appearance to those not in the know. The signs say, among other things, “Keep out”, “This is our territory”, or “I am here”. The humorist Jean Shepherd, in a video essay about roadside features in New Jersey, speculated about the confusion of future archaeologists as they attempt to decipher the graffiti of our times, attaching to it perhaps more importance than it warrants. The entire television special is a treat, featuring Mr. Shepherd musing with philosophical delight about what constitutes art as he observes all the commercial kitsch he finds along a New Jersey highway. All our artifacts and graffiti will be gone in a millennium, of course, crumbled into disconnected bits, but for now they say “I am here”, and “We were here”.
— Vita