Meatless Mondays Are Painless

 

Vegetarian or vegan substitutes for meat are not necessarily aimed at people who don’t eat meat, but rather at those who do: if those people eat less meat, the environment will benefit, the animals being raised for meat will certainly benefit, and the meat eaters themselves will be healthier. The problem has been in developing a suitable substitute for meat at a reasonable cost and without creating a Frankenmeat with all sorts of nightmarish unintended consequences. Reading the reviews coming from the latest Consumer Electronics Show (CES) in Las Vegas, Nevada, it appears the company founded by Stanford University biochemistry professor Patrick O. Brown, Impossible Foods Inc., has gotten the formula right with the latest iteration of their Impossible Burger.

 

Other meat substitutes, such as the Boca Burger, have been geared toward vegetarians who wanted to retain some of the meat eating experience, and they were and are pathetic imitations. Attending a backyard cookout where everyone else was eating real beef burgers and then making do oneself with a Boca Burger or equivalent was an experience similar to being relegated to the kids’ table, with miniature versions of the adults’ dinnerware. Why bother? There are a multitude of vegetarian and vegan recipes available for real dishes, making it unnecessary to have to settle for dry, grasping imitations of what the grown-ups are eating.

The vegan Amy Burger at Amy’s Drive-Thru in Rohnert Park, California. Photo by Tony Webster. Amy’s Kitchen started in 1987 making organic and vegetarian frozen and convenience foods for sale in supermarkets around the country, and in 2015 opened the Rohnert Park restaurant, their first.

The point of the Impossible Burger is not to satisfy vegetarians or vegans who miss eating meat, but to replace meat in the much larger percentage of the population who are committed carnivores. Those people might have tried one of the previous meat substitutes out of curiosity, and they were right to scorn them as alternatives they could never embrace and still satisfy their nutritional and taste requirements for meat as well as a more nebulous, deep psychological need satisfied by eating meat. Professor Brown and his Impossible Foods colleagues intend their meat substitute to fulfill all those needs, and apparently they are well on their way to succeeding.

Replacing meat in the diet of the world’s people is enormously important, and probably the biggest single step toward ameliorating climate change other than reducing fossil fuel use, which would incidentally also be a byproduct of reducing livestock farming. Animal suffering would also be greatly relieved, because the situation now is horrific and getting worse as Americans and other Western peoples eat meat at least once a day, and in some places for every meal, and hundreds of millions more people in China and India aspire to the same relatively affluent lifestyles of Westerners. Factory farming of animals will become a larger industry still as the demand for meat goes up worldwide.

A scene from the 2002 film My Big Fat Greek Wedding, directed by Joel Zwick and written by Nia Vardalos, who also portrays the bride, with John Corbett as the groom. Eating meat is such an ingrained part of personal identity and social custom that most people give it little thought. Anyone who has ever been vegetarian or vegan, however, soon becomes aware of how others react to that news with bafflement or acceptance or, oddly, hostility, because refusal to eat meat is to such people a repudiation of their hospitality and identity, and possibly an indictment of their morality if the chief reason for not eating meat is because of animal suffering or the environment. It’s interesting that often the best way to smooth the ruffled feathers of meat eaters upset over learning of a vegetarian or vegan in their midst is to tout the health benefits of giving up meat, a reason that will usually gain their understanding and assent.

Consumers want more meat even though it’s not healthy for them. People will also eat more sugar than is good for them if they have the money and the opportunity. These desires are hard wired into human beings, and while some people can overcome them through will power, most cannot, or even want to try. For those people, the majority, a meat substitute must come at a comparable price to real meat and satisfy their needs for taste and nutrition, along with the most difficult need of all, but probably the most crucial: the carnivorous kernel in the brain that is the cause of all the social customs around eating meat or not eating meat. Give those people that, and the climate and the environment will be better for it, the animals all around the earth will be better for it, and those meat eaters themselves will be better for it, whether they understand and acknowledge it or not.
— Izzy

 

Through a Glass Darkly

 

“11 When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things.
12 For now we see through a glass, darkly; but then face to face: now I know in part; but then shall I know even as also I am known.
13 And now abideth faith, hope, charity, these three; but the greatest of these is charity.”
— The Apostle Paul in a letter to the Corinthians, 1 Corinthians 13:11-13, from the King James Version of the Bible.

DNA kits are very popular now, both for people ordering them for themselves and for those giving them as gifts. Sales of kits doubled in 2017 over all previous years, and have increased again in 2018 over 2017. Interest appears to derive mostly from curiosity about immediate ancestors, on which the kits do a good job of enlightening people, and secondarily about genetic health risks, on which the kits dealing with the subject deliver mixed results, needing confirmation from a medical professional.


One area of controversy with the results, at least for Americans, has come from links to African ancestry for European-Americans, and links to European ancestry for African-Americans. Most European-Americans, or white folks, get results that include some ancestry going back to Africa two hundred years or more, usually less than 10 percent of their total genetic makeup. Most African-Americans, or black folks, get results that include around 25 percent European ancestry.

Portraits of six generations of the Sternberg family in Jiří Sternberg’s study, Český Šternberk Castle, Czech Republic. Photo by takato marui. Tracing ancestry is easier in the purebred lines and close quarters of Old Europe than in the melange of ethnicities and transcontinental migrations of the New World.

Compared to the ethnic homogeneity of most Old World countries, Americans are mutts, and the melting pot was particularly active in the years of heavy immigration from Europe from the mid-nineteenth century to the early twentieth century. Africans brought into the country as slaves until the Civil War eventually made up a larger proportion of the total population through that period than they have since, averaging close to 20 percent of the total for the first hundred years of the republic, and settling to a range of 11 to 13 percent of the total population afterward. It shouldn’t therefore surprise white people taking DNA tests to discover they have at least a small percentage of relatively recent African ancestry.

It is interesting, however, that DNA test results for black people yield an average of 25 percent European ancestry. It is not surprising there has been mixing of the races, despite laws against miscegenation going back centuries. What is striking is that black people have a much higher percentage of European ancestry than white people have of African ancestry, and yet black people are still and always considered black. This is Pudd’nhead Wilson territory, in which even a tiny percentage of African blood is tantamount to an entirely African genetic heritage. In America, once a person has been accepted into white society, it requires a considerable amount of African genetic input, along with other factors involving economics and relationships, for a person to fall from grace, as it were, from whiteness to blackness.


In this 1896 illustration by Ernst Haeckel (1834-1919), “Man” is precariously at the top of a tree encompassing all life on earth.

The grade has always been uphill, on the other hand, for black people to be accepted into American society, always predominantly white, no matter how much they were genetically European. Initial branding of blackness meant staying black in the eyes of white society until extraordinary circumstances or subterfuge intervened. In all this, what the majority of people, black and white, seemed to have missed was that racial differences were minuscule, along the lines of one half of one percent of the genetics every human being shares. Race itself is an artificial concept, a social construct, rather than a real biological divider.

It’s all in the mind, and those who would reinforce race as a divider of the human species have to perform mental and ethical gymnastics to justify their beliefs since science won’t do it for them. The idea that DNA test results with some percentage of African ancestry are showing merely what goes back millions of years for all of humanity also does not hold up. Yes, everyone on earth does have common ancestors in Africa, but that is not what the makers of DNA home test kits design them to illustrate, since they typically only research genetic relations going back several centuries, that being within the time frame for which they have reliably detailed information on background in their databases.

The immigration scene of young Vito Corleone, played by Oreste Baldini, in Francis Ford Coppola’s 1974 film The Godfather: Part II. Unlike Vito Corleone when he matured, the majority of immigrants, then as now, do not end up engaged in dangerous and unlawful activities, despite what rabble rousing politicians want everyone to believe.

There also appears to be an assumption among even open-minded white people that since Africa is the cradle of all humanity, then Africans themselves must be genetically closer to that cradle than the rest who emigrated to far continents. African ancestry noted in the DNA test results must, they reason, be hearkening back to long ago ancestors white people share with everyone on earth. No. The test results show genetic input from recent African ancestors. And those recent African ancestors have evolved along with everyone else on earth, including Europeans, triggered by similar environmental and social changes pushing them to adapt. The European discoverers should not be so quick to flatter themselves with ideas of inherent superiority that they lose sight of how other societies have adapted quite well under unique circumstances without prizing discovery and conquest above all else as the sine qua non of human existence.
— Ed.

 

 

A Taste Sensation

 

Since 1968, when the New England Journal of Medicine editors precipitously and unfairly saddled the adverse reactions some people reported to monosodium glutamate (MSG) with the name Chinese Restaurant Syndrome, MSG has been stigmatized as a food additive that is apart from and somehow unhealthier than other food additives. The first person to report symptoms to the Journal was a Chinese-American doctor, Robert Ho Man Kwok, who complained of numbness at the back of his neck, general weakness, and heart palpitations after eating at a Chinese restaurant. On this slim testimony and that of several others, the Journal coined the phrase Chinese Restaurant Syndrome.

A type of kelp known as Dasima in Korea, and Kombu in Japan, is a key ingredient in Dashi, a broth from which Japanese professor Kikunae Ikeda identified the quality of umami in 1908 that led him to the discovery and production of MSG. Photo by freddy an.

 

Use of MSG is not limited to Chinese cookery, however, and it can be found in many processed American foods such as Doritos, which millions of Americans appear to consume regularly without complaint. It would be interesting to see if more people would attest to adverse reactions to eating Doritos if they were made aware the product contained MSG. It is listed among the ingredients on the package, and using its most recognizable name, too, rather than one of the many names that can hide its presence, such as autolyzed yeast.

This is not to say no one can have a real adverse or allergic reaction to MSG. But for just about any ingredient in food there are some people who react badly to ingesting it. The main thing to remember is that in scientific studies of MSG, as opposed to the purely anecdotal stories that appeared to satisfy the editors of the New England Journal of Medicine in 1968, no one has found that MSG is any more dangerous than any of a multitude of other food additives. If it were as dangerous as some people appear to believe it is, not only would the Food and Drug Administration (FDA) likely take it off its generally recognized as safe (GRAS) list, but thousands or even millions of Asians and Asian-Americans would be suffering every day from its effects.

 

Yet Asian chefs and home cooks continue to add MSG to their meals. They are either perverse in their determination to eat the possibly unwholesome ingredient, or they are unconvinced by the nearly hysterical denunciations of it coming from some people in North America and Europe. Given the ubiquity of MSG in highly processed foods that Americans eat and enjoy every day, any real or imagined adverse reaction to it could just as easily be called American Junk Food Syndrome. There is already one name for that, which is Obesity. American food processors discovered around the time of World War II that MSG was a useful flavor booster for otherwise bland or even flavorless foods like canned vegetables and corn snacks. MSG by itself does not encourage obesity, but its overuse in helping to make some rather unpalatable and non-nutritious foods delicious does contribute to obesity.

Shavings of Katsuobushi, a preserved and fermented skipjack tuna used in Dashi, the umami broth from which Professor Ikeda first isolated MSG. Photo by Sakurai Midori.

At the same time as food scientists and agribusinesses were discovering how to make cheaply made, highly profitable junk food flavorful, they were also inadvertently taking the flavor out of healthful foods by manipulating them to improve qualities like pest resistance, standing up to shipping, or tolerating being confined on factory farms, all at the expense of flavor and nutrition. Those practices yielded bland, watery supermarket produce, and meats needing seasoning and breading and all sorts of treatments in order to taste like much of anything. It’s not all that mysterious why shoppers, particularly poor ones who can’t afford to seek out higher quality ingredients, turn to highly processed, highly flavorful foods, even at the cost of poor nutrition and cumulative destructive effects on their health.

Dr. Joe Schwarcz of McGill University’s Office for Science and Society talks about the MSG controversy.

In this country, people like to blame the victim. After all, free enterprise and free choice means people don’t have to eat junk, doesn’t it? It’s also useful to have an Other to blame, as in Chinese Restaurant Syndrome. The sensible thing would be to teach children in schools about moderation in all things, including sprinkling additives on their food. A little bit of MSG on already healthful food gives an umami flavor boost and has not been shown to do harm to the great majority of people who eat it that way. MSG put on every unwholesome, processed food cannot be healthy since the bad effects of poor quality food combine with excessive amounts of this otherwise relatively harmless additive. Enormous amounts of any additive are probably not healthy, not just MSG. School administrators could stress in the curriculum healthful eating instead of allowing vending machines full of snacks, sodas, and sugary fruit drinks in the hallways. In the case of young people at least, free enterprise and free choice should take a back seat to learning healthy habits.
— Izzy

 

Voting Should Be Easy

 

Over 75 percent of the American people have smartphones, and since voter participation in elections hovers around 50 percent of eligible citizens, the idea has arisen of increasing turnout by letting people vote with their smartphones. This year, West Virginia is trying out smartphone voting on a limited basis. The biggest concern with this practice is ballot security from smartphone to tabulating facility, usually a government office such as in a county courthouse. The medium used for that transmission would, of course, be the internet.

Pedestrians in the Rahova neighborhood of Bucharest, Romania, on October 27, 2014, days before the first round of the Romanian presidential election on November 2. Photo by J Stimp.

 

Now the internet is many wonderful things, but numbered among them is not airtight security for the general user. Some users haven’t the faintest idea about or concern for the security of their system, whether it be on a desktop or laptop computer, a tablet or a smartphone. It’s clear that the integrity of internet voting by smartphone or any other device would need to be maintained by a third party, since the users themselves are unreliable.

The voting system would have to be capable of freezing out “man-in-the-middle” hacks, which have historically been the greatest vulnerability of internet communications and the most commonly exploited by hackers. Think of it as the postal system, in which Party A mails a letter to Party B by entrusting it to Party C, in this case the United States Postal Service, with the understanding that in between point A and point B no one will intercept and read it, save perhaps a Postal Inspector who can show probable cause.
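The standard defense against a man-in-the-middle altering a message in transit is message authentication: the sender computes a cryptographic tag over the contents, and the receiver rejects anything whose tag no longer matches. Here is a minimal sketch of the idea using Python's standard library; the shared key and ballot format are invented for illustration, and no real voting system is this simple.

```python
import hashlib
import hmac

def sign_ballot(ballot: bytes, key: bytes) -> str:
    """Compute an HMAC-SHA256 authentication tag over the ballot contents."""
    return hmac.new(key, ballot, hashlib.sha256).hexdigest()

def verify_ballot(ballot: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag and compare in constant time to foil timing attacks."""
    return hmac.compare_digest(sign_ballot(ballot, key), tag)

# Hypothetical key, provisioned to the voter's device out of band.
key = b"shared-secret-provisioned-out-of-band"
ballot = b"precinct=12;choice=candidate-a"
tag = sign_ballot(ballot, key)

# The untampered ballot verifies; one altered in transit does not.
assert verify_ballot(ballot, tag, key)
assert not verify_ballot(b"precinct=12;choice=candidate-b", tag, key)
```

A real system would use public-key signatures and an encrypted channel rather than a single shared secret, since distributing one key to every phone would itself become the weak point, but the principle is the same: tampering in the middle invalidates the tag.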

 

The internet has never been even as secure as the postal system. More often it has been like the party lines that used to exist on some phone systems around the country. Until the security problems can be fixed, smartphone voting is unlikely to see widespread use. The safest system for voters is still paper ballots filed either by mail or in person at a polling place. Voting should be easier, not more difficult, as all the voter suppression laws passed by Republican controlled state legislatures have made it, with the idea that low turnout favors their candidates.

Voters wait in line to cast their ballots in the U.S. presidential election in Philadelphia, Pennsylvania, on November 8, 2016. Note how some are looking down at their smartphones, a common sight in public places now. Photo by Voice of America News.

Relatively few people are motivated to spend a long time waiting to vote in a queue that may keep them outdoors in bad weather, though some do appear willing to endure similar conditions in order to purchase the latest iPhone. Smartphone voting is a great idea for increasing participation in elections, but sadly it is one that needs work before becoming wholly viable, if it ever does. Until then, voters can still bring along their smartphones to their polling places to keep themselves entertained while they wait.
— Techly

 

All Bottled Up

 

The great engineering writer Henry Petroski, who has explored subjects as mundane as pencils and as sweeping as American infrastructure, always brings out the human element in both the design and the use of engineered products, and it is that aspect of his work which makes it fascinating instead of dryly technical. Mr. Petroski reminds us that the products of engineers and designers are wrought by fallible humans for the use of other fallible humans, and since humans have not changed greatly over the past 300 years or 3,000 years, then ultimately engineering and design are always processes of back and forth and of trial and error, even after a university education has instilled thousands of years of experience into its practitioners. Things can and do still go wrong, largely because of human error and human desires.

 

Take the humble bottle cap, or “crown” as it is known in the trade, and twist it off. Can’t do it? Then it’s a pry off cap, or crown, and requires an implement for dislodging it. Why the different designs? Isn’t there a best practice, and if so why don’t all makers of products put into bottles adhere to it? Specifically, bottlers of beer, some of whom use twist off caps, while others use pry off caps. Isn’t beer beer, no matter how it’s bottled? According to craft beer makers, no. William Painter, an American mechanical engineer, invented the pry off cap in 1892, and it remained the standard means of capping bottled beers and sodas until the 1960s, when twist off caps with good sealing capacity came on the market.

The prop genie bottle, with stopper, from the 1960s television show I Dream of Jeannie. Photo taken in June 2011 by Eva Rinaldi.

Major bottlers soon replaced their pry off caps with twist off, and customers were satisfied, since the beers and sodas they bought were not flat from carbon dioxide leaking out of the liquid inside the bottles. There may have been slight leaks of oxygen into the bottles, but not enough that the average drinker of major brand American beers would notice. When craft beer makers first came onto the market in a big way in the 1980s and 1990s, many of them decided to eschew twist off caps in favor of pry off caps for several reasons. The machinery for pry off caps was cheaper than for twist off caps, which suited small scale brewers operating on tight margins. Then there was the slight difference in the tightness of the seal between pry off and twist off, which was more of a factor thirty years ago than it remains today after improvements in twist off cap seals. Lastly there was the impression that twist off caps were for cheap, indifferent domestic lagers, while most European imports still came with old fashioned crowns or stoppers, and an impression of higher quality along with their higher prices, an impression which lingers to this day.

It is perhaps the last reason, the purely psychological one, that retains a foundation for using pry off caps, and of course it amounts to pure snobbery. The snobbery over twist off caps is even worse when it comes to consumer perceptions about wine. The craft brewers contradict themselves, too, when they plump for selling their beers in cans instead of bottles. They claim, with good reason, that because of improved linings for aluminum cans there is no longer any reason for consumers to be prejudiced against buying beer in cans. Fair enough, but just as with twist off caps for bottled beer, there remains the perception among consumers that beer in cans is inferior not merely because of possible taint from the container, but because for years beer in cans was the cheap, low quality preference of working class people.

Chubby Checker had a big hit in 1961 with “Let’s Twist Again”, the follow up to his 1960 hit “The Twist”. Some folks may feel the follow up is the better song, but it doesn’t matter much in the end and comes down to personal preference.

Let’s stop pretending then and admit the main reason to crown a beer bottle with a pry off cap is to send a signal to the consumer that what’s inside the bottle is quality stuff. It may well be, but snobbery around rituals should not be the determining factor. That’s as bad as wine snobs who get upset over twist off caps even though vintners and wine aficionados alike claim twist off caps are an improvement to the quality of wines not meant to be aged extensively but drunk within a few years of bottling. And as to craft sodas in bottles with pry off caps, that is merely an excuse to charge a premium for a product of dubious value. At least beer and wine producers can lay claim to some health benefits for their nectars, despite their consumption by mere mortals who allow their preconceived notions to get in the way of enjoying a good drink.
— Techly

 

Talking Heroes

 

Arizona Republican Senator John McCain died on August 25 after a long battle with brain cancer, and since then there has been much discussion nationwide of his role as an American hero both for his service in Vietnam and as a political figure afterward. Less noticed was the 63-month jail sentence imposed on former National Security Agency (NSA) contractor Reality Winner on August 23 at a federal court in Georgia for supposedly violating the Espionage Act of 1917. Ms. Winner had in early 2017 turned over to online investigative news outlet The Intercept classified documents detailing how the Russians had meddled in the 2016 presidential election. For many people and for Ms. Winner herself, what she did was more whistleblowing about malfeasance in the United States government than espionage on behalf of a foreign power, because the NSA obviously knew of the meddling but for reasons it won’t specify sat on that information.

 

2013 Twin Cities Pride Parade in Minneapolis, Minnesota, in support of whistleblower Bradley (later Chelsea) Manning. Photo by Tony Webster.

Reality Winner is the latest in a recent series of whistleblower defendants to be charged by the government under the Espionage Act, starting in the Barack Obama administration. The most notable whistleblowers charged have been Army Private First Class Bradley (now Chelsea) Manning in 2010, Central Intelligence Agency (CIA) officer John Kiriakou in 2012, and NSA contractor Edward Snowden in 2013. Ms. Manning and Mr. Kiriakou have served time in prison, and Mr. Snowden lives as an asylum seeker in Russia. The Espionage Act was always a draconian piece of legislation open to abuse by authoritarians in power, but it is only in the past ten years that those authoritarians have enlisted it to hammer down on whistleblowers to intimidate others into silence.

Calling whistleblowers national heroes in no way takes anything away from Senator McCain. Rather, it broadens the concept of heroes to include those whose patriotism included the courage to speak out against abuses of patriotism and authority by those in power. Sitting quietly by while a foreign power meddles in American elections is not patriotism, and neither is putting a lid on military abuses in Iraq or condoning torture by CIA agents or spying on American citizens at home. Whistleblowing on those abusers and their actions is true patriotism, while using the heavy hand of the Espionage Act to prosecute the whistleblowers is another abuse of government authority.
— Vita

To those principled individuals bothered by abuse of authority and ethical dysfunction within any system the two options available are fighting or selling out, as illustrated in this scene near the end of the Mike Nichols film Catch-22, with Alan Arkin as Yossarian, Martin Balsam as Colonel Cathcart, and Buck Henry as Colonel Korn.

 

Mind Your Peas

 

The federal government sends out mixed signals about dietary health by promoting the establishment of fast food restaurants in poor city neighborhoods on the one hand, and then on the other hand advocating healthier eating by limiting consumption of fast food. It boosts the use of cheese in fast food items, and then suggests consumers curtail their dairy consumption. It works hand in glove with ranchers in the beef industry by leasing grazing rights to federal lands at minimal cost, and then warns the public off eating too much red meat. That’s a lot of taxpayers’ money wasted on bureaucrats working at cross purposes with each other.

Dane county farmers market
The Dane County Farmers’ Market in September 2007 on the grounds of the state capitol building in Madison, Wisconsin. It is the largest producers-only farmers’ market in the nation. Photo by Kznf.

 

People are so inured to confusing messages about what’s healthy to eat and what isn’t that most of them pay little mind to medical experts and bureaucrats, or rather they take their advice with a grain of salt, and that would be in the form of sea salt or Himalayan salt for foodie elites, and regular old table salt for everybody else. Like everything else in America over the past thirty or forty years, food culture has split into two halves, something akin to the haves and have nots. There are the foodie elites of the professional and upper classes, and then there is everybody else, from the lower middle class which is frantically scrabbling to keep from sliding down into the working class, which is itself struggling to stay one step ahead of poverty.

Americans can make a quick, cheap meal of sorts from a one or two dollar box of macaroni and cheese mix. For some, meals like that are their only option. It’s disgraceful that people of limited means should have to bear the disdain of people with nearly limitless means because their diet is based on calorie value per dollar over nutritional value. The poor and the economically struggling don’t have the luxury of being absolutely sure of their next meal. As to how the well off view their meals, anyone who has ever worked as a table busser or as one of the waitstaff in a high end restaurant can attest to the tremendous amount of food wasted by the patrons, even though they may be spending for one meal what a working class person can expect to earn in a day. The upper classes have that luxury because they have the security of knowing there’s more where that came from.

Sid Caesar and Imogene Coca as The Hickenloopers, Charlie and Doris, visit a health food restaurant in a skit from Your Show of Shows, with appearances by Howard Morris as another customer and Carl Reiner as a waiter.

Unlikely as it seems, the fast food restaurants scorned by foodie elites as obesity enablers for the great unwashed may hold the key to turning things around for people who can’t afford to buy groceries at Whole Foods, aka Whole Paycheck. Taking their cue to keep promotions of healthier options low key so as not to arouse the suspicions of poorer customers whose purchases are based on calorie value per dollar, yet feeling increased pressure from public health groups to offer healthier foods, fast food restaurants increasingly change ingredients and practices in a balancing act to satisfy both constituencies. No one will ever claim that a cheeseburger and fries are healthier than a homemade meal of vegetables from a farmers’ market, but given the realities of human psychology and the country’s current economic conditions, demonizing those who choose to eat the former more often than the latter is ultimately unhelpful in lessening the obesity epidemic, while reinforcing the widening economic inequality that is driving it.
— Izzy

 

Too Hot for School

 

There never was any truth to the notion that schools closed in the summer so that farm children could help out with chores at home. The real reason had to do with urban schools having low attendance in the summer and with teachers and administrators wanting a summer break to escape city heat in the days before air conditioning, as well as to use the extended break to pursue avocations or take temporary jobs. Farm children were needed at home in the spring for planting, and then again in the fall for the harvest. While it’s true farm work never slacks off entirely, particularly when animal husbandry is involved, there still were lulls in the summer and in the winter when children could attend school. Through most of the nineteenth century, a short school year was sufficient for farm children who had no ambitions in learning beyond the sixth or eighth grade. Farm children who had greater ambitions resorted to supplementing their learning on their own when they could, much like what we know about how Abraham Lincoln learned to become a lawyer.

 

The modern summer break came about instead from the needs of urban school administrators and the upper and middle economic class students and their families who supported many of the schools. The needs of poor students and their families, as always, hardly entered into the concerns of the rest of society. Before school attendance became compulsory in the late nineteenth century, urban schools were open year round, but often were only half full, and even less than that in the summer. School administrators eventually came around to following the model of colleges by closing for the summer so that students and teachers could pursue other interests outside the baking cities, leaving behind only enough staff to help students who needed to take extra courses of learning during the break. Public health officials added their approval to emptying out the schools in summer because they deemed the hothouse conditions unhealthful in general, and not conducive to learning in particular. By the early twentieth century school administrators had generally adopted the summer break, which started in late May or early June, and ended in late August or early September.

A 1940 Works Progress Administration (WPA) poster promoting reading and library use upon returning to school in September after the summer break.

The system appeared to work well for most of the twentieth century. Rural schools synchronized their schedules with those of their urban counterparts so as not to be left behind as it became increasingly clear that a high school diploma was the minimum academic achievement necessary in modern society. The tourism industry could count on a steady source of both customers and labor during the two to three month summer break. The American public school system ranked highly among those of other industrialized nations, even with its extended summer break. Then in the late twentieth century alarm bells started sounding about the supposed failings of that highly successful system, the details of which are beyond the scope of this article, and so in an effort to increase academic rigor, or at least appear to do so, school boards have been eroding the summer break, largely on the back end.

A 1913 “Back to School” cartoon by Bob Satterfield (1875-1958) that captures how most children have always viewed the occasion.

 

In many school districts, fall semester classes now start in the first weeks of August, even though school may not have ended until mid-June, leaving less than two months for the summer break. And yet academic achievement appears to be falling, at least among the middle and lower economic classes. That, too, is another article for another day. For today it is sufficient to point out that the public school system does not exist in isolation from the greater society, and lackluster academic achievement by students cannot be remedied merely by making them sit at their desks for more days every year.

The problem is one of quality, not quantity. Society as a whole is fracturing, and the problems with poor learning begin and end in the home. The long summer break adopted by the twentieth century public school system was an excellent compromise that worked well for nearly everyone except families with both spouses working outside the home. That presents a difficulty today, too, but the answer is not in charging the public schools with child daycare duties and calling that increased academic rigor. It’s not. August is too hot for school, air conditioned facilities or not. August is for causing students anxiety about the imminence of schools reopening when they start seeing “Back to School” sale advertisements, which now also draw the attention of their teachers, who too often feel pressed to use their own money to buy supplies for their students. July is too early for a return of that unique schooldays anxiety, especially when schools closed only a few weeks before, in June.
— Vita

 

It Grows without Spraying

 

A jury at San Francisco’s Superior Court of California has awarded school groundskeeper Dewayne Johnson $289 million in damages in his lawsuit against Monsanto, maker of the glyphosate herbicide Roundup. Mr. Johnson has a form of cancer known as non-Hodgkin’s lymphoma, and it was his contention that the herbicides he used in the course of his groundskeeping work caused his illness, which his doctors have claimed will likely kill him by 2020. Hundreds of potential litigants around the country have been awaiting the verdict in this case against Monsanto, and now it promises to be the first of many cases.

Migrant laborers weeding sugar beets near Fort Collins, Colorado, in 1972. Photo by Bill Gillette for the EPA; it is now held by the National Archives at College Park, Maryland. Chemical herbicides other than Roundup were in use at that time, though all presented health problems to farm workers and to consumers. Roundup quickly overtook the chemical alternatives because Monsanto represented it, whether honestly or dishonestly, as the least toxic of all the herbicides, and it overtook manual and mechanical means of weeding because of its relative cheapness and because it reduced the need for backbreaking drudgery.

 

Monsanto has long been playing fast and loose with scientific findings about the possible carcinogenic effects of glyphosate, and the Environmental Protection Agency (EPA) currently sides with Monsanto in its claim that there is no conclusive evidence about the herbicide’s potential to cause cancer. In Europe, where Monsanto has exerted slightly less influence than in the United States, scientific papers have come out in the last ten years establishing the link between glyphosate and cancer. Since Bayer, a German company, completed its acquisition of Monsanto in 2018, it remains to be seen whether European scientists will be muzzled and co-opted like some of their American colleagues.

 

The intensive use of glyphosate herbicide to remove all ground vegetation in olive groves on Corfu, a Greek island in the Ionian Sea, is evidenced by the large number of discarded chemical containers in its countryside. Photo by Parkywiki.

The scope of global agribusiness sales and practices that is put at risk by the verdict in Johnson v. Monsanto is enormous. From Monsanto chemist John E. Franz’s discovery of glyphosate’s herbicidal properties in 1970 to today, use of the herbicide has grown to occupy the preeminent place in the chemical arsenal of farmers around the world and has spawned research into genetically modified, or Roundup Ready, crops such as corn, cotton, and soybeans. There are trillions of dollars at stake, and Monsanto and its parent company, Bayer, will certainly use all their vast resources of money and lawyers to fight the lawsuits to come.

Because scientists have found traces of glyphosate in the bodies of most of the Americans they have examined for the chemical over the past 20 years, as foods from Roundup Ready corn and soybeans spread throughout the marketplace, they have inferred that its presence is probably widespread in the general population. That means there are potentially thousands of lawsuits in the works. Like the tobacco companies before them and the fossil fuel industry currently, agribusiness giants will no doubt fight adverse scientific findings about their products no matter how overwhelming the evidence against them, sowing doubt among the populace and working the referees in the government.
— Izzy

 

Everyone Knows it’s Windy

 

The Trump Baby balloon that floated over London, England, last Friday was the culmination of efforts on the part of graphic designer Matt Bonner and a team of political activists and balloon fabricators who wanted to make a statement about the petulant and childish temperament of the current American president. As a mocking indictment of his destructive behavior, it is an effective piece of work. The activists plan to have the balloon shadow its real-life angry baby model as much as possible wherever he travels around the world.

The Trump Baby balloon rises over London’s Parliament Square. Photo by Michael Reeve.

Large balloon caricatures came about with the work of Tony Sarg on the Macy’s Thanksgiving Day Parade in the 1920s. Mr. Sarg was a German-American puppeteer who took the concept of marionettes and simply turned them upward and inflated them, though the comparison ends there because the guide ropes for a balloon caricature do no more than tether and control it, as opposed to the thin wires that puppeteers use to manipulate the movements of small marionettes.

The technology for creating large balloons with discretely modeled features like arms and legs has changed over the years, of course, with the biggest difference coming in the planning stage, where designers can now model the character with 3D animation on a computer, eliminating some of the trial and error involved in the design and fabrication of earlier balloons. Experienced engineers at the fabricating plant can then examine those computer designs and make or suggest alterations that will improve the balloon’s stability when floating overhead and streamline its manufacturing, all while changing the designer’s intent as little as possible.

The statue of Winston Churchill in London’s Parliament Square. Photo by Braveheart.

These protest balloon caricatures appear to be gaining popularity, and it’s easy to see why, since they make an impact over a wider area than a hand-held placard and they can show up around the world as needed with a relatively small support team. An excellent graphic design can also generate revenue for the protest movement through merchandising. The main difficulty in deploying the balloons is in securing permission from government officials, which ought not to be that much different from acquiring the usual permits for a protest, other than stipulating a maximum height for the balloon when it is in the air.

Since the balloons are not intended to float higher than about 50 feet, conflicts with aviation should be minimal. The main obstacles aloft to safe deployment, besides high winds, are things arising from the ground such as power and light poles and electrical and communications cables. Let’s hope these symbols of protest continue floating freely wherever there’s enough helium and a need for them, as a reminder to everyone that many powerful public figures need to have the air let out of them, not necessarily for their benefit, since it can be all but impossible to deflate their often massive egos, but for ours as citizens in a still relatively free society.
— Techly

 
