Racing Ahead

 

In the 1965 comedy film The Great Race, loosely based on a 1908 race around the world, the lead characters drive racing cars powered by gasoline internal combustion engines. That the earliest cars used gasoline would seem to be beyond question considering how things developed through the rest of the twentieth century. It comes as something of a surprise, then, to learn that electric cars were quite popular in the early years of motor vehicle development, and it was an electric car that won the first closed circuit automobile race in the United States, in 1896.

Halfway in their race around the world, the characters portrayed by Jack Lemmon, Peter Falk, Tony Curtis, and Natalie Wood are marooned on a melting ice floe in the Bering Strait. Though certainly unintentional in 1965 when the film was made, there is some irony to their situation given the perspective of today’s warming climate.

Electric cars then all but disappeared until recently: as infrastructure improved and costs fell for gasoline engines in the early twentieth century, they overtook the electric option by 1920. The price of oil went down, giving a boost to the market for gasoline engines, while the crude state of battery technology limited the appeal of electric cars. Environmental impacts were not even a factor in the equation for most consumers or manufacturers until late in the twentieth century. Even then, the initial assessments of the impact of vehicular pollution were limited to local problems such as smog. It wasn’t until the last decades of the twentieth century that first scientists, and then the public, looked at the larger impact of tailpipe emissions on the global climate.

Now, in the early twenty-first century, after some halting steps by manufacturers to reintroduce electric cars, it appears they are gaining in popularity, particularly in places like China which face deadly levels of air pollution. Battery technology, the Achilles heel of electric cars, has made great strides lately. A question that doesn’t crop up often enough, however, is whether electric cars are as environmentally friendly as the manufacturers would have the public believe. In many cases, electric cars still run on power generated by burning fossil fuels; they merely give an illusion of green running because they are not themselves emitting noxious fumes. The noxious fumes are instead displaced to a coal or natural gas fired power plant many miles away. Out of sight, out of mind.

The coal fired Kintigh Generating Station in Somerset, New York, in 2007; photo by Matthew D. Wilson.

The batteries in electric cars don’t present as big a problem from an environmental standpoint as they used to, now that up to 98 percent of the materials are recycled. To make an electric car run truly green, the power source used to charge its batteries needs to come from renewable generators like wind and solar. Since most air pollution comes from gasoline internal combustion engine exhausts, it stands to reason that a major switch over to electrically powered vehicles running on renewable energy will make the single greatest impact on reducing air pollution, and with it the particulates and gases that are contributing to global warming.

Organizations like NASCAR and Formula One racing could do their part in flipping the switch by turning all or part of their circuits over to electric cars. Besides being a spectator sport, car racing has always served as a proving ground for manufacturers. The big racing organizations are still clinging to the old technology, which may be popular with fans who enjoy the noise and familiar smells produced by internal combustion engines, characteristics evocative, by long association, of high horsepower. To continue glorifying this outmoded technology means that well-known racing organizations have abandoned any meaningful proving ground aspect of their sport for the sake of pleasing the crowd with loud noise, fumes, and ludicrously low fuel efficiency. Never mind tomorrow, they’re living for today, come what may.


Younicos Solar Filling Station at Solon SE Headquarters in Berlin, Germany in 2009; photo by Busso V. Bismarck.

Newer racing organizations are stepping forward with their own electric car circuits. As drivers test and prove the newer technology on the race track, manufacturers should be able to improve efficiency of the batteries and perhaps drop the price of consumer models to be on a par with, or even cheaper than, comparably equipped gasoline powered cars. When that happens, electric cars will start to overtake the old technology, the same way they were overtaken in their earliest form by the internal combustion engine in the early twentieth century.

The crucial piece of the puzzle needed to solve pollution problems comes from the power generating source, not the cars. That may happen on a more individual level than on a corporate or government level, as people will find it convenient to do most of their car charging at home, where they can be assured of a cleaner source by installing their own solar panels or wind turbines. Waiting for government to promote the necessary infrastructure changes for cleaner power generation will not push improvements in transportation, decrease pollution, or ultimately limit the effects of global warming, at least not with the government currently in power.
― Techly

 

What Time Is It?

 

This week the Supreme Leader of North Korea and the Supreme Leader of the United States traded more schoolyard threats in their ongoing spitting contest, with the specter of nuclear war hanging in the balance. These two strange hair baby men really should sort out their differences between themselves and leave everyone else alone to go about their business. In January, the Bulletin of the Atomic Scientists published their annual Doomsday Clock, and they had moved the minute hand up from three minutes to midnight in 2016 to two and a half minutes to midnight this year (the clock has only hour and minute hands, no second hand). The scientists cited the possibility of just the kind of belligerence we witnessed this week. They know their Supreme Leaders inside out.

 

The scientists responsible for setting the hands on the Doomsday Clock do it only once a year, in January, and therefore we must guess where the hands would stand right now if the scientists were inclined to change them. Certainly closer to midnight. The last occasion for setting the hands this close to midnight was during the coldest part of the Cold War in the Eisenhower administration. At that time, nuclear proliferation was not what it is now, and the major concern for confrontation was between the United States and the Soviet Union. The Soviet Union was led by Nikita Khrushchev, a colorful man but certainly more mentally stable than Kim Jong-un, the current Supreme Leader of North Korea. There is no comparison worth making between Dwight Eisenhower and the current Supreme Leader of the United States other than that Eisenhower enjoyed golf and so does the new fellow, apparently to the exclusion of doing his job.
In this illustration by John Tenniel (1820-1914), the White Rabbit anxiously checks his pocket watch shortly before disappearing down the rabbit hole, followed by Alice, in the first chapter of Alice in Wonderland, by Lewis Carroll (1832-1898).

 

Back to the clock, when the hands were near midnight in the 1950s, the leaders of the time understood the risks. Both had seen the devastation of war up close. There is not that sense with the two leaders facing off now. Both are spoiled, privileged inheritance babies who want their own way no matter what pain it may cause others. The pain and suffering of others is not even part of the equation for them. Will Rogers, the homespun American humorist of the early twentieth century, had a comment about Congress which applies well to both Supreme Leaders in the current confrontation: “This country has come to feel the same when Congress is in session as when the baby gets hold of a hammer.” Time is running out.
― Techly

Richard Widmark portrays the monomaniacal captain of a US Navy destroyer in the 1965 film The Bedford Incident. On the bridge with Widmark in this scene are Sidney Poitier as a civilian photojournalist, Eric Portman as a German naval advisor for NATO, and James MacArthur as the hapless ensign at the rocket firing controls.

 

The Name of the Game

 

Board games don’t require the high technology of video games, but they are enjoying a resurgence in popularity nonetheless, and in many respects they are opposite in nature to video game culture. Playing a board game is a social activity for two or more people, usually in the same room. While there are electronic ways to play board games remotely, such as chess via email, most people still play the games face to face. Board games can be violent in the players’ imaginations only, and game graphics are usually reserved. Like all games, however, they lean toward competitiveness over cooperation, with some board games promoting a winner-take-all outcome.

Monopoly has been such a game, popular since the 1930s when Parker Brothers introduced it. Or at least that was the story until the 1980s, when it became clear that the game was older, and that Parker Brothers had not been first with it. Elizabeth Magie patented the game in 1904 (renewing it in 1924), with two sets of rules, both meant as teaching tools to demonstrate the value of cooperation over unfettered capitalism. She called her game The Landlord’s Game, and under one set of rules, called “Prosperity”, land was taxed and the goal of the game was to make sure the player lowest in resources eventually doubled them, in which case every player won. The other set of rules, called “Monopolist”, contained the elements of the game as we know it today.


Ms. Magie did not mass market her game, with the result that homemade variants popped up over the years, mainly in the parlors of educators at Eastern colleges, who enjoyed the game largely for its instructional value, as Ms. Magie had intended. She made no money from it. In 1932, Charles Darrow started marketing the game as his own after dropping the “Prosperity” set of rules and changing the name to Monopoly. It is unclear how much he knew about the game’s true origin. He sold the rights to Parker Brothers in 1935, and it was Parker Brothers that, after hearing rumors that Mr. Darrow had not invented the game, investigated, found Elizabeth Magie, and signed a deal with her. Parker Brothers made and sold Ms. Magie’s The Landlord’s Game, with her original sets of rules, but it apparently didn’t sell well and soon faded into obscurity.

The Landlord’s Game board, based on Elizabeth Magie Phillips’s 1924 patent; image by Lucius Kwok.

Meanwhile, Parker Brothers continued promoting Charles Darrow as the inventor of the game Monopoly, and that game sold quite well. All games teach lessons, whether overtly as in Ms. Magie’s Landlord’s Game, or in a way most of us take for granted, as in Mr. Darrow’s Monopoly. We take for granted that people are competitive to a fault and that capitalism serves the seemingly inherent nature of people to take all they can, and if their riches come at the expense of others, then so be it. We take it for granted, but it is not entirely true. Human nature as represented in the lessons of the game Monopoly is an aberration. Why then has it become such an immensely popular board game? Why isn’t The Landlord’s Game, with its depiction of a more equitable world, just as popular?

Perhaps the thing about games is they are just that – games. They allow for a certain amount of role playing, of behaving in a way a person would not behave in everyday life. Some games, video games in particular, take that release too far, or rather the players do. There are cooperative board games available, and they are becoming more popular, but they will most likely never equal in popularity their competitive cousins. There are other games, such as Anti-Monopoly, that try to redress the twisted lessons of Monopoly. The game Class Struggle goes even further. Will they ever become more popular than Monopoly, or even come close? Probably not. It’s good for players to stretch out in their games, however, and learn other lessons and other ways of being, rather than end up with a one dimensional, selfish view of the world. That view glorifies people who probably shouldn’t be glorified, the ones who call themselves “winners” even though they take everything and leave nothing for anyone else, just like at the end of Monopoly.
― Techly

 

Anything Is Possible

 

“Precisely because of human fallibility, extraordinary claims require extraordinary evidence.”
― Carl Sagan, speaking about alien abductions.

At a hearing last week of the space subcommittee of the House Science Committee, Representative Dana Rohrabacher (R-CA) asked NASA scientists if it was possible there was a civilization on Mars thousands of years ago. Kenneth Farley, a professor of geochemistry at the California Institute of Technology, answered there was no evidence of a civilization. Representative Rohrabacher could have been referring to the story on the Alex Jones InfoWars website last month about a slave colony on Mars, or he could have been referring to stories dating back to the 1970s about the “Face on Mars”, one of the supposedly artificial constructs among others in the Cydonia region of Mars. In any event, no one but Mr. Rohrabacher knows for sure.


Carl Sagan with a model of the Viking lander in Death Valley, California. Sagan (1934-1996) devoted the fifth episode, called “Blues for a Red Planet”, of his thirteen part 1980 PBS documentary Cosmos to Mars and the possibility of extra-terrestrial life.

Both of the above mentioned stories are what people generally call conspiracy theories. Mr. Jones in particular is almost always referred to by mainstream media as a conspiracy theorist. They use the term pejoratively, as a smear, and in Mr. Jones’s case they are probably within bounds for doing so, though the haughty contempt attached to their use of the phrase also serves to dismiss people whose objections to the standard media or government line on any story are offered with more substantial evidence and sounder reasoning. To call someone a conspiracy theorist is to lump that person in with Mr. Jones and his far out contemporaries.

The public must use critical thinking in evaluating conspiracy theories, the conspiracy theorists who propound those theories, and their critics who attack them. Unfortunately, critical thinking appears to be in short supply lately. Many fake news stories gain traction among the gullible in the online echo chambers where people go to read opinions and conspiracy theories they want to believe. It’s all fun and games until a half wit with an assault rifle decides to take matters into his own hands, as happened with the Pizzagate conspiracy theory circulating online last year.

 

It’s unrealistic, silly, and unconstitutional to try to shut down the websites peddling the most egregious conspiracy theories. Education in critical thinking is the only way to combat the spread of lies, but there will always be people immune to learning. All that can be done in their cases is to limit the damage they can cause. Conspiracy theorists do serve a positive purpose, however, in poking holes in an official story. Rulers and their mouthpieces in the corporate media have an interest in constructing stories for the public to cover up their crimes or unethical behavior. Critical thinking by the conspiracy theorists and those willing to hear them out serves an important watchdog role in such instances. Just because the government of a supposedly democratic republic such as the United States tells a story about something does not mean that story is entirely, or even partially, true, and to dismiss critics of the government’s story as conspiracy theorists becomes a cynical method for shutting down debate.

A wood engraving by an unknown artist that first appeared in Camille Flammarion’s L’atmosphère: météorologie populaire in 1888. The image depicts a man crawling under the edge of the sky, depicted as if it were a solid hemisphere, to look at the mysterious firmament beyond. The caption underneath the engraving (not shown here) translates to “A medieval missionary tells that he has found the point where heaven and Earth meet…”


A scene near the end of Oliver Stone’s 1991 film JFK, with Kevin Costner as New Orleans District Attorney Jim Garrison being filled in on a theory of the assassination by government insider, Mr. X (modeled on Fletcher Prouty), played by Donald Sutherland. The film was successful and was praised by critics, but major media and government figures labeled Stone a conspiracy theorist and took him to task for telling a story antithetical to the “Lone Gunman” findings of the Warren Commission.

There very well could have been a civilization on Mars long ago, though scientists contend it is unlikely. After all, we are still discovering – or rediscovering – ancient civilizations here in our own backyard on Earth. A present day slave colony on Mars is even more unlikely, to the point of absurdity. Scientists do hypothesize that life, in the form of microbes at least, may once have been present on Mars billions of years ago, before the planet lost most of its atmosphere and its liquid water either escaped into space or turned into ice locked within rocks. Some of that microbial life, according to the theory of panspermia, may have seeded itself on Earth long ago when meteorite impacts were more common in the solar system, and rocks flung into space from impacts on Mars found their way to Earth. In that sense, it’s possible we are all descended from Martian life. The scientific consensus, however, is that life originated on Earth, and if there is any cosmic seeding going on, then our planet is the one doing it. In the universe as we understand it, anything is possible, but in critically thinking about agreed upon facts known as evidence, we come to realize that some things are more likely than others, and some are even probable. In the most critical view, nothing is certain.
― Techly

 

We’ll Be Grading on a Curve

 

This past Wednesday, July 12, many internet companies and net neutrality advocacy groups participated in a “Day of Action to Save Net Neutrality”. They were attempting to influence Federal Communications Commission (FCC) Chairman Ajit Pai and his two fellow commissioners during the public comment period on reversing the 2015 FCC net neutrality rules, or as Chairman Pai would have it, “Restoring Internet Freedom”.
2017 caricature of Ajit Pai by DonkeyHotey.
The public comment period is open until mid-August and is all well and good, but based on Chairman Pai’s previous comments as well as recent remarks, the entire thing is merely a charade to satisfy bureaucratic regulations. After the public comment period is over, Chairman Pai and the other Republican on the Commission’s board will vote to roll back net neutrality and deal with the consequences in court over the next few years. The FCC board has space for five commissioners, but currently there are only three, two Republicans and one Democrat. [Editor’s note: For an accessible version of the Wikipedia page about the FCC, click here; the number of commissioners listed on the page may have changed since this post was written.]

A fantasy scene from the 1983 film A Christmas Story, with Peter Billingsley as Ralphie, and Tedde Moore as his teacher, Miss Shields.

 

The recent remarks from Chairman Pai that make a mockery of the public comment period have to do with his off-hand dismissal of the sheer number of genuine letters, calls, and emails in favor of net neutrality on the grounds that numbers will not sway him, only the content as he judges it. Oh. In that case, he may be judging these comments, or compositions, based on grammar, originality, penmanship, and interesting presentation. Please have them all on his desk by mid-August, as late comments will be marked down for tardiness.

Doris Day sings the theme song from her 1958 film co-starring Clark Gable.

 

Mr. Pai, formerly a lawyer for Verizon, has not shown as much critical judgment of the anti net neutrality comments the FCC has received, many of them astroturfed. Those comments must have been from the “D” students, and Mr. Pai, in the interest of fairness to everybody, but particularly to them, has decided to overlook their faults and boost their grades at least to “C”. The smarty pants crowd will take what marks Mr. Pai gives them in the interest of Restoring Internet Freedom to Verizon, Comcast, CenturyLink, AT&T, and other mega-millions Internet Service Providers (ISPs). There’s the level playing field everyone likes to believe in, and then there’s the reality of the playing field groundskeepers have groomed to suit the home team. To get all these mixed metaphors to agree, think of the grading curve fix which benefits major sport athletes in school. Verizon? A+!
― Techly

 

Cooling the Customers

 

Air conditioning and movies – or movie theaters – go together so well that it’s hard to imagine a time without the benefits of both together. In 1902, just as movies were getting started, Willis Carrier (whose company made the political news in 2016), a mechanical engineer, invented the first modern air conditioning plant to help a Brooklyn, New York, printing company solve a paper wrinkling problem at its facility. It wasn’t until 1925 that Carrier got together with a movie theater owner to install air conditioning at the Rivoli Theater on New York City’s Times Square. It was a match meant to be, and from then on the summer, which had been the poorest season for movie theaters, became the richest as people attended movies as much for the air conditioning as for the entertainment.

When The Seven Year Itch, starring Marilyn Monroe and Tom Ewell, appeared in theaters in 1955, most houses and apartments did not have air conditioning. In the scene before this one, they leave an air conditioned movie theater after viewing Creature from the Black Lagoon, a 3D monster movie the appeal of which, for them, was probably not as great as the cool comfort of the theater itself.

Home air conditioners were still unusual in the 1950s and 1960s, but by the 1970s most homes had some form of air conditioning, whether central or window units. Movie fans no longer flocked to theaters in summer only for the sweet relief of a few hours’ respite from summer’s heat and humidity. People continued going to see movies in theaters in summer because children were out of school and because air conditioning had, since the 1920s, established summer as the movie season. Watching movies at home was still unsatisfactory because of small television screens, low picture resolution and poor sound, and a lag of a year or more before Hollywood would release movies to television.


Meredith Willson, when he appeared on the Texaco Star Theater television program in 1967. Willson, who was born in 1902, coincidentally the same year that Willis Carrier invented modern air conditioning, had a long career spanning Broadway theater, Hollywood movies, radio, and television.

All that has changed in the past forty years, of course, starting with home video technology and the ability to either buy or rent movies for home viewing. Theaters felt the pinch, and old style movie palaces shut down, relegating the movie theater experience for the most part to shoe box multiplexes at suburban malls. Drive-in theaters, another summertime movie going experience from a bygone era, shut down along with the air conditioned movie palaces. Now in the last ten years the home theater experience, for people who can afford it (and it becomes more affordable all the time), has progressed to the point that a fair portion of movie fans feel little pulling them toward returning to theaters. Their homes are air conditioned, their televisions and sound systems have gotten bigger and better, and Hollywood releases movies for home viewing so quickly that only the most impatient fans aren’t happy to wait a little while.

The old movie palace experience was something special that can’t be matched by watching a movie at home, no matter how comfortable and technologically sophisticated circumstances at home have become. Comedies and big, crowd pleasing musicals in particular seemed to take on a frisson of excitement when viewed in a well appointed theater among other patrons who were similarly enthralled. Now that theater owners around the country have finally gotten the message and are starting to move away from the nothing special, cookie cutter mall multiplex toward building theaters that reestablish a grandeur only possible outside the home theater, it is questionable whether movie fans will return.

Meredith Willson’s most famous entertainment, The Music Man. Robert Preston, shown in this scene with Buddy Hackett, starred in the long running Broadway show before doing the movie version in 1962.

Some people have had time to drop the movie going habit, for one thing, and for another a relatively recent technology has come into the equation which affects their enjoyment of movies – cell phones. In the theater, cell phone users interfere with the other patrons’ enjoyment of the movie. At home, people who simply can’t do without their phones for even two or three hours are at least not annoying other paying customers, and for their own enjoyment of non-stop cellular connectivity there is always the pause button on their home theater remote control. Might as well stay home then to enjoy summertime movies, and keep your cool.
― Techly

 

 

Your Bitcoin or Your Files

 

The WannaCry, or WannaCrypt, ransomware that attacked mostly networked computers running unpatched Windows operating systems last month did not affect many non-networked home computer users, but that doesn’t mean those users will avoid future attacks. The computers of home users are often just as vulnerable as those used by banks, hospitals, and other large institutions. They are less likely to be attacked only because they aren’t generally tied into a larger network and because loss of their data is not critical. Home users also have less money, or access to Bitcoin, than large institutions, making an attack on them less worthwhile for hackers.

 

Cat using computer; photo by EvanLovely.
Any computer running any operating system connected to the internet is vulnerable to ransomware, malware, viruses, and a host of other exploits. Macintosh and Linux operating systems are somewhat less vulnerable than Windows, but not invulnerable. The same goes for the Android and iOS mobile phone platforms. Frequently updating an operating system with patches downloaded from the operating system provider is key to maintaining security. An equally important best practice is to avoid human error in daily computing, such as being wary of web links or attachments in suspicious emails, and even being careful of clicking on ads from unknown providers on sketchy websites. The internet is a teeming public square where pickpockets mix with everyone else, and where some side streets and alleyways lead to unwholesome places, increasing the likelihood of something bad happening.
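To make the wariness about web links concrete: one classic phishing tell is a link whose visible text names a trustworthy domain while the underlying address points somewhere else entirely. A minimal Python sketch of that check follows; the function name and the heuristic are purely illustrative, not drawn from any standard library or security product.

```python
from urllib.parse import urlparse

def looks_like_phishing(display_text: str, actual_href: str) -> bool:
    """Flag a link whose visible text names one domain while the
    underlying href points to a different one -- a classic phishing tell."""
    shown = display_text.strip().lower()
    if "://" in shown:                       # display text may be a full URL
        shown = urlparse(shown).hostname or ""
    if "." not in shown:
        return False                         # visible text isn't a domain; nothing to compare
    real = (urlparse(actual_href).hostname or "").lower()
    # Treat exact matches and subdomain relationships as legitimate.
    return not (real == shown or real.endswith("." + shown) or shown.endswith("." + real))

# A link that shows "www.mybank.com" but points elsewhere is suspect.
print(looks_like_phishing("www.mybank.com", "http://evil.example.net/login"))  # True
print(looks_like_phishing("mybank.com", "https://www.mybank.com/login"))       # False
```

A real email client would need to do far more than this, of course, but hovering over a link to compare the shown text with the actual destination is exactly this check performed by eye.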

 

All this seems like common sense and fairly common knowledge, so why are large institutions with professional Information Technology (IT) staff on hand nonetheless vulnerable to cybercrime exploits that home computer users who are conscientious about updating their software and careful when visiting the internet can usually avoid? Are the IT departments incompetent? The answer is they apparently do their best most of the time, like anyone else with a job to do, but their efforts are often hobbled by that second factor mentioned above – human error. And the larger the organization and the more computers tied into the network, the greater the chances for one small human error to multiply throughout the organization. IT specialists are also hobbled by the unwillingness of higher ups to let go of outdated operating systems like Windows XP. The WannaCry ransomware targeted unpatched, networked Windows XP computers.


From Woody Allen’s 1969 movie Take the Money and Run, a job interview presumably for an IT position, with a nod to the old TV quiz show, What’s My Line?

Here we have blame enough to go around for everyone: from the executives who, whether out of cheapness or reluctance to overhaul their company’s computer systems, failed to modernize; to the IT specialists who, whether from incompetence or overwork, failed to install vital patches to an outdated operating system; to the end user sitting at a computer who, whether out of ignorance or foolishness, clicked on a malicious link or fell for a phishing scam, and then passed it on to co-workers. What made the WannaCry ransomware especially vicious was its ability to exploit the very minimum of human error in order to replicate throughout a network. Computer experts are still not certain of the attack vector WannaCry used to gain initial access. The patch Microsoft issued months earlier should nevertheless have protected Windows XP computers, human error or no.

 

1940 Oldsmobile Station Wagon advertisement. You rarely see Woodies like this on the road these days!
Windows XP was Microsoft’s most popular operating system ever, and it’s understandable that many users are reluctant to let it go. There are a lot of reasons Microsoft has tried to move on from Windows XP, as popular as it remains, and at this stage those reasons, good or bad, believable or not, are beside the point. The fact is Microsoft is moving on. For computer users to cling to Windows XP at this point is like automobile fanciers owning vintage cars: yes, having a fine old car can be engaging, but don’t expect that there will be many qualified mechanics available to work on it, or that driving it on interstate highways will be a safe and effective means of travel in the 21st century. Windows 10, the up to date model of Microsoft’s operating system, has plenty of faults, among them being a data hog that is far too chatty with its home base so that it can mine the user’s personal data for sale, a lesson Microsoft learned well from Google. But at least it’s safely built for travel on today’s internet, the information superhighway, as Al Gore called it. Drive safely.
― Techly

 

Watching Out for Number One

 

After numerous high profile cases of questionable use of deadly force by the police in the past few years, the cry has gone up from the public and from politicians for more widespread use of police body cameras to augment the already prevalent use of dashboard cameras in police cars. The technology does not present a difficulty, since data storage capacities have skyrocketed and battery capacity in a compact device has increased enough to allow recording over an eight hour shift. The difficulty is with how human beings implement the technology and whether the technology will improve how police interact with citizens.

 

There is evidence that when police wear body cameras the incidence of police violence and abuse of authority declines. That is, the incidence declines when a rigorous protocol for the use of the body cameras is instituted and enforced by civil authorities and police management. In some places, the police have body cameras but their use is left too much up to individual officers, and that naturally leads to the officers recording only the encounters that they calculate will make them look good. A lax protocol like that amounts to none at all. The American Civil Liberties Union has put out an excellent article detailing the best ways to deploy police body cameras and the drawbacks their use may entail.
1974 cast photo from the television series Barney Miller. Clockwise from left: Ron Glass (Ron Harris), Jack Soo (Nick Yemana), Hal Linden (Barney Miller), Max Gail (“Wojo” Wojciehowicz), Abe Vigoda (Phil Fish) (back toward camera). The show took place in the fictional 12th Precinct in Greenwich Village, New York City. Over the years since its initial airing in the 1970s, police have praised the show as a more realistic portrayal of day to day police work than many higher octane TV shows and movies.

 

It’s not surprising that police behave better when they know they’re being watched. The question is why they bear watching. Certainly police work isn’t like warehouse work; it is often stressful, with the ever-present possibility of a dangerous encounter, and by its nature it involves dealing with other people every day in an environment that can be hostile. But that’s the job they volunteered to do; no one drafted them. That the work isn’t relatively placid like warehouse work is no excuse for officers acting like dangerous loose cannons when the going gets tough, and definitely not when they feel like going off on someone for some piddling reason that they will later claim “made them fear for their life”. As the saying goes, you knew the job was dangerous when you took it.

 

The former 41st Precinct Station House at 1086 Simpson Street in Foxhurst, The Bronx, New York, in the summer of 2007. The building was formerly known as “Fort Apache” due to the severe crime problem in the South Bronx; photo by Bigtimepeace.

The real problem is lack of accountability for officers who behave criminally, and a police culture that from academy training onward instills an “Us vs. Them” mentality. Body cameras are all well and good, and as a purely technological answer to the problem they are excellent, provided privacy issues for both the officers and the public are addressed. But body cameras will take the solution only halfway, if that. Until the public demands that criminal police officers face the kind of punitive and fiscal penalties everyone else in society must face, we’ll continue to see the same violent, bullying behavior from some cops. Paid administrative leave (a paid vacation for being a jerk!) and fines that fall on the taxpayers will not do it. How could anyone in their right mind, excepting a police union boss, expect otherwise?

The matter of police culture is harder to address. It starts with training and continues with taking away all the militaristic toys police departments have acquired in the past forty years, most of them in the past twenty. No, you are not a soldier on garrison duty in a hostile foreign country. You are a police officer – a peace officer, if you will – at home amongst your fellow citizens, friends, and neighbors, vile as some of them may seem to your law-abiding heart. Playing dress-up in GI Joe gear with full body armor and intimidating your fellow citizens with armored personnel carriers and other cool stuff should not be part of the job description. Changing that macho police culture won’t happen, however, until the public stops living in fear of every little thing and stops handing over far too much authority, money, and blind obedience to a group of men and women meant to be our servants, not our masters. Some of them unfortunately respond to that deference by puffing themselves up with arrogance and steroids, looking and acting like goons any sensible person would run away from, rather than the friendly cop on the beat, a fellow citizen instead of an overseer.
― Techly

 

 

How About That Free Lunch Now

 

The great thing about the internet is that it is interactive; interactivity is also one of the bad things about the internet. When people read paper newspapers, way back when, they were exposed to advertisements paid for by commercial establishments in the news and features sections, and to classified advertisements paid for mostly by individuals or small businesses in a section of their own. Paper newspaper advertisements were interactive only in the sense that the reader could choose to ignore them. This was reasonably easy for the reader because the ads themselves did not hop up and down, yell and scream for attention, obfuscate the actual content of the newspaper for a period, or otherwise make a nuisance of themselves and detract from the peaceful enjoyment of the newspaper by the person who had paid a dime or a quarter for it.

When newspapers and writers of other content moved to the internet, they still needed to make a living, of course, and naturally they turned to advertisers to help fund their efforts. Since there was no pay model for the internet like the one paper newspapers had, when readers either subscribed for home delivery or paid directly at street corner kiosks, publishers relied even more heavily on advertisers for income. For some reason, people had gotten the notion that internet content should be free, and rightly or wrongly that’s the way things developed. Here is where the interactive part kicked in and started an internet arms race.

Bob Dylan performs his song “Mr. Tambourine Man” at the 1964 Newport Folk Festival. Dylan’s guitar and harmonica rig is much like the getup buskers used then and today to make a few dollars for their efforts. All that’s missing here is the hat or guitar case for collecting money tossed in by passersby. Many small websites, like this one, have to either pass the hat by posting a “Donate” button, or hope for the best from advertising revenue, or both.

Advertisers realized that since the internet was interactive and didn’t just lie there after publication waiting to wrap fish like the old paper newspapers, they could jazz up their ads and, they thought, readers would pay closer attention and the advertisers would see higher returns. Great! Not all advertisers, just the ones who lacked any restraint, got their ads to hop up and down, to yell and scream for attention, to obfuscate for a period the content the reader was actually there to see, and to otherwise make nuisances of themselves in order to draw attention. It turns out people did not like that, particularly those with slow internet connections or limited bandwidth, which the sparkly new advertisements ate into, much to the hapless reader’s dismay. Enter software engineers with a retaliatory response.

The software engineers had some experience in combating opponents in the advertising field after having worked to swat away the pop-up army of advertisements that plagued internet users in the early days. One thing many advertisers have never been known for is restraint. Now here they were again, but instead of pop-ups they were employing twitchy, sparkly, pushy advertisements. The software engineers working on behalf of browser makers and internet users came up with ad blockers. Now all ads were blocked. Hah hah! Internet users had the option of whitelisting – or permitting – ads on a website in the options menu of their ad blocker, but who would ever bother to do that? Publishers noticed, however, that their internet ad revenue plummeted.
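The block-everything-then-whitelist logic described above can be sketched in a few lines. The domain names below are hypothetical examples, and real ad blockers match far more sophisticated filter rules than bare hostnames:

```python
# Minimal sketch of block-list-plus-whitelist logic: everything on the block
# list is blocked, unless the user has whitelisted the site being visited.
# All domains here are hypothetical illustrations.
BLOCKED_AD_HOSTS = {"ads.example.com", "tracker.example.net"}
WHITELISTED_SITES = {"news.example.org"}  # sites the user chose to support

def should_block(ad_host, page_host):
    if page_host in WHITELISTED_SITES:
        return False  # user permits ads on this site
    return ad_host in BLOCKED_AD_HOSTS

print(should_block("ads.example.com", "blog.example.com"))  # True
print(should_block("ads.example.com", "news.example.org"))  # False
```

The default-block design is why publisher revenue fell across the board: unless a reader takes the deliberate step of whitelisting a site, every listed ad host is refused.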

An emotionally fraught rendition of “Silver Springs” in a 1997 concert by Fleetwood Mac, which demonstrates why they continued to draw large crowds well after their heyday. The song, written and sung by Stevie Nicks, who as a songwriter ranks in the top echelon of 1970s and 1980s pop and soft rock, is a deeply personal revelation. Fleetwood Mac had by 1997 long passed their peak of popularity for album sales, but concert ticket prices for such an established group with an extensive catalog of hits remained high, from $20 to $50 for the cheap seats, to over $100 for the best seats. The internet works similarly, with an enormous underclass of websites barely making it, and several well established websites with large followings dominating the market.

Enter Google in the spring of 2017 with the Funding Choices program and their own ad blocker built into their Chrome browser, which in the past year has overtaken Microsoft’s Internet Explorer as the world’s most popular browser. But since Google makes the lion’s share of its revenue selling ads and marketing user information, why would Google be against ads? Because the obnoxious ads that prompted the development of ad blockers have poisoned the well for everybody, and Google, with its dominant market position, can dictate which ads will fly and which ones won’t.

The Funding Choices program is aimed at internet users, telling them they can pay to subscribe to a publisher’s content and go ad-free, or view the content free on condition they allow ads, which Google assures them it has vetted for good behavior. Google’s ad blocker built into Chrome is aimed at advertisers, telling them essentially that unless they allow Google to vet their ads for good behavior, those ads will not see the light of day on the world’s most popular browser. All of this would seem a boon to both internet users and publishers. But that depends on how much they trust “Don’t Be Evil” Google. Rather than turn over yet more power to Google, a company that has surpassed Microsoft not only financially but also in morally suspect behavior, perhaps the time has come for internet users to seek alternatives, not only for search but for the multitude of other applications Google has used to ingratiate itself as the public’s servant, the servant whose ear is always at the door. This website, for one, will seek alternatives to displaying Google ads. Oh, you weren’t even aware there were Google ads on this website?
― Techly

 

Light the Way

“Better to light a candle than to curse the darkness.” ― the Rev. William L. Watkinson (1838–1925)

The nonprofit organization Fight for the Future has set up a website called Comcastroturf, which allows people to check whether their names have been used surreptitiously to file anti-net neutrality comments with the Federal Communications Commission (FCC). The FCC had recently opened a public comment period in anticipation of rolling back net neutrality regulations, and within a short time it was inundated with anti-net neutrality comments that appeared to be generated by Internet Service Provider (ISP) astroturfers using hijacked subscriber lists. Comcast threatened Fight for the Future with a lawsuit, but backed down after it became obvious even to Comcast that the threat was a tone-deaf public relations debacle. Comcast blamed excessive vigilance by the company to which it outsources its brand defense.


Let there be light; photo by Flickr user Arup Malakar, taken in the Kamakhya Temple, a shakti temple on the Nilachal Hill in the western part of Guwahati city in Assam, India.

 

It’s clear from new FCC Chairman Ajit Pai’s remarks on net neutrality that he does not support the net neutrality regulations the FCC adopted in 2015. Since the rest of the now Republican-controlled board of commissioners will most likely go along with him when they vote on the regulations in September, the public comment period appears to be a mere formality. That apparently hasn’t deterred the ISP industry from generating a phony campaign to support the anti-net neutrality commissioners. Perhaps they were concerned about John Oliver’s pro-net neutrality campaign. Now Comcast and other industry giants have egg on their faces, and Chairman Pai has not escaped squeaky clean either: he has declined to question the validity of the astroturfed comments, while distracting the public by complaining about mean tweets.

 

There are customers of Comcast who complain endlessly about the company’s high prices and lackluster, even adversarial, customer service, yet do little or nothing to explore alternatives. They sit grumbling in the dark and won’t lift a finger to light a candle. Such people do a disservice to themselves, but since they seem to be satisfied with that in a masochistic way, the rest of us are left listening to their grousing while they do nothing to improve their situation. That shouldn’t be a problem for those of us who can get out of earshot. In the broader perspective, though, it is those dissatisfied but immovable customers who grant Comcast its monopolistic power in the marketplace, and that’s a problem for everyone, because it limits affordable options as smaller companies struggle to establish themselves against the obstacles thrown at them by industry giants and the legislators and regulators in their pockets.

 

Comcast and companies like it might behave better if they started experiencing mass defections. Comcast has been losing subscription television customers while gaining broadband customers, and cord cutting does not affect its bottom line if the customers doing so are merely moving their dollars from one part of the service to another. City dwellers and suburbanites have more options for internet service than people in the countryside, and so have fewer excuses for sticking with Comcast even while disliking the company intensely. Ultimately the choice comes down to deciding what to give up, at least in the short term. Even without streaming video, life goes on. To the majority of the Earth’s inhabitants, the whining of some Americans about how to get along without access to each and every National Football League (NFL) game, or the latest “must see” TV series, must appear ingloriously obtuse and selfish. These companies have too much power because we have given it to them. It’s like staying in an abusive relationship because you can’t imagine life any other way. Who knows, life untethered from the Comcast umbilical cord could be better than you think.
― Techly

 

Fresnel lens of the Pigeon Point Lighthouse, south of San Francisco, California; photo by Diedresm.

 

 
