By Any Other Name

Some political labels are, at best, chameleons. It’s not their fault. They are complex constructs that inspire different responses in the minds of different people, involving innumerable stories that persist over decades. Their adaptive coloring matches the conceptual background provided by the observer, a background that either existed prior to their arrival or one that was freshly constructed according to political prejudices. Definitions and connotations vary widely and shift with the partisan currents, drifting in and out of favor among the majority of voters even as those who use a particular label to define themselves don’t alter their preferences or policies.

In the last half of the twentieth century the term “liberal” started as a highly popular label, associated with such efforts as the New Deal and Civil Rights legislation and such Great Society reforms as Medicare and the War on Poverty. The decline of liberalism may have begun with the War in Vietnam and the failed presidential campaign of the consummate liberal politician Hubert Humphrey. It further lost favor on the left with the rise of corporate centrism and the confusingly mis-titled neoliberal philosophy. But the lowest point in terms of public esteem followed not long after the 1980 election of Ronald Reagan. Reagan was regarded as an anti-government ideologue, but he was actually opposed only to liberal government. His infamous quote was “The nine most terrifying words in the English language are, ‘I’m from the government and I’m here to help’.” But if that “help” was to be in the form of policing efforts or military power he was more than happy to expand the governmental role and any related deficit expenditures. His attacks on big government were limited to liberal policies intended to improve the lives of individual people and to promote equitable treatment. In that effort he repeatedly denigrated the term liberal and those who professed liberal values, and the demonization of liberals has been a strong feature of conservative thought ever since. The right wing’s extremely oversimplified framing has for decades even conflated liberalism with socialism and communism and, through them, with totalitarian control.

At least the term “liberal” has fairly consistently been associated with policies on the left side of the political spectrum, even among most of its detractors. An older term, “populist”, has been used to describe individuals as varied as Huey Long, George Wallace, Ross Perot, and, in this most recent decade, both Donald Trump and Bernie Sanders. The original Populist movement of the late nineteenth century provides minimal guidance here, given that it was an extremely broad and diverse campaign. It began primarily as a rural, farm-based revolt against corporate malfeasance and monopoly power, and while it retained a strong anti-oligarchic bent, it soon expanded to incorporate such diverse and often contradictory issues as industrial worker rights, racial equality, exclusion of minorities, eugenics, dominion theology, and anti-communism. The mix of policy preferences often depended on the location of the local sub-group. As the movement consolidated and populist politicians gained nationwide recognition, the dominant policies became dependent on the specific leader of the moment. As a result, anyone who now considers themselves a populist must specify which individual populist leader they support, and people tend to skip the label and go directly to that individual. That often gives populism the appearance of a cult of personality. The only consistent characteristic of populism seems to be that it is an ideological movement supported by a large collection of ordinary people; it is, in short, popular.

The problem with using populism as a label can be illustrated by looking at whom the media most often referred to as populist in the decade leading up to 2020. On one hand we have Bernie Sanders, who is, indeed, the charismatic leader of a movement that strongly opposes oligarchic control of government and promotes specific reforms that would reduce corporate political contributions and lobbying, expand voting opportunities, and improve the lives of average people by reducing individual debt, guaranteeing health care, and expanding union power, infrastructure, and the social safety net.

The other current politician referred to as populist is Donald Trump, an oligarch who has made broad claims about improving the economy, increasing employment, and reducing corruption, but who put forward almost no detailed programs, other than cutting taxes, to achieve those goals. The contrast between these two “populists” is extreme. The Sanders effort is closer in spirit to the original grassroots populist vision of the late nineteenth century. The Trump campaign is, if anything, closer to the vague populist leadership of George Wallace, even in its inclusion of, and support for, an energized coalition of white supremacists. During the 2020 election season many media reports used “populism” as a derogatory or dismissive term, whether they were referring to Sanders or Trump. That tendency tells the reader or listener more about media preferences than about the characteristics of the specific populist leader in question. As for the candidates, neither one has applied the term to himself or to his followers.

The Sanders campaign leads us to another political term with widely divergent connotations: “socialist”. Whereas, in recent years, “populist” has been applied without modification (or definition) to two strikingly different individuals, “socialist” has been used to apply two strikingly different representations to a single individual. It is the true observer-defined chameleon. Among progressives, and in general among younger voters, “socialism” is now viewed as a positive concept, connected to the stable social democracies of Europe and to popular social institutions such as Medicare, Social Security, and public education. Although some pundits have expressed concerns about how the word would play among older voters and independents, in many ways support for Bernie Sanders has been enhanced by his embrace of the self-applied socialist label.

The leadership of the Republican Party, at the same time, sensed an opportunity in having a prominent Democratic candidate self-identify as a socialist. Since then they have done their best to promote two questionable concepts. First, they have worked to strengthen and exploit any and all existing negative biases against socialism, associating it with extremes such as totalitarian communism and with the struggles of countries such as Cuba and Venezuela. Second, they took that perversely skewed popular interpretation and applied it not only to Sanders but to the entire Democratic Party, including, of course, former Vice President and candidate Joe Biden. The Republican strategy has been effective on the right, raising the fear level among those who still view the entire spectrum of leftists as bogeymen, and in that way it has undoubtedly provided strong motivation for GOP base voters.

Much of what passed for political discourse in the 2020 election cycle demonstrated that labels can be used to influence voters without any reference to specific policies or issue preferences. Especially in campaign ads, political labels and the vague inferences and exaggerations attached to them were commonly used to obscure issues, replacing accurate description with hyperbolic misrepresentations of policy and in effect bypassing any real discussion of candidate plans and platforms. The avoidance of issues and solutions is one result of the failure, or often the refusal, to agree on meaningful definitions for common political labels, just as that failure is itself one result of the desire to avoid real issues. As long as people and their leaders are willing to promote and believe oversimplified distortions of such labels, these problems will continue.


Customary Collapse

As I begin writing this, in late September 2020, I have just finished viewing the installation of the casket of Ruth Bader Ginsburg in Statuary Hall in the United States Capitol building. There were distinctive features in this ceremony; for example, the choice of Denyce Graves to perform two songs and Rabbi Lauren Holtzblatt to give the eulogy. The eulogy itself, containing elements of Hebrew religious rites and prayer, reflected the fact that Ginsburg was the first Jewish person as well as the first woman to lie in state in the building. The room was also very sparsely occupied, with a much smaller number of attendees present and chairs widely spaced in deference to the Covid-19 pandemic.

However, the overall procedure and setting tell of something other than difference or uniqueness. The coverage begins with the arrival of a hearse at the east steps of the Capitol building. An honor guard of nine men in dress uniforms, representing every branch of the U.S. military, marches out to the hearse and in well-rehearsed, coordinated movements removes the casket from the vehicle. They carry it to the foot of the stairs and, in command-led moves taken at the rate of about one step every three seconds, raise it slowly up the 33 steps to the entrance to the central rotunda. Moving to a rhythmic “hup, hup” that echoes through the quiet structure, the honor guard marches from there through the rotunda and, turning to the left, enters Statuary Hall, the former chamber of the House of Representatives. In that large semi-circular room, with the full-size bronze and marble figures of more than 100 prominent U.S. citizens looking on, the casket is carried to the center and placed on the black-draped Lincoln catafalque, a platform first “hastily constructed” in 1865 and used to support every casket placed in state in the U.S. Capitol since then. A brief ceremony follows before those in attendance line up to pay their individual respects to the casket as they leave.

This process as viewed on television might be characterized as one plodding sequence after another, a composite spectacle performed by small groups of actors who move slowly and deliberately through a much larger scattering of participants who stand relatively immobile as they wait for the completion of one element after another. From one performance to another it varies only minimally, whether it is used to honor Ginsburg, or Representative John Lewis two months earlier, or Representative Elijah Cummings, the first Black man to receive the honor, in 2019, or former president George H. W. Bush in late 2018, or Rosa Parks, who was “laid in honor” rather than “in state”, but with the same procedure, in 2005. The choice of eulogy and music varies, but other than that it is as if the specific individual in the casket were almost irrelevant to the ceremony. That consistency, of course, may be one of the most important symbolic functions of the entire process. Consistency is also, I might note, one of the most important aspects supporting the continuity of any government, perhaps especially of any government dependent on “the will of the people.”

As a country we must recognize how much we depend on continuity and repetition, on the recycling of procedures, ceremonies, and other conventions that are maintained either by tradition or by law. We select our leaders through procedures that are controlled by individual counties and vary in many details, but that are governed by a common set of expectations and rules. The winners of our elections are sworn in with one set of ceremonies, and a different ritual inaugurates the beginning of each legislative session. The sessions themselves are governed by Robert’s Rules of Order augmented by additional rules specific to each legislative chamber. Interactions between and within the varied federal agencies, the Department of Justice and Environmental Protection and Homeland Security and the White House, and between the executive and legislative branches, are all regulated not only by legal constraints but by traditional expectations and customs.

That fact about government brings us to the specific example that is the United States at the time I write these words, and to the aberrant executive administration that has been in power for just short of three years and nine months. After all, one of the recurring disappointments during the Trump reign, if I can use such a mild descriptive term for how I feel about events since 2017, is that we could never rely on the executive branch to recognize, much less adhere to, the laws, the rules, the customs, the normal expectations, all of the major and minor consistencies that normally govern its operations.

Begin with nepotism. Ever since the appointment of Robert Kennedy as Attorney General in his brother’s administration six decades ago, presidents have avoided bringing family members into high positions in the executive branch. All of that restraint was overturned with a vengeance when President Trump entered the White House. His daughter Ivanka and her husband, Jared Kushner, became senior advisors to the president and have acted as representatives of the United States in negotiations with Japan. Kushner was additionally charged with a variety of official policy roles, including acting as a sort of ambassador-at-large directing relations with the Middle East. In that position he has led arrangements for arms sales to Saudi Arabia and a series of peace proposals regarding Israel. All this despite the fact that Kushner was initially considered ineligible for a top security clearance before that determination was overruled by the president.

The long-standing high status given to the daughter and son-in-law contrasts with the frequent turnover among other presidential advisors and cabinet members. More than 90 percent of the top positions have been replaced at least once and more than a third at least twice. This reflects the common observation that under Trump, the primary qualification for retention is loyalty to the president and his often-changing opinions, not the traditional measures of competence or effectiveness. The result is that the top echelon of our executive branch is filled with individuals who have repeatedly been caught on video defending the president’s talking points even after those arguments have been demonstrated to be false. The most egregious example of this is Trump’s misuse of the Department of Justice and the Attorney General and the Inspectors General of federal agencies, all of whom the president has treated as if they were his own personal defenders and representatives. The DOJ and its AG and the Inspector General positions are, in fact, charged with representing the United States and the Constitution, not the president or his actions. Yet many have been chastised or fired by the president for enforcing the law. The president’s actions are a clear violation of their customary and necessary roles.

The loyalty problem is related to the administration’s frequent violations of the Hatch Act, which prohibits federal employees from performing partisan political actions while on the job, while in a government office or vehicle, or while wearing government identification. Like the use of cabinet officers to spread partisan talking points, Hatch Act violations undermine public trust in government. During the Trump years, the Office of Special Counsel (OSC) has found Hatch Act violations by “at least thirteen” officials, including Mark Meadows, Dan Scavino, Marc Short, Ivanka Trump, and Kellyanne Conway. Some of these officials were reported for violating the Act several times. Other alleged violations include the use of the White House lawn for the final event of the Republican Convention this August and for a large campaign rally just last week (October 10th). In regard to these last allegations, however, the OSC oddly ruled that the White House lawn and residence aren’t federal buildings (never mind the appearance of those buildings in the background) and thus the events were not violations, and that White House aides who take leave from work are allowed to assist. The OSC has not ruled on the use, during the GOP Convention, of an official naturalization ceremony at the White House as a promotional video (making matters worse, the participants were not asked for their consent).

The final and most egregious act betraying our expectations of tradition in governance has not yet occurred, but has been threatened repeatedly. This is President Trump’s refusal to commit to an orderly and peaceful transfer of power to his successor if he should lose the election next month. As citizens we have often complained about elections and questioned the results, especially when the vote totals are close, especially when the winner of the popular vote was declared the loser and even more so when that result was certified by a questionable Supreme Court intervention. But we have recognized that it is vital, after any challenges and recounts are resolved, that the declared loser accept the result and move on. A losing incumbent who refuses to concede and step down is a characteristic of unstable autocratic nations, not of an established democracy. Lawrence Wilkerson, former chief of staff to Secretary of State Colin Powell, participated in scenarios run by the Transition Integrity Project, acting out possible results in a disputed election. He noted, “We found out the Constitution has so many holes in it, it’s pitiful… The only things that patched the holes over time were precedent, protocol, and decency.” Those are elements that have been notably lacking in the Trump administration.

We expect and deserve better. We deserve a different national executive administration, one that is not dedicated to ignoring tradition and protocol in the pursuit of power, one that can restore consistency with the past and public confidence in government.

“Trump had spent so many years undermining people who challenged him. Not only his opponents but those who worked for him and for the American public. And here was the problem: By undermining so many others not only has he shaken confidence in them but he had shaken confidence in himself. This was particularly apparent when the country most needed to feel the government knew what it was doing in an unprecedented health crisis.”

— Bob Woodward, Rage, p. 387 (Epilogue)


Mumblecore Reviewed

“How now, brown cow.” That phrase may not be as well-known as “the rain in Spain,” but it was a cliché of actor training decades ago. For all I know it may still be standard practice, though for reasons I’ll mention later I suspect it isn’t. But it was once commonly used as part of the set of drills that helped an actor produce clear, well-defined diction with distinctive vowel sounds. It may be that such repeated vocal drills were primarily important for stage actors, the performers who not only had to project their voices across an entire auditorium, but had to do it in distinct, clear form, to make certain that the customers in the cheap seats could understand the dialogue. However, once the talkies became standard, the practice probably carried over into films as well, with stage actors moving to the big screen and stage directors and acting coaches taking their skills to Hollywood.

Whether that was true or not, and it likely was, things seem to have changed. I’m not sure when it started, but I believe I first became aware of it in the early 2000s. In many movies since then I’ve noticed that sections of on-screen speech became slurred, the actors rushing into and through their lines in ways that made many words unintelligible. Most recently, I noticed this tendency again in a dramatic series called Yellowstone, a Kevin Costner production that seems to be a sequel-quality rewrite of the excellent earlier serial Longmire (the description “sequel-quality” is not a compliment, by the way, but you probably already knew that).

Longmire was about a crusty rural sheriff-protagonist (Robert Taylor in full Harrison Ford mode) who battles with corrupt land developers and casino managers and an Indian tribe. Yellowstone was about a crusty rural rancher-protagonist (Costner) who battles with corrupt land developers and casino managers and an Indian tribe and the local sheriff. But I’ll never find out to what degree Costner copied Longmire because I’ve decided not to watch anything else after struggling through the first episode. One reason for my choice is that much of the important dialogue in Yellowstone is delivered in tossed-out phrases in which the key plot points are obscured, as in “We hafta get the sirble nevunth now!” So I was often not quite sure what anybody was actually planning to do or who was going to do it. That can take a lot out of a plot line. When I’ve had this kind of comprehension problem in other television presentations, as, for example, in BBC productions that include characters with heavily inflected Manchester or Glasgow accents, I’ve been known to turn on the closed captioning feature. That usually helps.

If you haven’t tried closed captioning on foreign movies, I can recommend it as a potentially diverting and informative form of entertainment, especially for someone who likes to play with language. Obviously, I’m not talking about movies that have the original foreign-language dialogue supplemented by subtitles. These will probably seem straightforward to you unless you’re fairly fluent in the language in question, in which case you can enjoy the varied editorial choices made by the translator who decided which English phrases to use in the subtitles. No, the movies or serials I’m talking about are the ones that are produced in a non-English version, then dubbed into English, and then also provided with English closed captions. In many of these the dubbed English dialogue seems to be chosen to match the lip movements of the actors. The captions, on the other hand, seem to be chosen based on … well, I’m not exactly sure what criteria are used, but they obviously aren’t a faithful reproduction of the dubbed English. I’ve seen dramas in which the dubbing is in United States dialect and the captions use United Kingdom terminology. In other movies, it seems as if the caption writer might have been doing the job while downing excessive pints of Guinness in a loud and rowdy pub. In either case, you get two versions of any discussion among the actors, both of them in English but often varying widely in content or connotation. If you enjoy discordant language experiences, I recommend it. If not, better just leave the captions off. Let that be a warning. At least, in most cases, the voice actors who perform the dubbed words in these films tend to use fairly clear diction, although sometimes in a strongly variant (i.e., “un-American”) accent.

To get back to Yellowstone, however, I must admit that I tried to turn on closed captioning in my attempt to view it on NBC’s Peacock service. I failed. Either that option wasn’t available or I wasn’t able to access it; I must also admit that I didn’t try very hard. For one thing, the powers that control Peacock‘s free debut on Xfinity chose to make it an odd sort of streaming service, one in which you can fast-forward through the program itself but not through the commercials. That’s perhaps understandable from a revenue perspective, but it’s still one major strike against them. For another, from what I did manage to understand about the confused plot line and the writing in the first 90-minute episode of Yellowstone, I decided that it wouldn’t be worth the effort to find and activate the captions.

A couple of decades ago, when I first began to notice questionable enunciation in movies and television, I at first thought it was a sign of my own aging auditory system. After all, people my age are known for needing hearing aids even if they’ve taken good care of their hearing, and my own eardrums never fully recovered from my youthful live encounters with groups like Jefferson Airplane and Creedence Clearwater Revival. But through rather extensive research on Turner Classic Movies I’ve found that I have hardly any difficulty with James Cagney’s gangster asides or Eli Wallach’s pseudo-Mexican accents or Michael Caine’s real British slang, even when the soundtrack in question has itself suffered from age-related deterioration. I also have little difficulty with “on-the-street” interviews, unless the background traffic or the hurricane winds are too noisy. And I’m not one of those senior citizens who keeps the TV up so loud that it can be heard four houses away. The problem, then, seemed to lie with some modern actors and movies, not with my ears. Early on I decided that this “sirble nevunth” tendency had become so common in modern movies and television shows that it needed a new name as a descriptor. Given that it was akin to carelessly misarticulated speech, and yet central to the performances in question, I decided a good title would be “mumblecore.” I was quite happy with it. In a pre-Internet world that term might have stuck, that is, I might still be using it for my own purposes. But even back then, in the very beginnings of the twenty-first century, a cursory search on the web was available, and it indicated that my word already had an accepted definition within the acting world. Unfortunately, that prior definition had only limited applicability to my own desired usage:

Mumblecore: (film) An American independent film movement of the early twenty-first century, characterized by low-budget production, focus on personal relationships between twenty-somethings, improvised scripts, and non-professional actors. (from wiktionary.org)

In short, my own planned use of the term mumblecore seemed to have some similarity with the dialogue qualities resulting from “mumblecore the movement”, which may be why the name was coined by a sound editor, Eric Masunaga. But the other key characteristics of the above definition would make it difficult to apply the term to well-funded, scripted, major-studio productions employing aging (experienced) actors like Kevin Costner, even if they feature semi-intelligible dialogue. Of course, this background story in the definition of mumblecore might indicate where the “sirble nevunth” problem originated: young actors and directors and sound engineers graduating from under-financed indie movies to films with professional-level funding and equipment without changing their verbal approach to the script. Maybe they simply didn’t recognize the difference, or they didn’t consider proper enunciation important. Because of this I will from now on refer to this tendency as “mumblespread”. I had thought about using “mumblespeak”, in homage to George Orwell, but that would imply that the spread from the indies was intentional, which it doesn’t seem to have been.

I should note, in summary, that I am using Yellowstone only because it is the most recent example I’ve seen. It is far from being the only offender in this decades-long decline in diction. For that reason, I encourage all actors to repeat after me, slowly, clearly, “How now, brown cow.” Follow it with “The cow in Spain browses mainly in the plain.” Now do it again, with feeling this time!


Individual Myth

For decades now we as a society have been following the social preferences of what Lynn Parramore, in an October 26, 2019 article in Evonomics, called homo economicus, the supposedly rational human for whom the highest and best path for a human economy is determined by financial considerations. That means, for example, an emphasis on individual utility maximization and cost-benefit analysis. The alternative to this is homo communis, the compassionate and multifaceted human for whom the most important factors cannot always be reduced to monetary measures. Communis favors what is sometimes referred to as “the greater good.” In the corporate world, economicus has brought us to the extreme of shareholder primacy, in which the business of business is solely to maximize profits, stock market gains, and dividends. The communis alternative, which was actually much more common in the middle of the twentieth century, is to factor in the interests of all corporate stakeholders, including the workers, their families, the neighborhoods and cities surrounding the corporate facilities, and even the customers. In politics, economicus focuses on policies that are intended to stimulate economic growth, or even more narrowly to follow the short-term dictates of the leaders of private industries and to rationalize that tendency by invoking the ethos of growth. Monetary analyses tend to minimize such difficult-to-quantify externalities as the welfare of workers and communities, the effects of the pollutants created and released in business operations, and the subtleties of product quality and lifespan.

The communis alternative is to act in ways that improve the living conditions and welfare of all citizens by operating businesses in ways that account for the effects of all such unquantifiable externalities, methods that minimize the negative impacts and recognize the value of continuity and long-term sustainability. In the public sphere it means policies such as redistributing wealth, creating and enforcing an equitable legal system, providing a social safety net, and maintaining a broadly useful common infrastructure. Those efforts would be a recognition of the importance of community, and the greater good, as a contributor to the economic health of the country.

But just as we have elevated homo economicus over homo communis, there is another related dichotomy that has become unbalanced by an unfortunate preference for one of the alternatives. This is the choice between individualism and communitarianism. Individualism is, in the extreme, the myth that each of us solely determines our own fate. In this model, every negative outcome in our life is our own responsibility. If someone is working in a job that pays minimum wage and living in substandard housing it is their fault. If they lose their job and are bankrupted because of an unexpected illness, it is their fault. If they go into debt to complete a training course to upgrade their skills and still can’t find a job, they invested in the wrong career path, so it is again their fault. If they’re laid off when their employer moves its operations to Mexico … well, you get the idea. Being born into deep poverty is also no excuse. There are a host of Horatio Alger anecdotes out there just waiting to quash that excuse, ignoring the fact that statistics show that the United States has one of the lowest (read: near impossible) rates of upward mobility in the world.

The meme of individual responsibility was institutionalized in the middle of the twentieth century under the sociological terminology of “the culture of poverty,” a self-referential collection of rationalizations that essentially said that multi-generational poverty was primarily the result of the attitudes and life choices of the families themselves rather than of any characteristics of the surrounding society. That philosophy was rejected by academics decades ago, but the concepts live on in political doctrine and in the popular culture of individual blame, the notion that people who are unable to pull themselves out of poverty do not deserve anything better. It has been used to justify repeated reductions in the welfare state, the near-destruction of the Aid to Families with Dependent Children (AFDC) program, and repeated cuts in such support programs as food stamps, housing subsidies, and Medicaid.

Conversely, if an individual experiences positive outcomes, it is considered to be not only their own personal accomplishment, but a sign of social and moral superiority. Chief officers of successful corporations are praised as if they were solely responsible for that prosperity and are often consulted as experts on a wide variety of topics unrelated to their business model. Executive salaries that are hundreds of times larger than average worker pay are justified as fair compensation for contributions that nobody else could have provided and for supposed personal qualities, like a distinctive work ethic, that nobody else exhibits. External inputs such as market or environmental or societal timing, random luck, personal network connections, or inherited wealth are largely ignored. The circular logic of positive outcomes decrees that a successful individual must have earned their success because they’ve received it. The invisible hand that chooses winners and losers in the economic market is apparently also the arbiter of personal quality in the wider social world. Ameliorative policies such as progressive taxation are rejected as unfair punishment of exemplary individuals.

The uproar that greeted Hillary Clinton’s 1996 book “It Takes a Village” is also indicative of this meme. She was accused of undermining the primacy of the nuclear family as well as minimizing the standing of individual agency. Note that the “nuclear family” myth, the family succeeding on its own without extended-family or societal supports, is just another fallacy of the individualist philosophy. A similar outrage came after President Obama, in a 2012 speech, uttered the blasphemy, “If you’ve got a business—you didn’t build that.” His point, of course, was similar to Clinton’s. Both posited that we all live in a society, and that our lives and our successes, the things we build, depend on a complex supportive system: the physical infrastructure and legal structure and traditional customs and interpersonal relationships, not to mention the individual efforts of all the people who support and believe in the many parts of that system. It is no surprise that virtually all of the outrage against the Clinton and Obama statements came from conservatives, those who subscribe to the individualist philosophy summed up by Margaret Thatcher when she said, “There’s no such thing as society.”

Many commentators have argued that Thatcher was not rejecting government action. However, later in the very same speech she went on to reject a concept that she characterized as “I have a problem, it is the Government’s job to cope with it!” and to praise the idea that “the quality of our lives will depend upon how much each of us is prepared to take responsibility for ourselves and each of us prepared to turn round and help by our own efforts those who are unfortunate.” Her statements are the essence of the conservative ethos of individualism, incorporating the idea that the problems of “unfortunates” should be solved not by society acting as a whole, but by individual contributions, through charity.

The conservative rejection of communitarian action has expanded since the days of Thatcher and Reagan. President Trump’s administration has done its best to dismantle the federal structure of social responsibility, rolling back environmental limits and public land protections and business regulations, and reducing enforcement efforts for whatever remains. His administration has dithered ineffectively over any meaningful federal role in limiting the spread of the Covid-19 virus, and his fans, following his lead, have turned reasonable prophylactic actions into questions of individual choice dissociated from any obligations to the larger community. The simple act of wearing a mask to reduce the spread of the contagion has been turned into a political imposition, with anti-mask groups asserting a supposed individual right to refuse any government mandate.

The anti-mask movement, often coupled with pressure to “open up the economy quickly,” is the ultimate expression of individualistic preference in opposition to the communitarian goal of protecting the general public. It is only one of the ways in which the Covid-19 pandemic has exposed the failure of this self-centered philosophy, a list which also includes the obvious lack of interest in, and government funding for, preparations for a disaster that the nation should have anticipated. No, the primary goal prior to the pandemic was to reduce any government expenditures or efforts that would benefit the greater good. All of this is a logical extension of the conservative trends of elevating individualism and rejecting communitarian options, and of following the cult of homo economicus, yet another example to add to the misguided overall goals of negligible regulation and minimal taxation.


Defund What?

Proposals to “Defund the Police” are trending as I write this, spurred on by the deaths of George Floyd and Breonna Taylor, among others. But the defund movement has a much longer history in the United States, with serious arguments and discussions dating back at least to the 1930s. The noted sociologist W. E. B. Du Bois called for “abolition-democracy” in his 1935 book Black Reconstruction. This concept advocated the removal of institutions rooted in racism and repression, including white police forces. Another wave of defunding advocacy grew during the civil rights era, especially during the 1960s and 1970s when many police forces became involved in a variety of unconstitutional and often violent anti-protest actions. In the wake of the Church Committee’s investigation of the COINTELPRO scandal, significant restrictions were placed on the FBI, but calls to reform local police forces were largely ignored, despite the fact that many had been actively complicit in the repressive program.

The current defund movement encompasses a wide range of proposals, ranging from outright abolition to disband-and-replace to reduction of mission and assignments. Its popularity at this time is obviously strongly tied to the same societal characteristics found in the 1930s arguments: the existence of systemic racism and discrimination in society, its codification in law, and its acceptance and eager application by many of those assigned to enforce the law. The effects of these have been intensified by several waves of “war on crime” philosophies at all levels of government, and by related increases in police funding, since the 1960s. But modern defunding analysis has also been adjusted to recognize (relatively) newer trends that accelerated in the late 1970s, trends that affected not only public policing but public education as well.

The newer problem is an indirect effect of the widespread tax revolt of the mid-to-late twentieth century, a revolt that still has many powerful adherents in the United States. Most notable is Grover Norquist, well known for his “drown government in a bathtub” quote, but the first major shot in the revolt was fired by Howard Jarvis and Paul Gann when they promoted California’s Proposition 13 in 1978. Brought to a vote through the California ballot initiative process, Prop 13 passed with 63 percent of the vote. Its popularity reflected the high inflation of the preceding decade, which included a rapid increase in property values and the related taxes. Its primary provisions were that property taxes could not exceed one percent of the assessed value of any property and that those assessed values could not increase by more than two percent each year. It immediately reduced property tax revenue in California by about 60 percent statewide, a deficit that was gradually reduced, but by no means eliminated, by adding fees on other state and local services.
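To make the arithmetic of those two caps concrete, here is a minimal sketch in Python, using a hypothetical home and ignoring real-world complications such as reassessment upon sale and local add-on levies; the function and the dollar figures are purely illustrative, not drawn from any official calculator.

    # An illustration of the two Proposition 13 caps described above, for a
    # hypothetical property; ignores reassessment on sale and add-on levies.
    def max_prop13_tax(initial_assessed_value: float, years_held: int) -> float:
        """Return the maximum annual property tax after `years_held` years."""
        assessed = initial_assessed_value
        for _ in range(years_held):
            assessed *= 1.02  # assessed value may grow by at most 2% per year
        return 0.01 * assessed  # the tax itself is capped at 1% of assessed value

    # A $100,000 assessment yields a tax cap of about $1,000 at first, and only
    # about $1,486 after twenty years, however fast market prices rose meanwhile.
    print(round(max_prop13_tax(100_000, 0)))   # 1000
    print(round(max_prop13_tax(100_000, 20)))  # 1486

The point of the sketch is the compounding limit: because assessed values can grow by at most two percent per year, the taxable base falls further behind market values every year a property is held, which is how the revenue loss persisted long after 1978.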

The success of Proposition 13, and the widespread media coverage that followed, inspired a spate of similar efforts in at least 30 other states. It was also in part responsible for the results of the 1980 election, in which Ronald Reagan, the former California governor, became president of the United States. To the Republican Party the lesson seemed to be that “no new taxes” was a winning strategy, and they’ve adhered to it since, pledging to reduce the amount of money provided to, and spent by, governments at all levels. Wherever Republicans have been in control of the budget process, and wherever they have been joined by like-minded Democrats or independents, they have made efforts to do just that, justifying their efforts with frequent references to governmental waste and with arguments that the amounts provided for government operations are excessive.

The trouble is that the conservative rumors of waste and excess funding are largely bogus. Admittedly, before 1978 the K-12 schools and universities of California were recognized as excellent. The state also had a good reputation for the quality of its roads and other infrastructure. Was this too much, or is it what we should expect to get for our tax money? That is a philosophical divide that has been answered in favor of the latter in most European social democracies, but in the United States citizens have tended to demand necessary public services without wanting to pay for them. So in the past seven decades we’ve fairly consistently voted in favor of lower taxes. That has inevitably led to fewer government benefits and more circumscribed public services. The result, in one important area, was summarized on June 30, 2020 by Dr. Robert Redfield, the director of the U.S. Centers for Disease Control and Prevention, in testimony before Congress. Speaking about the national response to the Covid-19 pandemic, he noted that “for decades there’s been consistent underinvestment in public health.” The continuation of that trend meant that in the most recent decade the number of state and local public health professionals declined by more than 5,000 per year even as the number and severity of public health threats increased. The government response, in short, was so small it was drowned in the coronavirus.

But what has all of that meant for the public services included under the general categories of law enforcement and education? First, the downward pressure on tax collections and revenues has meant reduced funding for both, at all levels of government. Budgets for both facilities and salaries have not kept up with inflation. Personnel shortages are common, as most police departments and school districts have been unable to compete in the job market for all of the qualified individuals they need. Shortages have meant that many of those hired to fill empty positions have not been fully vetted, adequately trained, or fully competent. That has undoubtedly increased the frequency of unprofessional behavior in both professions, a result that could have been expected.

But that’s only the most direct effect of the revenue decline. With steady declines in the government services and facilities that address such varied problems as health care, mental health crises, housing and homelessness, families in distress, and poverty, a large proportion of the government response to those problems has been shifted to the police and the schools. Teachers are expected to deal with troubled, poor, and ill students with minimal or no assistance from school nurses or counselors. Police officers are sent to respond to 911 calls for minor public disturbances and welfare checks and domestic arguments even when there is no connection to a crime in progress. In effect, the constant pressure to reduce tax revenues has pushed social problems into the hands of people who are only minimally trained to deal with them and who would have had a full-time workload even without such demands. This not only delays or disrupts the provision of proper police and teaching functions, but also increases the likelihood that improper and potentially destructive methods will be applied. This is true even when our police officers and teachers have received supplemental training, because such preparation will inevitably be substandard or incomplete. Many of these public contacts could, and should, be handled by social workers, trained counselors, conflict-resolution specialists, and emergency medical technicians.

There are, certainly, other issues raised by Black Lives Matter and related protests, most notably the militarization of the police and the decline in community-oriented policing programs during the past two decades, and the recalcitrant role of police unions in resisting any attempts at reform or at punishing bad actors. Then, of course, there’s the question of specific arrest procedures such as choke holds. These are in some cases being dealt with as separate, limited reforms in a few cities and states, a response that is significantly inadequate. Thus far, any meaningful national solution has been blocked by the president and his followers in the U.S. Senate.

To return to the more general question of defunding the police, the only completed effort thus far has been the disbanding and restructuring of the force in Camden, New Jersey, and that occurred in 2013. It was apparently very effective and could serve as a positive example for other cities. But most of the newest proposals for defunding the police are less comprehensive, being calls to divert some of the funds now assigned to police departments and apply them to the types of social services that would be most appropriate for many of the problems that public agencies face. This would expand the number of specialized workers who could respond to specific social problems and thereby reduce the demands placed on both police departments and school districts, actually freeing up the police and teachers to do more of what they were hired to do. In fact, “defunding” of this type could make both our police forces and our schools more effective. Ideally, though, we would not merely shift current expenditures, but would also expand taxation and funding to make up for the losses that have occurred as a result of the stinginess of past political austerity philosophies. There is already growing pressure to do that, as shown by the passage of recent voter initiatives, in both Arizona and California for example, to raise taxes and improve services. We can only hope that this type of action will become a trend across more of our country. Let’s call for “refunding” public services in general.


Police Disruption 2

I prefaced my first post on this subject by noting that I have been a participant in many protests in my adult life. The causes have differed widely, including opposition to various wars, support for civil rights, rejection of pipelines, and dissent against the election of a sexist and racist president. Almost all of them were non-violent: marches and gatherings with signs and speeches in the hopes of attracting media attention. The first major event I was involved in, however, was not. More on that later. The reality of protests is simply that there is a range of possibilities, and that there are likely to be violent people among both the protesters and the police.

Current practice among protesters is to scatter monitors throughout a crowd to watch for violent individuals and to redirect their actions to non-destructive ones. The appearance of objects thrown at the police from the crowd shows that this is not always effective. Likewise, the police strategy is primarily to form lines to keep a protest crowd corralled and to avoid any excessive force. Recorded incidents of unnecessary use of rubber bullets, batons, and hand-held pepper spray by individual officers show that this, also, is not always maintained. In any event which involves two or more oppositional groups, especially when emotions are inflamed, behavior can be difficult to control.

Another common response to protests, violent or not, is the one that has been recommended recently by President Trump and his allies. This is the myth of “dominant force” as an effective deterrent. The president’s solution was made clear when he tweeted, “When the looting starts, the shooting starts” and told the National Guard leadership to “dominate” the streets. In the history of protests, strikes, and other demonstrations, massive force has been the response of choice for centuries. The United States has repeatedly broken up demonstrations by having the police (or the Pinkertons or the cavalry) fire guns directly into the crowd. Largely because of public backlash, the deadliest such methods have been gradually abandoned during the past fifty years. However, the president’s attitude, like the actions of many individual policemen, shows that in some circles the concept of severely punitive controls is still attractive, and still viewed as potentially effective.

One of the first protest actions I participated in was a set of demonstrations in Berkeley, California, in May of 1969, during the height of the opposition to the war in Vietnam. It was focused not on the war, although that was part of the background, but on a specific local issue. This was the People’s Park riots. The University of California had bulldozed a residential block a few hundred yards south of the campus and wanted to develop it into high-rise student housing. While the university delayed, however, the lot became a popular site for anti-war activists and others to hold meetings, and members of the community had spent significant effort cleaning it up and planting grass and flowers. A confrontation ensued when the university tried to fence off the lot, with protesters facing off against police and county sheriff’s deputies. Two very controversial politicians became involved: President Richard Nixon and California Governor Ronald Reagan. The police used batons and tear gas to disperse the crowds. The county deputies also used shotguns loaded with “00” buckshot. Governor Reagan defended the firing of such weapons even after a bystander had been killed and 128 citizens hospitalized. This obviously inflamed the situation.

On the street, however, the tactics of both sides became clear, and the results didn’t bode well for the “law and order” forces. The uniformed officers moved in groups defined by rows of men shoulder-to-shoulder, concentrating on occupying territory by slowly and forcefully pushing protesters backward. Their opposition, however, moved quickly and in a fluid manner. One portion stayed within view of the uniforms. Another set of individuals kept more distance, advancing only to throw objects. Still others broke up into small mobile groups that went off in many directions. The amount of firepower was overwhelmingly in favor of the authorities, but the contrasting tactics were reminiscent of the Revolutionary War, when the British redcoats employed the mass formations and rigid lines of European warfare against a guerrilla army that briefly attacked from covered positions and then rapidly melted away. The police managed to control a few central blocks, but the larger goal of disbanding or arresting the protesters was largely a lost cause. The carnage certainly didn’t encourage protesters to quit and go home. The demonstrations continued for a week, with daily disturbances and some property damage, and in the end the university capitulated, turning People’s Park into a permanent public space that still exists today.

It is important to review this distant event because it has similarities to many of the demonstrations and riots that have occurred since. Television coverage shows that the forces trying to control protesters are once again attempting to establish territorial boundaries, often after almost all the potential vandalism there has already occurred. While they do this, massing in relatively static or slow-moving formations, clumps of demonstrators flow rapidly away from them into side streets and small bands of vandals and looters rampage through nearby neighborhoods. The police weapons of choice are now tear gas and batons, as before, plus hand-held pepper spray and what are referred to as “rubber bullets”. The two newer tools are technically non-lethal, an improvement over guns and buckshot, but they can still cause significant damage. As might be expected, though, the police tactics are no more effective in achieving their goals than they were in 1969.

The lesson of People’s Park is also directly relevant to the myth of dominant force. Could a more aggressive and deadly use of force put a stop to protests and destruction? The answer is no. The fact is that if our newly militaristic police forces were augmented by the National Guard or, as President Trump suggested, by the (illegal) use of full-time military troops, and even if they were allowed to use lethal force, their efforts would be largely ineffective, and even counterproductive. There would be casualties, of course, but it would do little to stop any vandalism or looting, and it would actually strengthen the resolve of the protesters. Trump’s folly might even be remembered for decades, in the way that the 1970 killings at Kent State have lived on. Any police chief who follows his suggestions might reflect on the fate of Frank Rizzo, “legendary” mayor and police commissioner of Philadelphia, now remembered primarily for the patterns of police brutality, intimidation, and disregard for constitutional rights that he encouraged. During the week of June 8th this year, a South Philadelphia mural depicting Rizzo was painted over and his statue on the steps of the Municipal Services Building was damaged, set on fire, and finally removed.

Dominant force sounds logical. It appeals to “law and order” types who seem to believe that protesters do not deserve to be listened to or treated in humane ways. It definitely has a strong appeal to white supremacists who believe that minority people are threats, do not deserve to be full members of our citizenry, and do not merit the ordinary protections of our Constitution. It clearly is attractive to President Trump, who also displays an affinity for despots and totalitarian societies and an evident animus toward anyone who opposes him. It is a simplistic philosophy that equates protests with misbehavior deserving immediate punishment, and that assumes protesters will respond to strong punishment by cancelling their plans. It is not only incorrect, but distressingly similar to the philosophy that has informed most of the police killings that inspired these recent protests in the first place. The proper response to protests against extra-judicial police punishment is less punitive behavior, not more.


Police Disruption 1

I have been a participant in many protests in my adult life. The causes have differed widely, including opposition to various wars, support for civil rights, rejection of pipelines, and dissent against the election of a sexist and racist president. Almost all of them were non-violent. Almost all of them involved lengthy marches through city streets along approved routes, flanked by police escorts, inconveniencing drivers but causing no physical damage. I have been happy to donate my time and my physical presence to these efforts.

As I write this there are similar activities filling streets in all of the major cities in the United States and a large number of minor ones, including my current home town of Albuquerque. There have also been supportive rallies in many cities around the world. They are inspired by the May 25th death of George Floyd at the hands of Minneapolis police, but have increasingly commemorated other deaths, including those of Breonna Taylor and Ahmaud Arbery and Philando Castile. The demands of the protesters are primarily in favor of significant police reform that would hold officers accountable and reduce the use of violence. The movement and phrase “Black Lives Matter” has been rejuvenated and strengthened and has begun to achieve greater acceptance among people who had previously rejected it. As for the protests, the images appearing in the media are familiar. The one significant difference is that the marches and gatherings have been continuous, daily, for more than two weeks, and do not seem to be coming to an end despite the fact that significant concessions have already been offered by various government agencies. On June 12th the complaints were given new life by the police shooting of Rayshard Brooks in Atlanta. The number of locations and the persistence of the events are rivaled only by the protests against the Vietnam War in the late 1960s and early 1970s.

The first three or four nights after the death of George Floyd the protests were marred by vandalism and looting. As is common in such situations, the looters were generally separate from the vast majority of the protesters, often becoming active only after the peaceful demonstrations had subsided. Despite this, much of the early media analysis lumped them all together and highlighted the violence. In reviews and discussions, commentators attempted to provide rationalizations for both the protests and the destruction, attributing all such responses to injustice and poverty. It is true that those factors are operational, but as the most recent weeks of large daily non-violent gatherings have shown, it is always a mistake to conflate all of the varied disruptive activities, violent and non-violent, even if they occur in the same general neighborhoods on the same days. The reality is that in any sequence of this type at least five very different groups of people, with very different motives, can be involved:

1. Protesters who care deeply about sending a message and who wish to do so peacefully. This is usually the vast majority of those who come out into the streets during the daytime and early evening. In coverage of the Minneapolis “riots” it was clear that a strong majority of participants not only wanted a peaceful protest but were actively working the crowd to maintain calm, sometimes in spite of police provocation.

2. Protesters who care deeply about sending a message and who are angered and frustrated enough to respond through vandalism, by causing damage to those they see as oppressors, i.e. the police and their cars and buildings.

3. Protest looters drawn to the action by the unrest and the related opportunity to send a message by taking material goods from those they believe are allied with the oppressors.

4. Opportunistic looters drawn to an area by the unrest, but interested primarily in enriching themselves. This includes groups that are taking advantage of the diversion of police patrols, which allows them to loot stores that may be distant from the scene of the protest. In the most recent case these looters often included organized gangs that used social media to select stores, track police movements, and coordinate attacks.

5. Thrill-seekers, anarchists, and others who are drawn to the unrest primarily by the opportunity to destroy whatever they can. These are often motivated by generalized social anger or angst unrelated to the specific cause of the moment. In recent actions, these included white supremacists and right-wing militia members looking to inflame racial animus.

As a day’s protest stretches on and the hour gets late, the membership of group 1 thins out and the other groups, those who want to loot and/or cause damage, become dominant. That is why any destruction tends to get worse late at night. The motives of groups 4 and 5 (the most destructive ones) are often nonpolitical or non-specific. They include the types of people who also show up at celebrations of holidays or sports victories, demonstrating that the damage they cause is largely irrelevant to the purpose of the instigating event. All they need is for the police to be distracted. Such violence certainly cannot be allowed to challenge the validity of the protest message; in general, observers should not let complaints about property damage distract them from the purpose of a protest.

There is one other reason that outside commenters, professional or otherwise, should be careful about what they say about protests. When people ask, “But why do they destroy their own community?”, or even when others respond with justifications such as, “They act out of decades of frustration”, both are, unfortunately, employing the same lumping stereotypes that motivate bigotry. The fact is, there is no homogeneous “they” in a protest. There are only multiple “theys”. Any protest contains a multitude of individuals and motivations. To imply otherwise, especially to argue that the actions of a few reflect badly on the aims of the many, is only slightly less abhorrent than to assume that black or Hispanic criminals represent the attitudes of all black people or all Hispanics. The leaders or instigators of a non-violent protest know well that there is a chance that their efforts will attract people who do not share their philosophy. That is an unfortunate and unavoidable reality. They have no control over that, and any analysis that implies that they should better control any destructive individuals, or even worse that they should avoid mounting any activity, is both unrealistic and irresponsible.

The best way to avoid vandalism and looting on such a large scale is to create policies that will keep the police from misusing the power they have. The absolute best way is to build supportive and cooperative relationships between the police and the larger community, the one that police are supposed to “serve and protect”.

Posted in Politics, Sociocultural | Comments Off on Police Disruption 1

Monumental Virus

In Oakland, California, where I spent the first eighteen years of my life, I was aware of several major landmarks. All of them are within a mile of downtown and city hall, and all but one I visited frequently in my youth. My list does not include the city hall itself, a Beaux-Arts building that, at 320 feet, was the first high-rise government building in the United States when it was built in 1914. Its fourteen floors are in two tiers, like a massive wedding cake, a comparison that is completed by the largely decorative clock tower on the top. The fact is, although I passed it frequently, city hall was not a meaningful presence in my early life.

A few blocks away was the central facility of the Oakland Tribune, which had some of its offices in an oversized clock tower that once made it one of the tallest buildings in the city (no longer even close). The addition of the tower in 1923 raised it to twenty-two stories and 306 feet, slightly shorter than the existing city hall. My mother worked on one of the lower floors for more than twenty years. My mental image of that building serves as a reminder that at one time city newspapers were both popular and very profitable (no longer true). I distinctly remember the bottom twenty feet of the building, which housed the printing press and an oversized tunnel. At about five on most mornings, for my first two years of college, I would drive a 15-foot box truck into that tunnel to pick up bundles of newspapers to be delivered to many of the hundreds of small storefronts, scattered throughout the East Bay area, where paperboys would pick up the news they would deliver to customers on their routes. That entire delivery network has been gone for more than two decades, collapsing well before the Tribune devolved into a weekly in 2010. The Tribune building, however, still stands.

Another significant building was the Oakland Civic Auditorium, a traditional rectangular arena with seven massive arches over the entrances on the northeast side, facing Lake Merritt. It is now the Kaiser Convention Center. Inside, the arena floor is more than large enough for a full-size basketball court and is surrounded by enough seats for almost 5,500 people, an interior space uncluttered by vertical supporting beams. This means, of course, that it is about half the size of any modern multi-purpose arena, but it also means that people in the cheap seats at the highest reaches of the auditorium could see the floor action clearly without the interposition of a jumbotron screen. When it was built in 1914 it was one of the largest venues of its type. I was there for performances by the Ringling Brothers/Barnum and Bailey Circus and the Harlem Globetrotters. My high school graduation was held there. The Grateful Dead performed there 57 times. Like the Tribune building, it is still in place and still an impressive sight, although it has remained mostly unused for the past fifteen years.

Next among Oakland landmarks is Lake Merritt itself, originally a salt-water lagoon at the center of a thousand-acre wetland. In the second half of the nineteenth century, after it became a noxious settling pond for the city’s sewage, it was reduced in size by border bulwarks and a dam, both to clean it up and to reduce tidal flooding. The current lake has a rough triangular shape about ten city blocks (3,000 feet) on each side. The California State legislature voted to make it a wildlife sanctuary in 1870, the first official such designation in North America. On the shores of the lake I remember Children’s Fairyland, a walking park with statues of characters from traditional tales, and the Camron-Stanford House, a large Italianate Victorian built in 1868 by then-mayor Dr. Samuel Merritt. It served as the Oakland Museum until 1967. Every year on July 4th the lake hosted daytime boat races and a fireworks display.

A final significant landmark in my mind is the main public library building, an imposing five-story stone edifice, a massive rectangular box with thick white columns framing 30-foot-tall window openings on every side above the first two floors. First opened in 1951, it still fully occupies an oversized city block about a thousand feet west of the Civic Auditorium. While I was in high school I spent most of my library time in the small Laurel branch of the Oakland library system, a storefront on MacArthur Boulevard, but the offerings at the main branch were significantly larger, and it was a mere four miles from home, a distance I often covered by bicycle.

There were other significant locations and buildings in Oakland, and there are more now, but these were the ones most meaningful to me at the time. Two of them have resurfaced in my memory through a visual connection to the current worldwide pandemic. When Covid-19 first came into public consciousness as a serious threat this March, it brought to mind a couple of visits I had made, as a high school student, to the Oakland History rooms of the main library building. On the walls at that time were pictures taken during the peak of the misnamed 1918 “Spanish” influenza, one of them showing the interior of the Oakland Civic Auditorium.

I was already aware of the size of the auditorium and the broad wood surface at its center. In the 1918 pictures that wide floor is there, but it is covered by a large number of plain metal-frame single beds, set six to eight feet apart, with a few nurses and supply tables scattered among them. In the grayscale picture the white bedding and nurses’ uniforms contrast sharply with the dark flooring, enhancing the sense that the beds and caregivers were an aberration. The thought of the familiar auditorium, a place of sports and entertainment, converted into a makeshift hospital seemed the perfect symbol of the global devastation caused by that earlier pandemic. None of the photographs or written accounts of the 1918 flu that I have seen since have affected me as much as that image, and none have stuck with me as long.

That is why, when I heard the first news reports about Covid-19, the image of the Civic Auditorium came back to me. I have since seen similar images used in retrospective photo collections of the 1918 flu, including a Wikipedia entry that shows the auditorium after tall partitions were added to divide up the space. There are many similarities between the two pandemics; in 1918, like today, businesses were closed, large gatherings were banned, and face masks were required in public places. And there were deniers and resisters, including prominent politicians, who refused masks and promoted and participated in large public events. The influenza came in three waves: a small one beginning in July of 1918, a rebound lasting from September to December that was five times as deadly, and another, less lethal bump the following spring. In the United States approximately 28 percent of the 105 million population became infected, and the death toll reached at least half a million. The crisis only ended because the virus mutated into less virulent forms, some of which are still around. We may yet see a similar rebound with Covid-19 in 2020; the initial attack has not subsided.
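Those figures invite a moment of arithmetic. Here is a rough back-of-the-envelope computation in Python (a sketch only; historical estimates of all three inputs vary widely):

    # Rough arithmetic from the 1918 figures cited above (approximate inputs).
    population = 105_000_000   # approximate U.S. population in 1918
    attack_rate = 0.28         # share of the population infected
    deaths = 500_000           # conservative estimate of the death toll

    infected = population * attack_rate      # ~29.4 million infections
    case_fatality = deaths / infected        # ~1.7% of those infected died
    overall_mortality = deaths / population  # ~0.5% of all Americans died

    print(f"infected: {infected:,.0f}")
    print(f"case fatality: {case_fatality:.1%}")
    print(f"population mortality: {overall_mortality:.1%}")

Nearly thirty million infections, with roughly one death for every sixty of them. Those are the numbers behind the image of a civic auditorium filled with hospital beds.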

I continue to hope that the 2020 pandemic, already serious and still growing, will not duplicate the level of human destruction that occurred a century earlier. However, as with the 1918 influenza, there is no vaccine or effective treatment available as I write this, and a large proportion of our population is not taking the pandemic seriously. In fact, one of the phrases used by people who dismiss our efforts to reduce the spread of Covid-19 is that the disease is “no worse than the flu.” Looking back at our experience in 1918, that line is hardly reassuring.

Posted in Sociocultural | Comments Off on Monumental Virus

Economic Shift?

You may be forgiven if you aren’t aware of it, but in August of 2019 there was a major shift in emphasis in modern economic philosophy. For one thing, the event wasn’t much reported in any of the mainstream news media. For another, it doesn’t seem to have caused a pronounced, observable shift in the activities of capitalist societies or of Fortune 500 corporations. That is probably because it wouldn’t exactly have been a welcome development among the most influential groups of investors (oligarchs). For now, let it be said simply that the Business Roundtable (BRT) produced a document that supported a seismic change in the operating principles of corporations.

In the previous life of the BRT, which was formed in 1972, the guiding philosophy was in line with neoliberal dogma, which stated that the primary business of business was the maximization of shareholder returns (i.e., profits, share values, dividends). That led almost all corporations to focus almost exclusively on short-term profit-maximizing strategies, anything that would lift the stock price at the end of the current quarter. If you think this is business as usual, you are right, but you would be ignoring the broader philosophy of major corporations in the immediate post-WWII decades, when they generally expressed support for the cities in which they operated, the families of the people they employed, and their customers as part of their mission statements.

In the past five decades the doctrine of shareholder primacy has led to a variety of common but questionable choices. Manufacturers have laid off workers, reduced wages and benefits, sold real assets, and reduced the quality of input materials to cut costs. They have outsourced labor to distant countries to reduce wages or to avoid paying to mitigate the environmental impacts of their operations. Some of these actions were taken to reduce the effects of temporary financial setbacks, especially during the 2008 recession, but most occurred during periods when corporate profits and management compensation were already at record levels. After the 2017 tax cuts, most large corporations used almost all of the billions they received in tax savings to buy back their own stock and, in so doing, raise the value of their shares.
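The arithmetic behind that last maneuver deserves a closer look. Here is a minimal sketch with invented numbers (no real company’s figures): a buyback retires shares, so unchanged earnings are spread over fewer of them, and at a constant price-to-earnings multiple the stock price rises without any improvement in the underlying business.

    # Illustrative buyback arithmetic (hypothetical numbers only).
    earnings = 1_000_000_000   # annual earnings, unchanged by the buyback
    shares = 500_000_000       # shares outstanding before the buyback
    pe_ratio = 20              # assume the market holds this multiple constant

    price_before = (earnings / shares) * pe_ratio         # $40.00 per share

    buyback_spend = 2_000_000_000                         # tax savings spent on repurchases
    shares_after = shares - buyback_spend / price_before  # 450 million shares remain

    price_after = (earnings / shares_after) * pe_ratio    # ~$44.44 per share

    print(f"price before: ${price_before:.2f}")
    print(f"price after:  ${price_after:.2f}")

An eleven-percent gain in share price, with not a single new product, plant, or employee behind it.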

The stockholder-centric fetish has tightened its hold on corporations, strengthened by narcissistic investors who have sued companies in which the management has “unnecessarily” chosen to raise wages or benefits or to abide by environmental or health regulations, or indeed where managers made any other foolish decisions that might, god forbid, reduce short-term profits. It was also encouraged by the practice of paying top managers with stock options, giving leadership yet another motive to focus only on equity prices. But the most egregious example of investor-exclusive attitudes has come as I write these words. As the fatal effects of the COVID pandemic continue to spread across the world, investors, CEOs, hedge-fund managers, and other self-obsessed oligarchs are strongly encouraging President Trump to end the shelter-in-place strategy that has effectively reduced national mortality. For them, restarting the economy and reversing the recent stock market shortfall overrides all of the other negative impacts of the disease, including the likely threat of a deadly resurgence. This is stockholder primacy at its most virulent: the idea that the role of the United States government is to maximize financial growth above all else, even public health.

We can’t ignore the lengthy wave of market deregulation that was inspired by investor-centered economic theory, starting with the Nixon administration but continuing through the 1999 Gramm-Leach-Bliley Act and the following explosion of derivative finance. Deregulation brought us such entirely avoidable disasters as the Savings and Loan collapse of the late 1980s, the dot-com bubble of the late 1990s, and the great recession of 2008, all of which would have been virtually impossible if we had retained and enforced the New Deal laws and rules put in place during the 1930s and if we had retained the multiple stakeholder ethos of the 1950s.

One of the guiding principles of the New Deal, and of the economic revival that followed World War II, was that financial prosperity should not be the sole measure of progress. There were other goals that a moral nation and its corporations should consider as part of “the pursuit of happiness”. Those goals recognized that there were a number of stakeholders who were important to the national well-being and who were not defined solely by financial characteristics. We almost entirely forgot about them and that multiplicity of characteristics during the last half of the twentieth century.

Now the BRT, a collection of almost 200 CEOs from major corporations, has produced a list that reminds us of what we have lost. Their document states that they make a “fundamental commitment” to all of their stakeholders, specifically mentioning the following:

  • Delivering value to our customers.
  • Investing in our employees.
  • Dealing fairly and equitably with our suppliers.
  • Supporting the communities in which we work.
  • Generating long-term value for shareholders.

It is notable not only that shareholders are the last item on this list of important stakeholders, but that they chose to define shareholder value as “long-term”, not just limited to the next quarter’s numbers. A company that recognizes and affirms its responsibility to all of these stakeholders and to the long run cannot be the type that moves its operations overseas for a two-percent bump in profits.

However, there remains the question of whether or how this document will be put into practice. This is what Andrew Winston wrote in the August 19, 2019 Harvard Business Review:

The BRT announcement certainly sounds like a big deal. Shareholder primacy has been the core operating principle of public companies for about 50 years, since economist Milton Friedman famously declared “the social responsibility of business is to increase its profits.” These ideas have been promoted for decades by a very well-funded and wildly successful effort—with the Koch brothers at the core—to make the free-market, shareholder-primacy, neoliberal philosophy the dominant global economic model.

Of course, this Winston commentary was given the skeptical headline, “Is the Business Roundtable Statement Just Empty Rhetoric?” Many of the companies represented on the BRT have not been exactly devoted to long-term planning, stakeholder interests, or broader societal concerns. Notably, they include fossil-fuel processors who have denied climate change and fought efforts to mitigate it, pharmaceutical producers who have over-promoted highly addictive drugs in pursuit of massive profits, and financial firms that continue to follow many of the risky and often fraudulent investment strategies that resulted in the 2008 great recession.

It remains to be seen how effective the Business Roundtable statement will be in modifying the single-minded attitudes and policies of its members, the companies they represent, and other global corporations, but the mere recognition of other stakeholders is a start.

Posted in Economy | 1 Comment

COVID Commerce

The economy is global. That is nothing new. European colonial leaders rarely acknowledged it in public statements, but for most of the past two centuries their national prosperity was dependent on countries thousands of miles away, on the cheap natural resources provided by their colonial possessions. Much of the European resistance to the anti-colonial independence movements of the 20th century was motivated by the desire to maintain control of the supply chains needed to keep the factories and the tea shops and the construction sites humming. The form of that dependency has changed, and it is now sustained more by diplomacy than by military and bureaucratic control, but it still involves relationships that span the world.

In recent years events have demonstrated that globalization can have significant drawbacks. In the last half of the first decade of the 21st century the financial sector in the United States, inflated by high-risk investments and a housing bubble, collapsed. The collapse was inevitable and predictable, and it should have been isolated, or at least compartmentalized, within the housing market and housing-related finance. But it wasn’t, and, in retrospect, couldn’t have been. The financial market in the United States is massive and thoroughly integrated into every segment of the national economy. For example, it was said that after the 1980s General Motors morphed from a car company into a financial corporation that happened to sell cars. The ubiquity of finance made it impossible to limit the recession damage to one sector. And it wasn’t even limited to one country. The financial markets of most other nations were also inextricably tied to the investments and securities and strategies of the leading United States firms. Major banks in Europe, Asia, even South America were compromised; if they didn’t collapse entirely, they sharply curtailed their lending, and in the process they brought their local economies down with them. The 2008 recession was a worldwide phenomenon.

That recession was created and defined by distortions in capital markets; cross-border movement of capital is a large part of globalization. But another significant global trend has strengthened in the past five decades. The outsourcing of processes and the resulting transportation of materials—raw materials, parts, and finished products—has increased dramatically. Cell phones are assembled in Korea using components built in five or six other countries. Chickens are sent to China to be butchered and cleaned before being returned to the United States. Exotic woods grown in Brazil are shipped to India to be cut into boards, which are sent to Japan to construct houses and furniture. Using such arrangements, manufacturers gain the flexibility to shop around for venues that provide the lowest overall costs, reducing expenditures for labor, land, and regulatory obligations. In comparison, the added costs of long-distance transportation for components and finished products can be relatively insignificant, as the sketch below illustrates.
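A minimal cost comparison with invented per-unit numbers (real sourcing decisions weigh far more factors than these four) makes the point:

    # Hypothetical per-unit production costs, domestic vs. outsourced (illustrative only).
    domestic = {"labor": 12.00, "land_overhead": 3.00, "compliance": 2.00, "shipping": 0.50}
    outsourced = {"labor": 4.00, "land_overhead": 1.00, "compliance": 0.50, "shipping": 2.50}

    print(f"domestic total:   ${sum(domestic.values()):.2f}")    # $17.50
    print(f"outsourced total: ${sum(outsourced.values()):.2f}")  # $8.00

Shipping costs quintuple, yet the total cost per unit falls by more than half, which is why the transportation penalty rarely deters outsourcing.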

Global outsourcing has increasingly been combined with the concept of “just-in-time” delivery (JIT). The JIT method has been promoted in television commercials showing a small retail outlet with empty shelves until the day before it opened, shelves that were almost magically filled by timely deliveries just before the doors opened to customers. JIT has been widely implemented in commerce, allowing manufacturers and retailers to reduce inventories and storage costs, along with waste from excess supplies, by scheduling shipments to anticipate short-term demand. For the most part the process is invisible to consumers, except perhaps for those clothing shoppers who find themselves staring at a rack of dresses sized extra-small and extra-large with nothing in between.

The routine problems arising from the combination of outsourcing and just-in-time are easily predictable. JIT requires that any location needing specific components or products accurately predict near-future demand. To respond to that demand, each stage in the supply chain must also know how much time it will take to fulfill an order after it is placed, including both the production time for multiple components and the outsourcing-related delays involved in transportation from two or three different sources. This is complex enough, and it is fortunate that we have computers to track such routine factors in international commerce; the core calculation is sketched below.
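At its heart, that JIT bookkeeping reduces to a classic reorder-point calculation. Here is a minimal sketch with hypothetical values (a real supply-chain system layers demand forecasting and safety margins on top of this): an order must be placed once remaining stock will just cover expected demand over the supplier’s total lead time.

    # Minimal just-in-time reorder-point sketch (hypothetical values).
    def reorder_point(daily_demand: float, lead_time_days: float,
                      safety_stock: float = 0.0) -> float:
        """Stock level at which a new order must be placed.

        lead_time_days must cover the whole chain: production time for
        each component plus transportation from every outsourced source.
        """
        return daily_demand * lead_time_days + safety_stock

    # A store sells ~40 units per day; the overseas supplier needs 12 days
    # (5 of production plus 7 of shipping); keep 3 days of safety stock.
    rp = reorder_point(daily_demand=40, lead_time_days=12, safety_stock=40 * 3)
    print(f"reorder when stock falls to {rp:.0f} units")  # 600 units

The fragility is visible in the formula itself: if the lead time suddenly balloons because a hub city shuts down, every reorder point computed from the old value becomes wrong at once.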

The situation today, however, is not routine. The threat of the novel coronavirus, the one that causes COVID-19, as it has escaped China and rapidly, often inexplicably, appeared in countries on every continent, has demonstrated how vulnerable our global market systems are. Wuhan, the massive Chinese city where the virus apparently started, is one of the country’s largest industrial and transport hubs, known as its “motor city”, and a major tech center as well. All of that, of course, was shut down as China attempted to control the outbreak. At the beginning of 2020 the import traffic from China was already depressed by the unnecessary and poorly planned tariff wars of 2019 and by the usual week-long shutdown for Lunar New Year, and it has not recovered to the levels seen prior to 2019. Imports from other countries have also been affected, and some factories in the United States have shut down operations for lack of necessary components. Demand for some commodities, such as oil, has declined, sharply pushing down prices and industry revenues.

Global trade is not the only casualty. As in China, firms in many countries have temporarily shut down operations to avoid bringing their employees into contact with each other in central facilities. Italy and Spain have imposed severe nation-wide restrictions on travel. In many countries, including the United States, large events—concerts and conventions and competitions—have been cancelled. Consumers have begun suspending unnecessary shopping for the same reasons, often after flurries of panic buying of essentials. The resulting pauses in economic activity will severely depress wages and profits. The travel industry has suffered what have been termed “catastrophic losses” as tourists have cancelled or delayed vacation plans. The Dow Jones average mirrored the level of business concern about the future effects of the epidemic by declining almost 14 percent in the final ten days of February and then swinging wildly in the weeks that followed, shifting into a bear market for the first time since the 2008 recession and almost wiping out all of the growth achieved in the past three years. That is probably an overreaction, as is common on stock markets these days, but it does underline the seriousness of the problem.

As we watch the commercial markets and our own national government attempt to deal effectively with the COVID-19 pandemic, several things become clear. We have created interconnecting systems that are highly complex and dependent on standardizing assumptions. They provide many benefits (products, cost savings, profits) that would not be available otherwise, and we manage to keep them running relatively smoothly under normal conditions. But the 2019 tariff wars and the mismanagement of COVID-19 have demonstrated that the system requires knowledgeable national leadership dedicated to global cooperation. The current administration in Washington, D.C. has failed on both counts, making the inevitable difficulties much worse than they would have been.

Posted in Economy, Politics | Comments Off on COVID Commerce