The Year of Me Too

The Person of the Year this year, at least according to Time magazine, was not a person, but a group—a large group, in fact, all of them female. Under the title “The Silence Breakers”, the cover featured five of the women who have exposed powerful men for their tendency to harass or belittle or threaten or rape people who were dependent on their decisions. The many other women who have also come forward are represented anonymously by the elbow of a sixth woman, cropped off the right edge of the cover. In 2017, their actions became an odd sort of social event, a movement composed largely of individual decisions motivated by the individual decisions of others. One woman comes forward and inspires another to speak, then another and another. In most cases, their efforts have become effective only through the cumulative force of multiple reinforcing stories.

The wave of accusations caused an explosion in a pre-existing online campaign. The sharing of stories of sexual harassment using the phrase “Me Too” began in 2006, created by community organizer Tarana Burke on the MySpace network as an effort to create “empowerment through empathy”. It inspired the creation of the Twitter meme #MeToo, and in 2017 that collection of personal complaints and inspiration quickly grew to millions of messages worldwide. Actress Alyssa Milano, providing her own vignette about producer Harvey Weinstein, wrote “If all the women who have been sexually harassed or assaulted wrote ‘Me Too’ as a status, we might give people a sense of the magnitude of the problem.” That magnitude should have been obvious to anyone willing to see how ubiquitous sexual harassment is, but fewer people can now remain in denial.

There are several recent well-publicized events that presaged, and in some ways promoted or influenced, the current movement. An early one may have been the publication of Sheryl Sandberg’s self-help book Lean In. Among some commentators, at least, that book has been credited with inspiring a certain amount of self-assertiveness, a willingness of individual women to stand up for themselves and not to remain silent in the face of mistreatment. That is probably a stretch, one that certainly overstates the influence of Sandberg’s book.

A more likely initial inspiration was the 2016 election, the one that saw an admitted sexual predator and government amateur win the presidential contest against an accomplished, experienced, knowledgeable woman. The election brought us the massive demonstration on the day after President Trump’s inauguration, a protest that included proponents of many progressive issues, but which was primarily a “Women’s March” against sexual harassment. The march developed into a movement organization, Together We Rise. The election and march inspired a record number of women to run for office around the country, which in turn led to a surprising number of women winning their races, especially in the November Virginia elections.

Throughout 2017, in short, women were prompted to speak out in extraordinary ways, and, in another extraordinary development, the media began noticing and providing broad coverage. It didn’t hurt that even before the election, Fox News fired Gretchen Carlson, which led to her lawsuit, complaints from other employees, and the downfall of Roger Ailes and a number of other Fox employees. Post-election and protests, it is not surprising that a few complaints against Harvey Weinstein became a flood that spawned a series of New York Times articles and harassment charges involving more than 90 women, and that led to a tsunami that swamped other luminaries, including directors Brett Ratner and James Toback; Bill O’Reilly; actors Dustin Hoffman, Ben Affleck, and Kevin Spacey; media icons Matt Lauer and Mark Halperin; comedian Louis C.K.; politicians Roy Moore, John Conyers, and Al Franken. Oh, and also, oddly enough, Garrison Keillor. Of all the men accused this year, only Donald Trump has retained his job (the GOP tried to keep Roy Moore, too, but failed).

The perpetrators and their supporters generally deny the accusations. When they can’t entirely imply that their accusers are liars or that the relationships were consensual, they often come up with excuses. The most interesting of these is the one put forward in defense of Weinstein, who claimed that he “came of age in the 60’s and 70’s, when all the rules about behavior and workplaces were different.” This is a theme used by many older predators, including Bill Cosby. Apparently these men and their friends believe that it is impossible to learn and adjust as we age—the old dog and new tricks defense. Unfortunately, their memories of the cultural norms of the past are also flawed. Yes, men with economic or political power have long had exaggerated expectations of sexual favors, and they may have taken advantage of their relative privilege more often in the past. They undoubtedly faced fewer adverse consequences.

But the sixties were no golden age of male dominance. Yes, there were many jokes and cartoons about bosses chasing their secretaries around the desk, and stories about the casting couch, but those were not signs of approval. And in case you don’t believe that, I have a movie for you to watch. It is The Apartment, an excellent 1960 film and winner of five Academy Awards, which explores the misuse of office power relationships to take advantage of women. The dominant male predators are definitely not portrayed as positive figures. So the decade did not begin with tolerance of sexual harassment. And yes, one of the counter-culture efforts of the 1960’s involved sexual liberation, but that was predicated on consensual relationships.

What a concept. Consensual relationships. Why don’t these men understand that? The fact is, there are no excuses for sexual harassment. Like rape, it is generally more about power and dominance than about sexual attraction. It is simply another way for men who have control over the working lives of women to manifest their position and humiliate subordinates. And it is attempted by men at all management levels. As Emily Martin of the National Women’s Law Center noted, “We don’t read lots of news stories about fast food workers experiencing harassment and retail workers experiencing harassment and hotel maids experiencing harassment, but that’s not because it’s not happening.” The media concentrates on the wealthy and well-known, but that, to mangle a phrase, is just the visible tip of the dysfunction.

Tarana Burke, in an interview in the Nation (December 4/11, 2017), notes that she is concerned that the MeToo movement, and the public view of the scandals, will be distorted through a focus on popular individuals. This seems to be happening in the media coverage, and that may mean that the attention will decline after all of the high-profile scandals are resolved or forgotten. It has happened before. This time the perpetrators are more numerous and more influential, but the same pattern may still occur. That’s why Burke wants the focus now to be on power and privilege, not individuals. We must focus on the ubiquitous nature of the problem and the importance of believing and encouraging victims, and punishing the guilty. The focus must be on all of the MeToo messages and messengers, now and in the future. And the response must be clear: the predators must lose.


Faux Foot Follies

This is a pre-Thanksgiving blog entry, and what could be more appropriate for that than a discussion of … football? After all, Thanksgiving is the traditional football holiday, isn’t it? The game has always been an object of devotion in the United States, rating as high on the loyalty scale as religion, which, unsurprisingly enough, also competes for our attention every Sunday. However, there is a problem. The U.S. version of football is under siege right now, from several directions. So, what’s causing the problems now?

Let’s start with the current controversy about two “nationals”: the National Football League and the National Anthem. Sports and politics, mixed into one controversy and pulled kicking and screaming over into patriotism, all of them volatile topics! We’ve all seen the images of many NFL players “taking the knee” during the anthem, in protest. They actually started this in protest of injustice and racial bias, not against the anthem or the flag, but we can’t simply let protesters decide what their protests are about, can we? Much better if we decide for them, and claim that they are disrespecting the song and the flag and our military and the very principles this nation was founded to promote—even as they protest in support of the principles this nation was founded to promote.

One of the many irrational responses to the players’ protest—and the responses have been dominated by the irrational—is the movement by fans to boycott football. Fans have even scorned their wannabe fantasy connections to the action on the field, burning their favorite player jerseys and team banners, tossing their season tickets, walking out before the game starts. That’s obviously because nothing causes the NFL team owners more pain than when someone who has already paid for the entire season decides not to stay and watch the game. So far, it seems that the biggest losers from the controversy are the many small shops that deal in sports memorabilia, which have noticed a significant decline in business during the otherwise-busy NFL season. Of course, they’ll undoubtedly recoup many of those losses later, when the same fans come back in to replace the items they burned or tossed.

All of this happens to create mixed feelings in me. I support the player protests, without question. I believe that Colin Kaepernick (and Eric Reid) had every right to protest racial injustice in any way they felt they could, at any time they felt would be most effective, and that the players who continue this protest are also well within their rights and well within the best traditions of the United States and of nonviolent dissent. Note: polls show that a majority of fans agree. No, my problem is that I would like to do what I can to counter the boycott, but I have effectively ignored virtually the entire football season every year for decades, and it wouldn’t make any sense for me to start watching any games now. I guess I could go out and buy a Kaepernick jersey, if I could even find one anywhere, but it would just end up gathering dust in the garage somewhere.

So let’s move on to the important question of whether U.S. football should be radically altered (or even ended) because of the ubiquitous problem with concussions and Chronic Traumatic Encephalopathy (CTE). U.S. football is caught between people concerned for the players, on one hand, and, on the other, fans (like President Trump) who say that crackdowns on unnecessary roughness are “making the NFL really boring” and “ruining the game” (never mind that football has always been pretty boring—more on that later). The fact is, even devoted fans should care enough to want the players to survive their brief careers relatively intact, especially mentally. If players are spared, either by ending the game forever or by switching to professional flag football, I wouldn’t mind. Personally, I’m not invested in the NFL product or in the ubiquitous college and high school leagues which serve as the minor league apprenticeship system for the NFL.

Let me digress a moment. I am a fan of real football, the “futbol” that is popular throughout the world and that features ninety minutes of virtually non-stop strategic action that pauses briefly only when one team scores a goal and celebrates, which isn’t often, or when someone gets injured (or, more often, dramatically collapses with a fake injury). Neither of these ever causes a very long delay. If you’re watching real football, which people in the U.S. call soccer, there are no opportunities to go out to the kitchen to get another beer or more chips, and there are precious few breaks for advertisements. The clock and the game play almost never stop. I might also point out, by way of justifying use of the name football here, rather than soccer, that real football is in fact the only game in which the ball is moved up and down the field almost entirely by real feet—there’s actually a serious penalty called if a player (the goalkeeper excepted) touches the ball with hands or arms. That’s my bias.

But let’s return to what some people call “gridiron football,” the game played only in the United States and Canada. It could also be called by the name my friends and I used when I was young, which was “tackle football”. We could simply get rid of the misnomer “football” and shorten that to “Tackle”. That makes sense because the action called a tackle is probably the most significant element of game play. There are actions called tackles in real football and in rugby, also, but they don’t have the effect of stopping play, as they do in tackle football. But I want to go further. For the rest of this blog post, I will refer to tackle football as “SAR”, short for “Stop-Action Rugby.” I do this because game play in tackle football, and even the shape of the ball itself, is closest to the game of rugby, but with much more frequent stops and restarts. What’s with the shape of that ball, anyway? An egg modified to increase the unpredictable erratic action when it bounces? Admittedly, the ball does work better in a game where it is so often moved down the field by throwing it, an action, I might note, not involving feet. But I digress again …

The play action on the field of SAR (Tackle) was adequately summarized back in the 1980’s by Mikhail Gorbachev, when President Reagan invited him to attend his first game. In response to a vague question about what he thought of it, he stated, factually, “All get up, all fall down.” SAR is a game in which the average play, the action part, lasts less than ten seconds. Then the teams get ready for the next play; they have at least 25 seconds to do that, and often run right up against that limit. That means repeated short spans of downtime in which players celebrate their last run or pass or tackle, then mill around a bit, then form a huddle (or not), and finally line up for the next play. There is generally so much setup time between plays that on television the broadcasters follow almost every play with an instant replay of what we just saw.

SAR is a game which consists of four 15-minute quarters, supposedly an hour of actual playing time, but the total game generally, and unpredictably, runs more than three hours. Okay, there’s also a half-hour half-time break, but that still leaves about an hour and a half of extra off-the-clock time. Every SAR game has up to 12 time-outs, two two-minute warnings, lengthier pauses for injuries and penalties, and longer setup periods after every score (i.e., touchdown or field goal) and every time the ball changes hands (e.g., a punt). It’s as if the game were designed specifically to allow as much downtime as possible for paid TV commercials. I’m sure the sponsors and the owners love it. In fact, the primary characteristic that makes SAR so boring is the very same thing that makes it so profitable on television.

That profit is vital. The importance of TV revenue is demonstrated most effectively by the lack of loyalty of team owners to “their” fans. Take this from someone who grew up in Oakland, California, which was not once, but actually twice, the “home city” of the team soon to be known as the Las Vegas Raiders. If a team owner doesn’t think the TV viewership in “their” city is providing enough profit, they pack up and leave for a greener locale (that is, one with higher audience ratings and, if possible, a newer stadium with more luxury skyboxes). Yet another digression: That’s an excellent reason to support the Green Bay Packers, the league’s only community-owned team. Other NFL cities should follow that lead. Instead of going into debt to build a new stadium for an ungrateful private owner, they should buy the team outright. Use eminent domain, if necessary!

Finally, player specialization is also extreme in SAR. Backs are relatively flexible (except the quarterback, who is essentially a risk-averse ball-delivery mechanism), but linemen are intentionally loaded down with extra gut weight, all the better to keep them from being pushed around. They are the Sumo wrestlers of U.S. sport. Each SAR team also consists of several separate teams, one for offense, one for defense, and yet another set of players who sit on the sidelines for virtually the entire game, waiting for the very few minutes in each game devoted to kicking the ball and returning kicked balls (the “special” teams). It makes sense, then, that there is extra setup time every time the ball changes hands, because everyone on the field must be replaced with an entirely different team. That, of course, provides a chance to play four or five more advertisements. Is it any wonder that half of the hype for the big annual Super Bowl now refers to the commercials?

SAR, however, will survive all of the current threats and downturns. This is in part because it is a traditional ritual, especially on Thanksgiving, but mainly because it provides a massive amount of revenue, and the television stations and owners use some of that money to promote the game and build audiences. They won’t let the golden goose die. So there will continue to be more than enough eyes in the living rooms across the country to keep the money flowing, boycott or not.


Luther at 500

Five hundred years ago, on the last day of October, an event occurred that would reverberate across Europe and change the structures of both religion and government. On that day a 34-year-old theology professor and preacher at the University of Wittenberg, Germany, posted a list on the door of a local church. It contained 95 theses questioning the established dogma and activities of the Catholic Church. He was most directly inspired by a disagreement over the use of indulgences, a common practice in which Catholic officials solicited payments from faithful church members for mitigation of sins and other favors, such as efforts to reduce the amount of time that dead family members would spend in purgatory. For years there had been growing dissatisfaction with the church, the result of the perceived greed of the clergy and rumors involving lavish spending and sexual misconduct in the Vatican.

Martin Luther was a complex individual, who “cuts a perplexing historical figure. In various depictions, he is by turns fiery or meek, bombastic or shy, licentious or pious, revolutionary or reactionary. Cunning or naively bewildered by what his high-minded remonstrance unleashed on the world.” (Elizabeth Bruenig, The Nation, July 31, 2017). He instigated and effectively promoted the Protestant revolution in religion, but in the 1524 Peasants’ War, the conflict between feudal subjects and the German princes, he clearly sided with the royal establishment. He was a university professor who defended his positions with powerful logic, but who also called for blind faith, saying that “Reason is a whore, the greatest enemy that faith has … treating with contempt all that emanates from God.” And the forces he inspired were often irrational, if not anti-rational; the most common early victims of the populist violence inspired by Luther’s anti-authoritarian arguments were not leaders in the Catholic Church, although some of those were also attacked, but intellectuals and philosophers.

Earlier challenges to the monopolistic power of the Roman Catholic leadership included the Great Schism of 1054, in which the eastern Orthodox Church separated from Rome. A temporary second schism occurred due to a disputed papal election in 1378, leading to the existence of three popes competing for power until 1417. That division was almost entirely the result of political intrigues, not of differences in doctrine. Near the end of this period some early protestant churches were formed, led by John Wycliffe (Oxford University) and Jan Hus (Charles University of Prague), both of whom objected to many Catholic practices. Some reforms in those churches included liturgy in the language of the congregants (Czech or English), married priests, the elimination of indulgences and purgatory, and justification (divine pardon) based on faith alone. The Catholic Church requires both faith and “good works” for justification, and “good works” all too often were defined as donations to the Church. The second schism and the early protestant efforts were ended by the Council of Constance (1414-1417), which elected a single Pope, Martin V, and condemned both Wycliffe and Hus. Hus was executed by burning, while Wycliffe, who was already dead, was exhumed and his remains burned. The message was thus sent, and further reform efforts were suppressed.

Yet there were continuing perceptions that all was not right in the Catholic Church. After the Council of Constance there were two more official ecumenical councils, attempts to study and address complaints. Information about the church’s failings was increasingly publicized through the availability of printing presses, invented in the 1440s and by 1500 found in most major cities in Europe. Martin Luther was personally repulsed by many of the Church’s practices, most directly during a pilgrimage to Rome in 1511, where he discovered immorality and corruption unexpected in the seat of the Catholic Church.

The Fifth Council of the Lateran, begun in 1512 and concluded in March of 1517, produced a lengthy list of decrees for minor reforms, but ignored the major problems recognized by many throughout Europe. Seven months later Martin Luther posted his list of 95 theses. At first, Pope Leo X ignored his arguments, but before long Luther was excommunicated by the Church and declared a heretic. He was protected by Prince Frederick the Wise of Saxony. Luther repeatedly rejected the authority of the Pope and called for a new non-hierarchical system based on scripture. In the following years he would add many published works regarding the Pope, devotion to the Virgin Mary and the saints, celibacy, the sale of indulgences, excommunication, justification and good works, and other Church doctrines and practices. A growing number of followers printed and distributed pamphlets containing his writings throughout Europe. He produced a German translation of the New Testament which became a best-seller.

The new Protestant and Lutheran movement quickly subdivided under various leaders, including Huldrych Zwingli of Switzerland and John Calvin of France, not to mention the Anglican variant, created by England’s Henry VIII for quite different reasons. In nations where the Catholic Church retained influence over the government it struck back forcibly, most notably in Spain and the Spanish colonies in the New World, where printing presses were tightly controlled and the violent religious inquisition was extended through most of the 16th century. In France, persecution of the Protestant Huguenots led to a series of religious wars. In the American English colonies, many protestant sects persecuted or imprisoned members of other sects before such actions were made illegal and eventually unconstitutional. And within the Catholic world itself there was increased pressure for real reforms that reduced corruption. The Church’s response was delayed by a series of conflicts with the Holy Roman Emperor Charles V, whose troops sacked Rome in 1527, and with France’s King Francis I, who called for a general council including representatives of the major protestant churches. But eventually the Church convened one of its most important ecumenical councils, the Council of Trent (1545-1563), which responded to this pressure by standardizing many of the church doctrines and the biblical canon. Indulgences were retained, but abuses of the practice were forbidden.

The eventual outcome of all of this is that we now have a wide variety of religious options available to us, and there is no longer a single monopolistic entity controlling the religious doctrines and governmental structures that direct our lives. That alone is a major improvement and advancement of personal and societal freedom that can be traced back directly to Martin Luther.

The influence of Luther’s 1517 action was not limited to religion. The spreading revolt against the Catholic Church became a symbol of the possibility of rising up against political authority as well. The German Peasants’ War of 1524, which Luther eventually opposed, was inspired by his example. Luther’s writings also argued that the earthly political sphere and the kingdom of heaven should remain separate. In other words, the long-standing identification of government with God and the Church, and the use of justifying doctrines such as the divine right of kings, could now be questioned. This is a major step in the development of social contract theory, the idea that any government derives its authority not from divine selection and approval, but from the consent of the governed.

It is possible that Thomas Hobbes, John Locke, and the other philosophers of the Enlightenment would have come up with Social Contract Theory on their own—there were certainly antecedents in Greek and Stoic philosophy and the Magna Carta—but it is also likely that the example, and broad popularity, of Luther’s anti-authoritarian philosophies, and more than a century of relative religious freedom, helped both inspire the development of their ideas and make it more possible that they could be implemented. Certainly it would be easier to reform or overthrow an existing government when it isn’t supported by God and the ubiquitous power of a monolithic church.

As the Reformation progressed, contract and property law was also released from adherence to religious law. This meant that contracts would increasingly be adjudicated by secular courts according to established law and a strict reading of the written terms, not by religious jurists basing their decisions on moral concerns and vague interpretations requiring a just and equitable purpose. This opened up many new possibilities for commerce and the development of market capitalism—not entirely a positive legacy, but mostly good.

Perhaps the ultimate extension of this particular aspect of Martin Luther’s reform is in the first amendment to the United States Constitution, which calls for strict separation of government from church control, and the church from government control. After two centuries of legal development and precedent, the results of this are clear. Government and legal systems, for the most part, do not and cannot promote religious doctrine. The United States contains a vast variety of religious sects and congregations in large part because government neither supports nor persecutes any specific religious group. Members of several of the larger religious organizations have been trying to obtain the government imprimatur or financial support that would help them create the kind of powerful monopoly that the Catholics once held in Europe, but religious freedom is too important to allow such involvement.

Over the past 500 years we have greatly expanded personal and political freedoms, and we continue to make progress. Anyone familiar with history and Medieval society will recognize that these have been considerable improvements, and much of that progress began with a list posted to the door of a small Wittenberg church and the persistent, prolific obstinance of Martin Luther.


Fear and Hoping

In last month’s post (on this same topic) I noted that many of the emotions of envy and fear and hatred displayed by modern conservatives are common human reactions to situations which seem unfair. I noted that I had experienced similar feelings, but also that one difference between people on the left (including me) and people on the right is that leftists tend to direct such responses at people who have undeserved privileges, and modern conservatives direct them at people who have undeserved difficulties, for example, minorities and people in poverty. In either case, we tend to assume that our chosen “other” group, defined variously by wealth or race or religion or national origin, constitutes a threat to our lives and/or livelihoods, and this inspires both resentment and fear.

There is another significant factor, however. This is most obvious among alt-right adherents, although it is also true of many other less-radical conservatives. These people don’t allow their resentment and fear to dissipate. They seem to revel in it, to constantly maintain it and even reinforce it with dedicated propaganda sources that give them more and more reasons to be afraid. They create, search out, and share internet memes that demonize minorities and exaggerate conspiracy theories and repeat slippery-slope arguments about affirmative action and welfare and foreign aid and liberals and unions and socialists/communists and sharia law. They limit their media coverage to Fox News and conservative talk shows that repeat the threats that “others”, whether political or minority or foreign, are trying to take over the United States or subvert “our” culture. In short, they act as if they do not want to let go of their fear. Fear, and thus hatred, becomes an integral part of their daily personal reality.

I have often said that the base emotion behind bigotry—hatred of “the other”—is fear. That concept has been questioned both by bigots and by those who oppose bigotry. Many bigots clearly want to believe that they are motivated by logic, not by emotions, especially not by emotions as irrational (and perhaps unmasculine) as fear. Never mind—they are fearful, and that inspires their hate. Many people who are against bigotry don’t want to admit that their opponents are motivated by something as common and human as fear. Never mind—they are wrong, too. One problem is that hate is evil, but fear inspires pity, and neither bigots nor anti-bigots want to feel pity toward those who express prejudice. However, the most common methods used by white supremacists to fire up their members have always involved stories about minorities taking their jobs, about violence perpetrated by black and Hispanic men, about other religions displacing their beliefs and legal systems. In recent demonstrations one of the chants used by the alt-right was “You will not replace us”, which frequently morphed into “Jews will not replace us.” This is a sign of paranoia and fear-mongering as obvious as candidate Donald Trump’s declaration that “They’re bringing crime, they’re rapists.” Conservatives in general have been animatedly repeating the story of a San Francisco woman who was killed by “an illegal immigrant” as well as anti-Muslim anecdotes about rape and oppression of women or “jihadi” terror or “sharia law.” And it’s not just Mexicans and Muslims; there is also continuing paranoia about such manufactured threats as the gay agenda and the war on Christmas. These are all blatant invocations of fears that underlie hatred.

There are people on the left who are caught in the same vicious cycle. There are groups of anarchists whose major focus seems to be the destruction of symbols and activities of the capitalistic/oligarchic system. There are proponents of the left, including some members of the Green Party and various fringe groups, who reject all the major “corporate” media and political parties and who continually reinforce their positions through selective media sources and acquaintances who remain on topic, reminding them of the imminent threats represented by oligarchic and corporate/monopolistic control. And this is nothing new. The leftist factions of the late 19th and early 20th centuries, anarchists and Bolsheviks and others, had their own meeting venues and conferences and produced and distributed their own newspapers. They demonized robber barons and fat cats in diatribes similar to those used today against the Koch brothers. The right, of course, had their own more generalized support in the yellow journalism of the day. These are historical tendencies that have been repeated many times, ones that cannot be eliminated entirely—there will always be fearful and frustrated people. But they can be mitigated.

That is the good news. Fear and resentment, and the anti-social consequences of such negative emotions, can be significantly reduced by specific sociopolitical strategies. Paranoia and poor self-image will always cause some people to fear specific differences or to long for an ideology that places them on a rung somewhat higher than some “other” group. There will always be opportunists who will appeal to the fears and stereotypes of such people in order to gain influence and wealth. Such individuals will always seek out people who agree with them, people who are willing to join them in anti-social activities. But we can relieve many of the societal pressures that can initiate or accentuate such feelings and fears. That means we can reduce the factors that assist anti-social groups in recruiting new members.

One solution is to use both legal means and public pressure to discourage anti-social activities. Added penalties against hate-motivated violence have helped somewhat, as have generalized changes in societal attitudes. Lynchings and similar revenge murders once were public celebrations that defied prosecution; they are now widely condemned and the perpetrators are quickly arrested. Segregation, stereotyping, and job discrimination are still a serious problem, but we have made progress in reducing their effects and the attitudes that maintain them.

Another more indirect strategy, in many ways a more effective one, is to enact measures that reduce political and economic inequality, in the process reducing the insecurities and frustrations that can exacerbate fear and resentment. Progressive taxation, a viable economic safety net incorporating an adequate minimum wage, single-payer health insurance, adequately supported public education, and even campaign finance reform (to include public funding) are all examples of policies that would tend to stabilize society.

History demonstrates the moderating influence provided by such progressive social efforts. The first Gilded Age (the half-century after the Civil War) saw extremes of economic fluctuation and public violence perpetrated by both left and right. Protests and strikes were often associated with, and opposed by, deadly violence. Race riots, ones that often destroyed entire neighborhoods, were common, as were public lynchings. Incidents of these extreme activities continued up to and throughout the Great Depression. In contrast, our current Gilded Age has wealth and income inequality matching the previous one, but there are now social support systems designed to moderate the impacts of economic downturns, labor regulations that reduce the risks involved in earning a living, and laws that reduce the effects of discrimination. Virtually none of these existed before 1910 (and yes, unfortunately, these are among the very systems and regulations that the Republican Party has been continually attempting to remove). In so many ways our current interpersonal interactions are more predictable and less violent and less prejudicial than they have been in the past, simply because we have reduced the economic insecurity and existential fear experienced by the average person.

We must learn from history. We must resist the backsliding that is occurring under the current federal administration and the divisive media that supports it. We must return to government policies that reduce inequalities in income, wealth, and political influence. We must work to protect, and eventually to expand again, the legal, economic, and psychosocial progress that the United States has made over the past century.


Envy, Fear, Hate

Envy and self-pity, and the negative feelings that result from them, are normal human emotions. Envy and self-pity are also often connected, not only in the sense that they can arise from the same life experiences, but because they tend to reinforce each other. We have an encounter with someone who has more than we do, and simultaneously we feel sorry for ourselves and envious of the other person, and these impressions intensify the feeling that life is not fair and that the other person is undeserving of their good fortune. So in our minds we become victims of circumstances beyond our control, circumstances that unreasonably provide others with opportunities and lifestyles that we are denied.

I have recognized such feelings in my own life. At one low point the only job I could find was picking zucchini in the coastal valleys east of Watsonville, California. The work required constant bending to find ripe squash under the large and thorn-studded leaves of mature plants, breaking off appropriate-sized zucchini and putting them in a ten-gallon bucket, and carrying the bucket, when full, across several rows of plants to be emptied into a tractor-drawn trailer. This process was repeated again and again in 90-plus degree direct sunshine. The only break in this eight-hour routine was a half-hour lunch. It was, and remains, the most difficult job I have ever held, it paid minimum wage, and at the time I was living in a cheap shared apartment and driving a 15-year-old car. I was eight years out of high school and five out of the military, with no real prospects for better employment. In short, I had adequate reason to indulge in self-pity.

At the end of my last day, on my way home, I was feeling feverish and worn out and stopped to rest by sitting on the curb near a major road through Watsonville. As I waited for enough energy to continue walking to my car, my attention was drawn to a noisy group in the traffic driving by. It turned out to be five teenagers laughing, joking, riding in a new-model convertible. Add envy to my self-pity, bolstered by the assumption (likely, but never proven) that those teens had not yet begun real life, and that they would do so with advantages I had never enjoyed, and the event produced a significant wave of resentment that I have not forgotten. Not forgotten, despite the fact that I long ago moved into a relatively comfortable middle-class income and lifestyle.

What does remind me of this incident and the related emotions are the continuing appeals and distortions employed by anti-immigrant forces in the United States. Their statements and arguments have obviously been crafted to inspire envy and self-pity and rage against an unfair system and to direct the resulting emotional response against specific (and relatively powerless) populations. In the carefully constructed worldview created by anti-immigrant rhetoric, immigrants are stealing our jobs, taking advantage of our schools and other public services, getting welfare money and food stamps, and receiving free medical care that citizens cannot obtain, and they obtain all of this despite the fact that they are undeserving, being slackers and dangerous criminals and culturally inferior. Allegations similar to these have been repeated endlessly in the conservative media and accepted uncritically by millions of adherents in spite of the fact that there is little truth in any of them.

Related facts: Undocumented immigrants mostly take jobs that our citizens do not want (unlike me, they tend to remain in jobs such as agricultural fieldwork and food processing and sporadic construction their entire lives). Their children do go to our schools, but they pay taxes that help support them. They rarely ask for government benefits or welfare, even earned unemployment payments; few of them have the required documentation. As for health care, they are only eligible for emergency-room care, which is the exact same option that any low-income uninsured citizen can receive. Finally, they are almost entirely hard-working people who, statistics show, are involved in fewer crimes per capita than citizens commit.

Many of the allegations made about immigrants are familiar to anyone who has listened to the rhetoric of another continuing lengthy conservative campaign, the one against programs that make up the “safety net” for citizens in poverty. In this right-wing formulation, poor people are culturally inferior, slackers or “takers”, members of a “culture of poverty” who are themselves responsible for their fate and who don’t deserve to receive benefits paid for by responsible taxpayers. They supposedly take advantage of programs such as welfare, food stamps, and emergency medical care, often through fraud. Again, many conservatives tend to accept these ideas uncritically despite the fact that virtually none of them are supported by statistics.

Whether the target is undocumented immigrants or citizens in poverty, the conservative propaganda is similar, and it is designed to maximize the emotional responses animated by the combination of envy and self-pity. It also takes advantage of what is called “fundamental attribution error”, the tendency for people to excuse their own difficulties and behavior as the result of circumstances, while seeing the difficulties and behavior of others as representative of their character. The message is, in short: “There are people out there who are inferior to you and who deserve their fate, but who are getting benefits free, benefits that you have to struggle to earn and pay for.” It is a powerful message. It incites strong resentment in people who don’t have the job or income they want or the one they feel they deserve, and who have also repeatedly been told (by the same propagandists) that they are paying more in taxes than necessary. It also builds on resentment and fear of people who are different, those whose culture or race or national origin or religion sets them apart. And it stokes the fear that people have no control over the system that is so unfair to them, the system referenced by the phrase “take back our country”. Conservatives who have accepted this message are repeatedly engulfed by the same emotional responses that I felt when I saw those teenagers in the convertible. This explains why so many of them have verbally (and sometimes violently) attacked Mexicans or Muslims (or similar “others”) in public settings. Their envy and fear leads to another four-letter word: hate.

The similarity between my experience and theirs is that, in both cases, our emotional responses are the result of broad assumptions about the “others” and the unfairness of life. I resented the teens in the car without any real evidence regarding the actual conditions of their lives, and conservatives similarly resent safety net programs and immigrants based on false generalities. I recognize and understand the sources of their enmity.

The differences between us, however, are much more significant. My resentment was, and at times continues to be, directed toward undeserved wealth and privilege. As a result, I favor reductions in economic and power inequalities. Modern conservative resentment, on the other hand, is directed at people who have less than they do and who are suffering from poverty and powerlessness, yet who are demonized by a coordinated conservative campaign. This resentment separates its adherents, in fact, from people who work hard like they do and who experience the same economic insecurities that they do. It separates non-affluent conservatives from people who they otherwise would recognize as fellow victims of an unfair system controlled by the wealthy and the corporate elite.

This is by design. The modern conservative message is planned and distributed by the oligarchy. It is part of a strategy to keep the lower 90 percent divided and politically weak. The goal is to increase profits and investment dividends by reducing taxes, regulations, and corporate costs such as wages and benefits. They support their goals by minimizing the cohesion and influence of any groups in which ordinary humans defend and promote their interests, which include responsive government entities, non-government advocacy organizations, and unions. They also work to restrict voting. To achieve all these goals, the modern conservative message is broadcast and reinforced by a large collection of well-funded media outlets and think tanks and politicians. Their strategy has been carefully designed to support the continuation of extreme wealth and social inequality and to strengthen oligarchic political control. It has been remarkably effective.


Health Trumped

Chrissie couldn’t believe what she had just been told. She was at the extended-care home and had just finished visiting her mother, something she tried to do for at least two hours at least two times a week. Not that her mother ever really expressed appreciation for the visits—she had Alzheimer’s, and mostly didn’t even recognize Chrissie when she was there, and probably forgot she had been there a few minutes after she left. But after the visit this time Chrissie was met in the hallway by the day manager, Melinda. That worried her. She really hadn’t had much reason to talk to Melinda in the past four years; they had naturally had a few meetings when her mom first moved in, but hardly anything since then. She and her mom were quite happy with the service the home provided, and any visit could only mean that something was going to change. Anyway, Melinda asked her to go into her office, and they sat down at her desk. Melinda apologized in advance. “I’m sorry, I really am, but there’s really nothing we can do to … avoid this. I’m afraid that you’ll have to find some other accommodations for your mother.”

“But she’s been here for years,” Chrissie replied, “and she really likes it here, well, as much as anyone can like being in her situation. She seems happy here. Is there a problem? If there is a problem, maybe we can do something to take care of it.”

Melinda leaned back and looked directly at Chrissie for a minute or so. It seemed like a very long minute. Then she asked a question. “Could you manage to pay us two-thirds of what it costs to keep your mother here?”

Chrissie’s mouth opened, but she said nothing. Her eyes dropped to the desk in front of her. Her mind quickly reviewed the past year, a year in which her family had barely managed to keep paying their essential monthly bills, to keep the mortgage paid and buy groceries, to stay out of debt. They were already worried about the upcoming semiannual payment on the car insurance and had switched to hand washing when their dishwasher died. When she finally spoke, it was only to stumble, “I, uh, no, I don’t think so.”

“That’s what I thought. I’m afraid that’s what it would take to keep her here.”

“But I thought all of her rent and food and those things would be taken care of. That’s what you told me when we brought her here in the first place. You said we would only have to pay for extras, like clothing and toothpaste and soap, stuff like that. That’s the way it’s been so far.”

“Yes, and part of her fees will still be paid for by her Social Security check. But for the past two years, after her savings ran out, the rest of it, about two-thirds of it, was provided through Medicaid. Unfortunately, and this becomes a bit, uh, convoluted, but this year congress passed a new Medicaid law, turning control of the program over to state governments and providing a fixed block grant to the state instead of guaranteed payments. The block grants were not adequate to begin with, and they haven’t changed even though the needs around the state have gone up, and the state had to use some of the money to administer the new system, so they created new criteria to determine eligibility. And under the new criteria, your mother does not qualify for payments. So that part of her monthly fees won’t be covered after next month.”

“Is she the only one?”

“Oh, no, not at all.” Melinda shook her head, and her hand reached out and started to lift a file folder that was lying on her desk, then dropped it again. “We’re losing a number of others, consolidating for a while, moving everyone off the third floor, the top floor, and laying off some of our staff. We’re not happy about it, but it’s what we have to do. We’re lucky, actually. A couple of homes are shutting down completely, and we’ll be getting a couple of their fully paid residents.”

Chrissie wasn’t sure what to say, so she simply continued staring at the desktop, at the folder that Melinda had touched, while her brain shifted to rooms that could be made available for her mother and the family members that might be called upon to provide care for her.

“Oh,” Melinda started up again, “you should also know that the hospital, Saint Mary’s, will be shutting down before the end of this year.”

That surprised Chrissie. “But that’s the only hospital in town. Without it, we’ll have to travel sixty miles, in an emergency.”

Melinda simply nodded.

“I don’t understand,” Chrissie leaned forward and raised both of her hands slightly, turning her palms inward. “We all voted for President Trump and he promised that there wouldn’t be any cuts to Medicare or Medicaid, and that they would make health care available to more people and improve the insurance, cut costs and all that, get rid of Obamacare, do all of that. We voted for a change, I guess, but not this. We wanted to improve things.”

“Yes, just about everyone in town voted for Trump. And for our Republicans in congress.” Melinda tapped the desk with her fingers. “Have your health insurance premiums gone up?”

“Yeah. It went up quite a bit. We’re still working out how we’ll adjust our budget.”

“Under the Obamacare rules the insurance premiums for people in their fifties and sixties couldn’t be more than three times the premiums for younger people, even though their insurance risks are much higher. Also, they required younger people to buy insurance, which helped pay for it all. That was what was keeping your premiums lower, well, relatively. With the new replacement law, your premiums are allowed to be as much as five times as much, and healthy young people don’t have to join, so many of them don’t, and insurance companies have to make up that money somewhere. So until you qualify for Medicare your insurance costs are likely to continue getting a lot worse.”

Chrissie shook her head. “If we survive that long. I don’t know what we’re going to do.” She suddenly straightened up in her chair and looked at her watch. “Well, if I don’t get back to work now they’ll can me and we’ll have even less money to worry about.” She stood and started to turn away, then turned back. “We’ll figure something out and get back to you about moving my mom out. Do we have until the end of the month?”

Melinda also stood up. “The end of next month. I wanted to give you as much warning as I could, so you wouldn’t have to rush too much. I know this is a big decision.”

“Thanks. I guess. I just wish things hadn’t turned out like this. Just when you think you’ve got everything under control, something always comes along to mess it up.”

Melinda nodded. “Isn’t that the truth.” A couple of other clichés jumped into her head: “be careful what you wish for” and “look before you leap”, but she avoided saying any more.


Never-ending Keynes

We should all remember the name John Maynard Keynes, born this month, on June 5th, 1883. If this name is not familiar to you, it probably should be. You should also know, in general terms, what the term “Keynesian economics” means. Keynes died in 1946, so it is perhaps understandable that his name has faded from memory, and the influence of the economic theory named after him has declined over the past half-century. But Keynesian effects have continued to be active, even during the reign of presidential administrations that rejected them.

Briefly, the key feature of Keynesian economics is the concept that aggregate demand controls the overall success or failure of an economy. If the aggregate demand for goods and services increases, then the economy expands. If there is a slump in demand, the economy declines. In general, demand comes from consumer spending, but government can influence demand either by purchasing goods and services directly or by distributing money (i.e., through wages or safety net payments) to people who will then purchase goods and services. The effect can be self-supporting; if demand goes up, employers hire more people to meet that demand, which provides more money that will be used to increase demand.
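To put rough, purely illustrative numbers on that feedback loop (my own round numbers, not anyone’s official estimate): suppose households spend about 80 cents of each additional dollar they receive. Then an extra $100 of government purchases becomes $100 of new income, which produces $80 of new consumer spending, which produces another $64, and so on, for a total of roughly $100 / (1 - 0.8) = $500 of added demand, the textbook Keynesian “multiplier.” Run the same arithmetic in reverse and it also shows how a modest drop in spending can snowball into a recession.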

Keynesianism began to lose favor in government during the inflationary times of the 1970s, when the monetarist theories of economists like Milton Friedman were gaining influence. Monetarism is often called “supply-side” economics because it emphasizes controlling the economy by manipulating the availability of money. A monetarist government thus chooses to stimulate the economy by such strategies as cutting taxes on investors, releasing funding for further investments in production facilities and employment. The problem with monetarist stimulus is that there is no incentive for that additional money to go into productive investments. If there is no increase in demand for specific goods that could be produced, any added goods and services will become unsold surplus. In the absence of demand, new investment funds will likely go into purely financial schemes, raising the prices on existing goods or stock prices and encouraging speculative bubbles.

Supply-side strategies (also known as “trickle-down” economics) have failed repeatedly, both during the 1980-1992 Reaganomics period and the 2000-2008 Bush administration. Unfortunately, those who still promote supply-side economics, now in leading roles in the Trump White House, can deny that reality because those failures were largely masked by contributions from Keynesian activity that Reagan/Bush promoters ignore.

Supporters of Reagan and Bush, and of monetarism, argue that the periods of economic growth that occurred during those presidential administrations were the result of the large monetarist tax cuts that were passed at the beginning of each reign, providing large infusions of “supply-side” cash. However, the record is not pure. Both presidents also engaged in what might be called “military Keynesianism”; Reagan massively increased defense spending in pursuit of the cold war, and George W. Bush did the same by involving the U.S. in wars in Afghanistan and Iraq. This was a form of Keynesianism because it involved massive direct purchases of new goods, even if the goods in question were unnecessary and/or destined to be destroyed. Without the increased demand created by military spending there likely would have been minimal or no economic stimulus during the Reagan and Bush years. Admittedly, “military Keynesianism” in the buildup to World War II was also the stimulus that finally pulled the United States out of the Great Depression of the 1930s, but nobody pretends that that effect was caused by purely monetarist policies.

Just prior to World War II there was a non-military form of Keynesianism at work, with the federal government actively hiring people to build infrastructure—roads and dams and bridges and government buildings and art to decorate them and plays and poetry. Money and jobs were pumped out through the Works Progress Administration with a strong preference for labor-intensive projects. This was the most purely progressive form of the strategy, a combination of “employment Keynesianism” and “infrastructure Keynesianism” that proved effective in expanding the economy and helping to alleviate the misery caused by the depression and, in many cases, creating lasting useful structures. Money was also added to the economy through a wide variety of subsidy and grant programs, including Social Security. The key element was that the federal funds were distributed primarily to individuals who would quickly (because of economic necessity) spend them on new goods and services, increasing consumer demand.

Now there is another grand strategy with a new claim to solving all of our economic problems. When you hear about it, it may seem more like avoidance of reality, or an extreme workaround. It has been given the title Modern Monetary Theory (MMT), a name that manages to be both pretentious and decidedly non-descriptive. For fans of economic derivations, the current Wikipedia entry notes that “MMT synthesizes ideas from the State Theory of Money of Georg Friedrich Knapp (also known as Chartalism) and Credit Theory of Money of Alfred Mitchell-Innes, the functional finance proposals of Abba Lerner, Hyman Minsky’s views on the banking system and Wynne Godley’s Sectoral balances approach.” MMT is therefore sometimes known as Neo-Chartalism, a designation that is only minimally more informative.

The essential concept behind MMT is that money is a creation of government, so that when a government borrows money it is really taking back what it originally loaned out; it is not really borrowing. And whenever a government spends money, no matter what it buys, it is adding wealth to the private sector. In short, government deficits and debt are meaningless and government spending is a net positive, so there are good reasons for government to “borrow” money without actually borrowing it from anyone, and without the need to repay it. This process is potentially limited only by the recognition that excessive spending in any sector tends to increase inflation. In the United States, we also have a political limit imposed by the federal debt ceiling statute, but, as we have seen recently, there are ways to get around that.
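
For those who like to see the bookkeeping, the claim that government deficits add wealth to the private sector rests on a standard national-accounting identity—the one behind the Godley “sectoral balances” approach mentioned above, stated here in simplified form and saying nothing about whether any particular spending is wise: (private saving − private investment) + (taxes − government spending) + (imports − exports) = 0. Rearranged, the government’s deficit must equal the private sector’s surplus plus the foreign sector’s surplus; when the government spends more than it taxes, some other sector, by definition, accumulates the difference as financial assets.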

The result is guaranteed to make many people nervous. After all, there are still influential people who desperately want to return to the gold standard, under which every dollar in circulation was backed by, and in theory exchangeable for, the gold stored in vaults in Fort Knox, Kentucky. But at least the current dollar has the imprimatur of the federal government. In the era of purely private fabrications like bitcoin, MMT doesn’t sound like such a stretch. We have, perhaps, recognized that money is simply another tool: an accepted medium with no real value in itself beyond its utility in exchanging things that do have recognized value.

Modern Monetary Theory can, obviously, be very useful when applied to Keynesian strategies. Do we need to create and rebuild our infrastructure? The short answer, by the way, is yes. And the answer also could be MMT, which would allow the federal government to spend vast amounts of money and to create millions of well-paying jobs building and rebuilding roads, bridges, housing … whatever. Just as obviously, MMT can also be used in a monetarist manner, allowing government to expand the supply of money without any concern whatsoever regarding what the private sector will do with it.

We have recently seen examples of what MMT can look like. The U.S. war in Iraq was financed “off the books” in an effort by the administration of George W. Bush to pay for the war without exploding the national deficit. They would not have admitted it, but this was an example of pseudo-MMT combined with military Keynesianism, with the caveat that the money was, in the end, borrowed; later the total cost was quietly folded into the national debt. As the Iraq war continued, the world economy was devastated by the 2007 collapse of the mortgage finance bubble. One of the methods used to pull us out of the resulting Great Recession was something called “quantitative easing”: essentially, the Federal Reserve created trillions of dollars out of nothing and used them to buy much of the bad debt held by large banks. This was MMT in the service of monetarism, shoring up the financial sector—not to create new jobs or build infrastructure, but to maintain the jobs and institutions that were already there and that were threatened with collapse. The Fed’s massive expenditures made their way into the economy without adding to the national debt.
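
For the mechanics-minded, here is a simplified sketch of what a quantitative-easing purchase looks like on paper (ignoring the primary dealers and the actual mix of Treasuries and mortgage-backed securities that were bought): the Fed credits a bank’s reserve account with newly created reserves and moves the bank’s securities onto its own balance sheet—

Fed: +$1 trillion in securities held (assets), +$1 trillion in bank reserves (liabilities)
Bank: −$1 trillion in securities, +$1 trillion in reserves (a balance sheet of the same size, but with safer assets)

No Treasury debt is issued to fund the purchase, which is why trillions of dollars could enter the financial system without any of it showing up in the national debt.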

There is now an active constituency on the political left promoting the use of MMT to stimulate the economy, specifically to expand government employment, build infrastructure, and perhaps create a minimum guaranteed income. These are certainly noble goals and tempting strategies. The trouble with such ideas has little to do with the policies that have been proposed so far. The real problem is demonstrated by the two examples we have already seen, the Iraq War and the Fed bailout. The real danger is that once you open up MMT for progressive efforts, you also open it up for all of the other possibilities that are far less constructive: for waging massively expensive and unnecessary wars, for saving corporate CEOs from the consequences of their risk-seeking behaviors, for subsidies and tax cuts and “no-bid” contracts for favored corporations and industries and oligarchs—for the perpetrators, rather than the victims, of the failures of capitalism. In our current system, the uses of MMT will most likely be monetarist rather than Keynesian, and the beneficiaries of the government’s ability to create unlimited MMT currency will be the same wealthy oligarchy that now benefits from our deficit-funded largesse, but with far fewer controls or limits. We need those limits.


The Plutocrats Strike Back

During his campaign in 2016, Donald Trump made an effort to reach out to the Black community, repeating the phrase, “What do you have to lose?” It was clear to many of us—including 90-some percent of that community—that we in fact had a great deal to lose if he won the election. We are now beyond the first 100 days since the Trump inauguration, and the surprise is not what he has been doing but how fast he and his congressional allies have moved to reverse several decades’ worth of progress in social programs and policies favoring equity and diversity. Many Trump voters are also discovering that they had something to lose: federal support programs, environmental controls, and consumer and worker protections, among others. It is almost as if the oligarchs—those who own the GOP and, apparently, Trump—are moving to punish all of us who presumed to vote for Bernie Sanders and for the other candidates who pledged to rein in Wall Street and “drain the swamp”.

Almost immediately upon Trump’s inauguration, the White House website removed the pages dealing with civil rights, LGBT rights, and global warming. This was the early warning sign. Trump then nominated cabinet leaders who have a history of opposing the mission and activities of the agencies they were selected to lead. The Bush (43) and Reagan administrations also did this with some agencies, but Trump has expanded the strategy. His appointees have now moved in many ways to purge environmentalists from the EPA and consumer advocates from the Consumer Financial Protection Bureau, and in general to make federal agencies more “business-friendly”.

Through a series of executive orders and memoranda, he also canceled a scheduled cut in FHA mortgage insurance premiums (raising costs for new FHA borrowers), placed a freeze on all new federal hiring (except for the Border Patrol, to which he wants to add 5,000 members), required all reports and publications from the EPA and USDA to be approved by the White House, reinstated the Reagan-era “Mexico City” ban on funding for NGOs that provide abortion information, called for reversal of the federal actions blocking the Dakota Access and Keystone XL pipelines, and attempted to impose a total ban on travelers and refugees from seven majority-Muslim nations. He also moved to reverse an Obama-era regulation that required investment advisers to work on behalf of their clients (rather than promoting the investments that mostly benefit the adviser).

Meanwhile the Republican-led Congress, reinvigorated by its self-proclaimed “massive electoral mandate”, moved forward with plans to cripple the advances made by the Affordable Care Act (“Obamacare”), to drastically restructure Social Security, Medicare, and Medicaid, to slash taxes on corporations and the wealthy, to funnel vast new funding to the military-industrial complex, and to roll back Dodd-Frank and most of the financial controls that were enacted to stabilize the economy after the 2008 recession. This is just the beginning.

Obviously the next two years of Republican dominance will bring a wholesale assault on the social safety net, the environment, consumer rights, minorities, immigrants—all of the targets candidate Trump promised to attack, and many more, though not, of course, the financial sector or imports from China, the two he promised to push back against. It therefore seems like a good time to pass out virtual awards recognizing the efforts of the people who made all of this possible:

The Clarence Thomas “raise the ladder behind you” award to everyone who is now on Social Security and Medicare and who voted for Republicans. It has, after all, been obvious for decades, to anyone who has been paying attention, that the Grand Old Party is dedicated to getting rid of all vestiges of the New Deal and the Great Society. Yes, the GOP politicians have claimed that they will protect the benefits of current recipients (a large part of the GOP voting base). Unfortunately, that means that anyone who will depend on Social Security and Medicare in the future is out of luck. So, regarding the aging Republican base, we might ask why so many current seniors don’t give a damn about retirement funding for their children and grandchildren. Yet the joke may be on them (and on the rest of us) as well: Trump’s choice to head the Office of Management and Budget, Mick Mulvaney, believes that he was chosen to promote significant cuts in Social Security and Medicare benefits for current recipients, despite assurances from Trump and the GOP.

The Wizard of Oz “don’t look behind the curtain” award to the GOP-controlled Congress. Now that Republicans have control of both the legislative and executive branches (and soon the judiciary also?) they are pushing forward with their true agenda—privatizing both Social Security and Medicare and, ignoring their promises to older voters, cutting cost-of-living increases for current recipients and slashing Medicare funding. And as for the other remnants of the social safety net, expect Medicaid and SNAP (food stamps) funding to be slashed, and those programs to be converted to block grants so that the states can take all that federal money and use it for something else, after private-sector money managers take their cut. You can almost see Wall Street salivating as it pushes its lobbyists forward to “help” write the new rules.

The Chauncey Gardiner “see what you want” award to all those who selectively ignored political promises that clearly would adversely affect them: a two-way tie between (1) union members who voted for Trump despite his proven anti-labor, anti-worker record and his expressed desire to lower the minimum wage, and (2) anyone dependent on the Affordable Care Act who voted for Republicans anyway, despite their repeated efforts to repeal it. Yes, President Trump is fulfilling his promise to withdraw from the Trans-Pacific Partnership (TPP) and may yet rewrite NAFTA, but his cabinet choices show that he also plans on rolling back decades of progress in worker rights, consumer protections, and health insurance. Two examples are fast-food executive Andrew Puzder as Secretary of Labor and Tom Price at Health and Human Services. As for those people who believed Trump when he said he would create more jobs? Sorry, neither he nor his GOP colleagues know how to do that, but they clearly know how to hold the line on raising wages and benefits for workers.

The Ralph Nader “doomed to repeat history” award to Jill Stein and Green Party voters who ignored the fact that one, and only one, of the two viable nominated candidates (and, likewise, only one of the two major parties) supported the EPA and the established science on global warming. The GOP candidate, and his party, were blatant climate deniers. The Democrats were not. Instead of recognizing this, the “Greens” focused on Clinton’s corporate ties and her past support for NAFTA, and their overemphasis on the green of money rather than the green of our planet had the effect of helping depress the leftist vote.

The P.T. Barnum “one born every minute” award to President Donald Trump. He used his reality TV strategies effectively to keep his name in the media, giving him an endless run of free publicity. He embraced the old adage that “all publicity is good publicity” and took it to the extreme, not apologizing for whatever antisocial behavior or lies he had exhibited (in fact, often denying or refusing to acknowledge the existence of his own actions or statements). He and his team identified many current popular mythologies, unfounded fears, and unrealistic desires and used them to craft a series of broad and delusive promises to attract people who wanted to believe. Unfortunately too many people did accept his lofty promises, despite his continuing failure to provide any details about how he would implement them.

The Thomas E. Dewey “victory is ours” award to all the polling companies, analysts, and media outlets that were giving Clinton a “90-plus percent likelihood” of winning the election. The more likely truth is that their projections had the effect of depressing turnout for Clinton and energizing the Republican base. State-level polls and the media myth of the “solid blue wall” of Midwestern states also meant that the Clinton campaign reduced its efforts in “safe” states like Pennsylvania, Wisconsin, and Michigan.

The J. Edgar Hoover “purposeful public relations” award to James Comey, the supposedly non-partisan FBI director who, claiming the necessity of keeping congress informed, repeatedly raised the issue of Clinton emails and characterized Clinton as “extremely careless”, while simultaneously suppressing any information about ongoing investigations into Russian involvement in the Trump campaign. Given his past partisan behavior, both during the 1990s Whitewater investigations and in his support for enhanced interrogation during the Bush era, Comey should never have been appointed as FBI director. He has now been fired, a highly political act rationalized by bogus charges, so his short career mirrors, in abbreviated form, Hoover’s rise to prominence and rapid fall into dishonor.

The Robert Heinlein “alternative reality” fiction award to Sean Spicer for his continuing efforts to explain why his boss said the things he did, or why he didn’t say what we thought we heard, or why what he said was really what he believes and what we should believe as well. There are countless examples of Spicer’s effectiveness in his role as media interpreter for the Trump campaign and the Trump White House, far too many to list here and many of them predating Kellyanne Conway’s use of the descriptive phrase “alternative facts”. Thanks to her, we now have terminology that can be applied whenever we perceive evidence that Trump spokespeople like Conway and Giuliani and Spicer are indeed inhabiting a parallel universe.

Thanks in large part to all of these people, we now find ourselves in the alternative Trump universe, the one in which the empire strikes back. The resulting resistance has been active and has forced some concessions, but as Steve Bannon recently noted about the administration he serves, “We didn’t come here to do small things.” Sadly, the Trump/Republican victory program will continue for at least the next two years, limited only by a crude tool called the filibuster, which can all too easily be bypassed. I would say that democracy rules, but, as in 2000, we can’t even say that. Mazel tov, and may you live in interesting times.


Filibuster Bust

Is the filibuster dead? Was last week’s Senate move “unprecedented”? Was it “unthinkable”? Did it ever deserve the hyperbolic appellation “nuclear option”? Did the media overreach when they used such language? The event was certainly significant in its effects, but not unexpected. What happened was that, on April 5, 2017, Democrats in the U.S. Senate filibustered the nomination of Neil Gorsuch to the Supreme Court. Normally, this would mean that a vote of 60 or more senators would be needed to end the filibuster and allow the appointment to proceed, but the GOP didn’t have the votes for that. Therefore, one day later, the Republican leadership decided to bypass the whole thing by changing Senate rules. Such a rule change only requires a majority of votes. Their embrace of this tactic, which Senator Trent Lott first referred to as the “nuclear option” back in 2005, means that the filibuster can no longer be used to block approval of Supreme Court nominees.

The media have repeatedly hyperventilated about these events. Admittedly, they tend toward exaggeration on many news stories these days, an obvious attempt to bump up ratings, and their bluster often distorts the story. In this case, in their nearsighted focus on this one so-called “historic” decision to bypass the filibuster, the news coverage ignored the issues involved in the Gorsuch nomination and the long, gradual degradation of the filibuster and of Senate debate rules in general. It also tended to ignore the one truly unprecedented event related to the Gorsuch nomination. More about that later.

For many people (if they recognize the word filibuster at all) the term brings up images of Jimmy Stewart in the movie classic Mr. Smith Goes to Washington, with Senator Jefferson Smith (Stewart) standing behind a podium for 24 desperate hours attempting to block an appropriations bill and redeem himself. Similar popularized scenes in recent years have involved State Senator Wendy Davis on the floor of the Texas Senate and U.S. Senator Ted Cruz reading a book by Dr. Seuss. But, these days, those are the exceptions, not the rule. The filibuster is a pale ghost of what it used to be, and the most recent imposition of the “nuclear option” is only the latest small step in the decades-long decline of this once-powerful strategy.

First we should look at the historical reality. Some news coverage has chosen to defend the filibuster, and bemoan its decline, as if it were the key to a long and presumably honorable Senate tradition of extended debate and the primary tool supporting the rights and interests of the minority caucus. This can perhaps best be characterized as upholding a mythological ideal. The reality is that the filibuster has almost always been about blocking legislation, not about debating the issues, a fact that the wide-ranging irrelevancies of the Ted Cruz monologue clearly demonstrated. Outside of the filibuster, everyday debate in the Senate usually does discuss issues, but the intended audience is almost always the media, not the members of the body itself—if you take the time to view these speeches you will note that the floor of the Senate is almost always empty, and only the C-SPAN cameras are listening. As for protecting the interests of the minority group, there is some justification there, but it has most often been abused.

The decline of the filibuster goes back at least a century. Before 1917, Senate debate could be extended indefinitely by one senator, or by a group of them working as a tag team; in that year the Senate created a provision that debate could be ended by a two-thirds vote (a process called “cloture”). The supermajority requirement still left the filibuster potent—in 1946, for example, a bill against employment discrimination was blocked for weeks. That remained the system until the 1960s, when a series of filibusters by southern senators, including the months-long filibuster of the Civil Rights Act of 1964, inspired a move to allow the Senate to avoid lengthy shutdowns during extended debates. A rules change created a “two-track” system in which unrelated legislation could be assigned specific working hours separately from the filibustered legislation, allowing other business to be considered and other bills passed while the filibuster continued.

The effectiveness of a filibuster was further adjusted in two ways in 1975. The required vote for cloture was reduced from two-thirds of senators present and voting to three-fifths of the full Senate (60 votes), making it easier to end any debate. At the same time, the Senate began trying out an agreement that allowed non-speaking filibusters, and the “virtual filibuster” became a possibility, making it much easier for any senator to block legislation. Soon the “talking filibuster” was almost entirely a thing of the past, and the number of filibusters began to increase.

Admittedly, the increase in filibusters was not entirely due to the changes in rules. The past three decades have also seen a significant growth in partisanship. It is not only that the two political parties have widely differing political philosophies, which they do. It is almost impossible today to create a piece of legislation that can avoid strongly held objections from one or more of the larger caucus groups within Congress. But the two parties have also increasingly worked to enforce party discipline for partisan gains. During the Obama presidency the Republican Party raised obstruction to a new and extreme level, refusing to allow legislation to be considered by the Senate if it had been passed by the Democratic-controlled House or if it was supported by President Obama. What they couldn’t stop with a record number of virtual filibusters they blocked with filibuster threats and other parliamentary obstruction. In Obama’s first two years, GOP senators blocked passage of 99 percent of the bills created in the Democratic-led House (see, for example, my notes about GOP obstruction at that time). Note: That was before Mitch McConnell, the leader of the Senate GOP, made his infamous statement that his primary goal was to make Barack Obama a one-term president.

Both parties have also increasingly recognized the continuing influence of the third branch of government, the judiciary. A large number of the filibusters since 1990 have been aimed at judicial nominations, as each president has tried to put his imprint on the lifetime appointments within the judicial branch and the senators in the opposing party have done everything they could to obstruct as many of them as possible. The 2005 creation of the “nuclear option” concept and terminology was in response to repeated Democratic threats to block the approval of judges nominated by President Bush. That time it was avoided by a bipartisan agreement. Eight years later the situation was reversed, with Republicans refusing to consider any of President Obama’s nominations or to offer a compromise, and the Democrats went “nuclear”, in a limited sense, bypassing the filibuster for appointments of federal judges below the Supreme Court level.

Given all of the incremental changes that have brought us to the current impasse, and the gradual loss of the filibuster, if it can be called a loss, we must recognize that the Senate’s approval of Judge Neil Gorsuch was nothing more than a small and entirely predictable step. It was all but inevitable given the partisan divide and the results of the 2016 elections. The significance of this change to future decades will depend on the health and retirement decisions made by the current justices of the Supreme Court and the outcomes of the federal elections of 2018 and 2020.

However, that doesn’t mean that nothing significant or dramatic occurred in the attempt to replace Antonin Scalia. It simply means that the “nuclear option” was not the real story. There was in fact one unprecedented event (or non-event): the year-long Republican Senate decision to ignore President Obama’s selection for that position, a well-qualified (and relatively moderate) jurist named Merrick Garland. That was an unjustifiable and extremely divisive—and arguably unconstitutional—partisan act. The media should feel shame for their lack of emphasis on that historic story during the months leading up to the November 2016 election, and all Republican members of the Senate—every one—should feel ashamed for their willing participation in that effort.


Base Psychology

A while back, during a dinner out with friends, the subject of Lawrence Kohlberg came up. Now, you must understand that this is a group of education professionals. These are teachers (mostly retired) who have spent an average of more than thirty years in classrooms at various levels and who are well-versed in both theory and practice. Perhaps predictably, therefore, the other theorist mentioned during this discussion was Piaget. But this is about the Kohlberg reference, which grew out of a general discussion of the current cult of individualism in the United States, the idea that citizens can ignore their connection with their neighbors and with society in general—the idea that they can avoid their obligation to help support the institutions that hold their society together, including public schools.

If you majored in psychology in college, or if you were engrossed in counter-culture theory in the 1960s, you might remember the name Kohlberg. In the event that is not true, or the also likely event that your exposure was so long ago that you have forgotten the details: Lawrence Kohlberg became famous for research leading to his six-stage model of moral development, an adaptation of Piagetian theories of individual human mental growth. The model describes a series of steps through which a person matures from a youthful, largely self-based worldview to more comprehensive levels based on societal rules and recognition of the concerns and opinions of others. Briefly, he argued that people at each stage use different criteria to justify their behaviors:

Pre-conventional levels (self-centered):
Stage 1: Decisions are based on the direct consequences to the self (avoidance of punishment).
Stage 2: Decisions are evaluated according to narrow self-interest (what’s in it for me?).

Conventional levels (obedience to authority):
Stage 3: Decisions are determined by social consensus, approval or disapproval (what do “they” want?).
Stage 4: Decisions based on the legal dictates of authorities (law and order morality).

Post-conventional levels (commitment to principle):
Stage 5: Decisions recognize that laws and social conventions are variable (social contract).
Stage 6: Decisions based on abstract reasoning guided by ethical principles.

Kohlberg regarded upward movement through these stages as preferable, arguing that justice is better served and behavior is more responsible and predictable among people at the higher levels. Most discussions of his stages during their public heyday half a century ago also clearly interpreted stage 6 as the ideal societal level and highest personal goal, albeit one that few individuals ever achieve. Much of that reasoning argued that if we look at human relationships within a complex society, and the need to equitably create and apply policy in a democratic system, it is clear that determinations at the post-conventional level (5 or 6) would be preferable to those on the lower stages of this list.

Of course, such a hierarchical preference is not universally accepted. If there is one thing we must have learned from the 2016 presidential election season, it is that there are millions of people who believe that a flexible approach to public policy, and especially to law and order, is a terrible idea, perhaps even a dangerous one.

Consider the widespread movement against Hispanic immigration. It certainly includes elements of xenophobia, racism, and economic insecurity. The key word applied to immigrants without documents, however, is “illegal”. Those who oppose amnesty for, or compromise with, “illegals” frequently insist that no such adjustments are appropriate because those who have entered the United States without permission have broken the law. “Illegal” people are criminals, and as such they do not deserve recognition, or empathy, or the privilege of residence in the U.S., or drivers’ licenses, or public education. Most of all, they do not deserve forgiveness. A related fact is that the same leaders who oppose rights for “illegals” also usually reject Muslims (another non-conforming group) and have worked to block felons from voting, for life. Felons, too, do not deserve forgiveness, even after they have served their time in jail. As if that weren’t enough, the same groups have responded to Black Lives Matter protests by (1) portraying the victims of police violence as “thugs” and criminals, and (2) rejecting programs intended to reform police procedures while calling for oppressive programs such as “stop and frisk” and profiling.

Now, before someone accuses me of diminishing the importance of racism, let me point out that the emphasis on social consensus and group approval within the two conventional stages (3 and 4) would naturally correlate with rejection of people who are seen as non-conforming, or “other”. Racism and sexism and religious sectarianism and rejection of LGBT “perversions” all fit in well with a personal philosophy based on group approval and social conformity. Cultural consensus demands that everyone adapt to the physical and behavioral characteristics of the majority. People who differ from the norm, whether through skin color or religion or language or whatever, are automatically considered less than acceptable, perhaps even less than human. They deserve to be separated, perhaps to be removed and to be deprived of the rights allotted to conforming individuals. And oddly enough, they are misjudged and rejected as individuals, even if the judgement is based on their membership in a rejected group. After all, like the poor and criminals and LGBT people, they clearly have the option as individuals to transcend their group origins, the same way, for example, that Colin Powell transcended his blackness to be accepted.

By the way, Kohlberg’s levels do not necessarily correlate with social success or income. Surveys have shown that Trump supporters and Tea Party members, largely conventional-level people, have a median income well above the national median. Looking at behavior and expressed attitudes, it could also be argued that a large percentage of our most wealthy citizens, the hedge fund managers and private-equity manipulators, behave in ways that are consistent with the narrow self-interest of stage 2. In San Francisco, a city in which wealthy Silicon Valley employees have squeezed out ordinary workers, one tech industry investor wrote, “I know people are frustrated about gentrification happening in the city, but the reality is, we live in a free market society. The wealthy working people have earned their right to live in the city. They went out, got an education, work hard, and earned it.” And the workers in the city who haven’t “earned” enough? I guess they deserve to suffer long commutes, or maybe even homelessness. As for the people who have really, really “earned” it, the upper 5 percent, I suppose they also have the right to special perks like tax breaks and other tools that let them avoid supporting the infrastructure they depend on and helping people who evidently don’t deserve a decent lifestyle. There are legions of examples in which vulture capitalists have extracted billions for themselves through hostile takeovers that effectively bankrupt functioning organizations and throw thousands of people out of work. Self-centered? Absolutely. Socially corrosive? Absolutely.

Currently, the United States has a political majority that kowtows to an electorate motivated at the conventional stages (3 and 4) and a financial system dominated by pre-conventional thinkers (stages 1 and 2). Even worse, that self-aggrandizing financial leadership also has significant influence in the political sphere. Is it any wonder our country is in such bad shape?

In contrast, people who believe that social conventions are variable, or who model their behavior on abstract reasoning and principles (those at the post-conventional stages), tend to be more empathic with non-conforming “others” and less likely to reject them. Such people are more likely to support a diverse, inclusive society. They are less likely to automatically assume that poor people have caused their own misfortune, and therefore more likely to support redistributive and social-support systems (e.g., “safety net” programs and progressive taxation). They tend to believe in and promote community interrelationships rather than individualistic isolation.

Perhaps Kohlberg was right, along with all those sixties fans of his, that his six stages really are a hierarchy, and that as far as society is concerned, people who display the characteristics of the top two stages are preferable—not only preferable, but necessary.
