The U.N. Convention on the Prevention and Punishment of the Crime of Genocide defines genocide as acts committed with the intent to destroy, in whole or in part, a national, ethnical, racial or religious group. The term did not exist until it was coined in 1944 by Raphael Lemkin, a Polish-Jewish lawyer seeking to describe Nazi policies of systematic murder, including the extermination of European Jews.
While this definition has generally been accepted, countries have traditionally been reluctant to recognize genocidal activity outside their own boundaries. This reluctance, unfortunately, has most often allowed genocidal extermination to continue under cover of "national sovereignty." (It took the U.S. 40 years, from 1948 to 1988, to finally ratify the U.N. Convention.) Consequently, we continue to see genocides happen right before our eyes, and we do nothing about them.
Case in point: less than 9,000 miles from Washington, D.C., Myanmar soldiers are burning Rohingya infants alive, gang-raping teenagers, shooting villagers fleeing their homes, and wiping out entire villages, while the world continues to contemplate whether to define what is taking place as genocide, ethnic cleansing, or just an internal military action. Since late August, in a span of just eight weeks, Myanmar's military has killed thousands and forced 600,000 surviving Rohingya Muslims, 58 percent of whom are children who witnessed atrocities no child should ever see, to flee to Bangladesh.
The sentiment of most observers is that, if history is a guide, the international community will stand idly by. Even though U.N. Secretary-General Antonio Guterres has called it "an absolute priority" to stop all violence against Myanmar's Rohingya Muslims, and the Security Council has issued a statement condemning the violence, the organization has stopped well short of identifying the activity as genocide. If it did, the U.N. would be legally bound to intervene, which is why most member states are reluctant to initiate such a move. Myanmar's State Counsellor Aung San Suu Kyi, a Nobel Peace laureate facing widespread criticism for not vocally objecting to the violence, was never even confronted with the subject when she attended the ASEAN Summit in Manila this past week.
This callously predictable non-response echoes the world's equally unresponsive reactions to horrific events throughout history.
During and immediately following WWI, Turkey killed, deported and starved to death as many as 1.8 million Armenians. Modern Turks generally refuse to acknowledge that what happened was genocide. However, most scholars consider it an orchestrated effort to exterminate an unwanted ethnic group that had lived within the borders of the crumbling Ottoman Empire for centuries. The world just watched.
From 1939 to 1945, during the Holocaust, when the Nazis systematically killed 11 million people, 6 million of whom were Jews, the world looked the other way. There was ample evidence of what was taking place, but the information remained classified, and reports were either denied or catalogued as "unconfirmed" while millions were literally exterminated.
In 1975, in Cambodia, the Khmer Rouge, seeking to establish a "Communist utopia," annihilated two million people (20 percent of the population) who were considered "enemies of the state." We all knew, but we declined to interfere.
In January 1994 the commander of the U.N. troops in Rwanda was warned that a plan for genocide was in place. His intent to act was nixed by his superiors at the U.N., and most of the roughly 2,500 peacekeeping troops were withdrawn from the country. The U.S. government avoided admitting that the subsequent massacre constituted genocide, arguing that we had no business involving ourselves in the internal conflict of another country. Within a 100-day period the Hutu majority killed an estimated one million Tutsis, 70 percent of that ethnic group and 20 percent of Rwanda's population.
Other examples are plentiful. The world shamefully watches, head in sand, claiming "not our problem."
Gregory Stanton, president of Genocide Watch, lists eight stages of genocide: Classification (us against them); Symbolization (attaching labels); Dehumanization (denying the humanity of the other group); Organization (training and planning for genocidal killings); Polarization (involving propaganda and passing new, discriminatory legislation); Preparation (identifying the victims); Extermination (the killing begins); and Denial (it's the victims' fault, hide the bodies). At each stage, preventive measures can stop the process.
Genocide is the world's worst intentional human rights problem. But it is different from other problems, and it requires different solutions. Because genocide is almost always carried out by a country's own military and police forces, the usual national focus on law and order cannot stop it. International intervention is usually required. However, because the world lacks an international rapid response force, and because the U.N. has so far either been paralyzed or unwilling to act, genocide continues to go unchecked.
With genocide and ethnic cleansing in Myanmar continuing unabated, we should keep in mind that the numbers being reported from the area are not just statistics; they represent real people.
Thursday, November 23, 2017
Saturday, November 11, 2017
THE POLITICS OF EMPATHY
Empathy is defined as "the ability to understand and share the feelings of another." The term has been used in conjunction with "sympathy" and "compassion," and has surfaced occasionally and conceptually during political campaigns. While "sympathy" emphasizes feelings of pity and sorrow for someone else's misfortune, "compassion" refers to sympathy accompanied by a strong desire to alleviate the suffering. A series of studies has claimed to show that conservatives score lower on empathy than liberals. Republicans contest these findings, arguing that such studies tend to favor Democratic principles of compassion and care over Republican philosophies of autonomy and self-help. (Mark Honigsbaum, The History of Emotions Blog, Dec. 3, 2012.)
While the discussion of these concepts may appear somewhat convoluted, there are substantial practical consequences at play when individuals act on them, especially when those individuals hold leadership positions. The degree to which our leaders exhibit any or all of these qualities influences their reaction to situations that require a decisive response. Political consultants from both parties argue that people want many things from their president, but near the top of that list is the ability to play consoler-in-chief when the moment demands it. Unfortunately, we have had too many opportunities to test that ability this year.
Psychologists have argued that empathy is not helpful in public discourse or decision making because it is biased (Paul Bloom, Yale). Studies show that it is dampened or constrained when it comes to people of different races, nationalities or creeds. Daryl Cameron, a social psychologist at the University of Iowa, talks about the "collapse of compassion." He and others make the point that "empathy is actually a choice" (New York Times, July 10, 2015). And, to a significant degree, that is the point. The extent to which our policy makers have a focused empathetic capacity often dictates the substance of their decisions.
To illustrate, contrast our response to the devastating effects of Hurricane Maria on the U.S. territory of Puerto Rico with our actions after the January 12, 2010 earthquake that shattered Haiti. Before dawn, the day after the earthquake hit, the U.S. mobilized as if it were going to war. An Army unit was airborne to take control of the main airport. Within two days we had 8,000 troops en route. Within two weeks 33 U.S. military ships and 22,000 troops had arrived, and more than 300 military helicopters delivered millions of pounds of food and water. The morning after the earthquake the president proclaimed that we were going "to respond in Port-au-Prince robustly and immediately," which gave the entire government clarity of purpose (Washington Post, Sept. 28, 2017). One week after Maria hit Puerto Rico, seriously affecting the lives of 3.4 million U.S. citizens, supplies were still not flowing. A few days later just 4,400 government employees were participating in federal operations to assist the devastated island, and about 40 helicopters were helping to deliver food.
The "conversation" between Carmin Yulin Cruz, Mayor of San Juan, and President Trump might be indicative of the role empathy played in our response to the devastation. Ten days after the hurricane hit, Mayor Cruz pleaded for more federal assistance, saying: "We are dying, and you are killing us with the inefficiency and the bureaucracy. This is what we got last night: four pallets of water, three pallets of meals and twelve pallets of infant food - which I gave to the people of Comerio, where people are drinking out of a creek. I am done being polite. I am done being politically correct. I am mad as hell." Trump's response, tweeted from his New Jersey golf club, was: "Such poor leadership ability by the mayor of San Juan and others in Puerto Rico who are not able to get their workers to help. They want everything to be done for them when it should be a community effort," while referring to Puerto Ricans critical of the response as "politically motivated ingrates."
Even though analysts may not concede a definitive relationship between a leader's empathy and the intensity of a response to the needs of a population, these two examples strongly suggest that such a connection exists. President Obama's compassion for the plight of the people of Haiti prompted a massive outpouring of assistance. While hurricanes Harvey and Irma did tremendous damage in Texas and Florida, existing infrastructure support systems kicked in automatically. Puerto Rico, however, an offshore territory, required empathetic leadership from the top to counteract its total devastation. The ensuing back-and-forth between President Trump and his critics on the island appeared to affect our federal response. Mr. Trump seemed more consumed by his tweeted criticism of demonstrations by NFL players than by the calamity experienced by Puerto Ricans. Only when others in his administration recognized the political fallout of our tepid reaction was he given a teleprompter speech designed to express his concerns. Unfortunately, his impassive delivery, lacking appropriate inflection, failed to convey sincere compassion for the victims' plight. Mr. Trump's repeated references to the island's pre-existing financial problems, to the enormous "budget-busting" cost of eventual reconstruction, and to the "limited number of deaths" compared with previous hurricanes elsewhere highlighted the administration's insensitivity.
The question may well be asked whether leaders with significant narcissistic inclinations possess a demonstrable empathetic capacity. Empathy is a choice, and, coupled with politics, such choices can have significant consequences.
ARROGANCE AND IGNORANCE, A LETHAL COMBINATION
A little more than two years ago I published a letter chastising Congressional opponents agitating against ratification of the Joint Comprehensive Plan of Action, better known as the Iran Nuclear Deal, reached in Vienna on July 14, 2015 between Iran, China, France, Russia, the United Kingdom, the United States, Germany, and the European Union. The political push-back reached fever pitch, even bringing Israel's prime minister to Washington to argue against it before Congress. My take on the conversation back then was that ignorance permeated the dialogue. Steadily escalating sanctions, dating back to 1979 and enhanced in 2006 with the added support of Russia and China, had not slowed Iran's nuclear enrichment program. In fact, by 2015 Iran was only 6 months away from developing the capability to field a nuclear weapon, and the sanctions regime had already begun to unravel. This was a multilateral agreement; our participation was not essential, although it did give the arrangement significantly greater weight.
Iran essentially agreed to reduce the number of centrifuges allowed to enrich uranium by 75 percent over a period of 10 years, while committing not to enrich uranium to a level sufficient to build a nuclear bomb for 15 years. Its underground Fordow facility would also stop enriching uranium for at least 15 years. The International Atomic Energy Agency was charged with ensuring compliance. During the past two years most of the original opponents of the deal have come around to agree that this narrowly nuclear-focused agreement has ultimately benefitted the region, notwithstanding Iran's continued aggressive behavior in other arenas.
The facts have not changed. The IAEA, all non-U.S. signatories, and our own cabinet members agree that Iran remains in full compliance with the agreement. Ehud Barak, former prime minister and defense minister of Israel, known for his hawkish views on Iran, agreed that this nuclear agreement, being a "done deal," had been beneficial both to Israel's security and to reducing the volatility in the region. He referred to it as a "bad deal, but necessary."
Enter our 45th president. On October 13 Donald Trump announced that, contrary to the advice of all relevant, informed members of his administration, he would not certify Iran's compliance with the agreement, something he is required by law to do every 90 days. While asserting that he knew better than anyone else, yet showing little or no understanding of the content of the deal, he gave Congress 60 days to either re-impose the sanctions that were lifted in exchange for Iran capping its nuclear activities, or do nothing. He has been adamant that if Congress decides not to act he will terminate the agreement altogether. Although his staff attempted to put a positive spin on his announcement, this arrogant, ignorant, unilateral action could have serious consequences.
If hawks in Congress push through a law demanding further concessions, Iran may be provoked to abandon the deal, eject inspectors, and accelerate its nuclear program. Given its capability two years ago, it could likely produce a bomb within a relatively short period. Iran as an aggressive state without nuclear weapons in its perceived sphere of influence is, at best, annoying. Iran as an aggressive state with nuclear capacity is outright dangerous. It would escalate tensions in the Gulf and increase the risks to our military facilities in the region. Saudi Arabia, Turkey, and Egypt, among others, may also feel pressured to acquire a nuclear capability.
Britain, France, the European Union, Russia and China have already announced that they will continue to support the agreement as written, deploring Trump's move as unwarranted and dangerously destabilizing. Our relationship with China may be affected as well, since its attempt to mediate between us and North Korea becomes increasingly difficult. Pyongyang will have even fewer reasons to negotiate an agreement with us once it recognizes that we lack credibility and could walk away whenever we want. As a consequence, we could find ourselves fighting nuclear antagonists on two fronts.
Since his inauguration, we have learned that President Trump shoots from the hip and makes a point of ignoring the advice of seasoned, rational people. We have already exited the Trans-Pacific Partnership and the Paris Climate Accord. We recently announced that we would leave UNESCO, and we are soon expected to abandon NAFTA. Few of these moves apparently involved intelligent dialogue with stakeholders. During an interview with Megyn Kelly last year, Mr. Trump claimed that he was too busy to bother reading books, insisting he read passages, or sometimes chapters. Jon Meacham, accomplished biographer of numerous presidents, observed that "Trump came to the office warped by self-absorption, conceit, and a narcissistic certitude that he is always right while the rest of the world, unless it is busy flattering him, is wrong, even hostile." (Jon Meacham, "The Strength of Humility," Vanity Fair, October 2017.) His bellicose rants designed to antagonize North Korea and his imminent decision to exit the Iran Deal are conceived in that same mindset of narcissistic arrogance and evident ignorance, which is not just dangerous; it could kill us.
No wonder Congress is considering legislation that would bar the president from launching a nuclear first strike without a declaration of war from Congress. As things stand now, the Atomic Energy Act of 1946 gives the president sole control. He could unleash the apocalyptic force of the American nuclear arsenal on a whim, within minutes.
God help us!
Wednesday, November 8, 2017
WILL SEPARATIST MOVEMENTS LEAD TO FRAGMENTATION OF THE EUROPEAN UNION?
The "Peace of Westphalia," which was concluded in 1648, ended the 30 years' war fought between Catholics and Protestants in Northern Europe, and the 80 years' war between Spain and the Dutch Republic. This historic event was said to have created a basis for the concept of national self determination.
On January 8, 1918, 270 years later, U.S. President Woodrow Wilson delivered his "Fourteen Points" speech, a statement of principles for peace negotiations to end World War I. His proclamation asserted that "the right of people to self determination is a cardinal principle in modern international law," a fundamental concept later prominently included in Article I of the U.N. Charter. However, the European Union, a supranational organization founded in Maastricht, The Netherlands, in 1993, has remained very quiet on the subject. It essentially pledges to defend the sovereignty of its member states as they are, and jealously guards the Union's cohesion.
Although the E.U. has seen significant growth since its inception, it has recently appeared to lurch from one emergency to the next. The planned British exit (Brexit), the migration and refugee crisis, fiscal problems in several predominantly Southern European countries, and festering populist opposition to relinquishing sovereignty to Brussels count among the most important. Now Catalonia's regional parliament has voted to declare the region an independent republic, a revolutionary act that prompted Spain's national government to assert control over the area, dissolve the Catalan parliament, sack its leaders (one of whom, Carles Puigdemont, promptly fled to Belgium), and encourage 300,000-plus Catalans to join a demonstration for national unity. We might easily conclude that this development could well lead to further fragmentation within the European Union.
While Catalonia's drive for independence has recently captured most of the headlines, we ought to remember that Brexit dominated the discussion in 2016, as did the Scottish independence referendum in 2014. Brexit negotiations are still progressing, albeit very slowly, and Scotland will likely initiate another attempt at independence in 2019, once Brexit is concluded. (Scotland voted 62 percent to 38 percent to remain in the E.U.) Moreover, the consequences for Ireland, and the complex predicament of Gibraltar, a British territory isolated on the southern tip of Spain that voted 93 percent to remain in the Union, are by no means clear. However, thus far few observers suggest that any of this will lead to further unraveling of the European Union.
While Catalonia is featured in the headlines, and while some assert that the disturbances in the region contain some of the same elements that contributed to Spain's civil war in the 1930s, which led to the lengthy dictatorship of Francisco Franco, the Catalan independence movement, although currently the most prominent, is by no means the only identifiable European separatist campaign. In March 2014, 89 percent of the voting public in Venice, Italy, declared in favor of independence, leading to the foundation of a party called "Veneto Si." South Tyrol, which belonged to Austria before WWI, became part of Italy under the post-war Treaty of Saint-Germain. The majority of its population, 70 percent, still speaks German, and still prefers to be aligned with Austria. Similar movements have cropped up in other countries: Denmark has the Faroe Islands; France has Corsica and shares the Basque region with Spain; Belgium has Flanders; Germany, Bavaria; and Ukraine, the Donetsk People's Republic. Thus far none of these appear to have prompted an increased fear of fragmentation among the E.U.'s 28 member states.
History tends to suggest that active separatist movements, while threatening cohesion, may in fact not lead to fragmentation. Some of the most contentious, and substantially successful, drives for independence in Europe actually resulted in expanded E.U. membership. The 1992 dissolution of Yugoslavia created seven new states, five of which have already applied for full E.U. membership. The 1993 breakup of Czechoslovakia, which created the Czech Republic and Slovakia, ultimately resulted in both countries becoming full-fledged members.
One campaign tactic pro-separatist politicians like to use is to suggest that a newly independent state can remain inside the E.U. without suffering any consequences. However, since the E.U. has pledged to respect the sovereignty of its existing members, it will support even heavy-handed efforts by national governments to subdue separatist attempts. Additionally, admitting new states to the Union requires the unanimous consent of all existing member states, none of which has an incentive to reward a movement that could later be encouraged to fester within its own boundaries.
Given today's turbulence contesting traditional governmental authority, we should probably consider revisiting a fundamental question: "What is a nation in the 21st Century?" (See Michael Goldfarb in the N.Y. Times International Edition of October 28, 2017). Is it a country? Is it the same as a state? Does it need to be ethnically cohesive? What does national sovereignty really mean? Albert Rivera, leader of the Spanish "Citizens Party," former member of the Catalan Parliament, decidedly anti-independence, and one of the organizers of the demonstration for national unity, puts it this way: "Catalonia is my homeland, Spain is my country and Europe is our future."
Many Europeans may agree with his sentiment.
Wednesday, September 20, 2017
IMMIGRATION - ECONOMIC IMPERATIVE OR POLITICAL HOT POTATO?
We disagree a lot in this country. A spirited, passionate, boisterous debate is the underpinning of our political system. One fact most of us won't disagree about, however, is that America was built by immigrants. That conviction has been part of our DNA from the very beginning. Tedious legal arguments aside, unless you are a Native American, your ancestors came into this country as immigrants. From 1776 to 2006 we took in an estimated 72 million legal immigrants, about 13 percent of all who ever lived here. By 2006, 12 million lawful permanent immigrant residents inhabited our country, and another 12 million had already become naturalized citizens. (The Globalist, Nov. 29, 2006.) On a typical day we process 110,000 foreigners coming into the country; 3,100 receive immigrant visas, while 1,500 enter illegally. And there is the rub. Unauthorized migration has been our main policy concern, although the vetting of migrants from target countries for potential terrorist ties has recently also become a significant security focus.
From colonial times onward immigrants arrived in waves. Our first settlers came during the early 1600s in search of religious freedom. Persecuted groups like the Pilgrims established a colony in Plymouth, and between 1630 and 1640 some 20,000 Puritans settled the Massachusetts Bay Colony. As early as 1619, some 20 captives from West Africa, who arrived against their will, were forced into servitude. The number of enslaved Africans sadly ballooned to 700,000 by 1790.
During the period from 1776 to 1819 we accepted around 6,500 immigrants each year. From 1820 to 1879, during the "continental expansion" period, this number grew to an annual average of 162,000. And from 1880 to 1924, during the Industrial Revolution, it increased again, to 584,000 per year, before dropping to approximately 178,000 after 1925. (Compiled by Vernon Briggs, Cornell University.)
Throughout our brief history anti-immigrant sentiment, mostly prompted by fear and/or ignorance, has surfaced periodically. The concerns expressed usually included perceived effects on our economy, negative environmental impacts from accelerated population growth, increased crime rates, and changes to traditional identities and values. (Marisa Abrajano, "White Backlash: Immigration, Race and American Politics," Princeton University Press, 2015.) More specifically, these arguments have been, and still are, articulated in terms of "national identity," the fear of losing the identity of the native population to an infusion of destructive traditions, culture, language and politics; "isolation," the fear that immigrants may retreat into their own communities, producing ghettos or parallel societies rather than assimilating into the native culture; and increased competition for scarce resources like social welfare systems, housing, and education. Over the years, opposition to immigration, for whatever reason, led politicians to adjust existing laws.
President John Adams signed the Naturalization Act of 1798, which increased the period of residency required for an immigrant to attain American citizenship to 14 years. The accompanying Alien Friends Act and Alien Enemies Act gave the president the power to deport any foreigner he considered dangerous to the country. During the mid-19th century anti-immigration fervor turned decidedly anti-Catholic, culminating in the "Know-Nothing" Party. The Roman Catholic Church had become the single largest denomination in the U.S., primarily on the strength of immigration from Ireland and Germany. The first significant law restricting immigration into the United States was the Chinese Exclusion Act of 1882, passed by Congress and signed by President Chester Arthur. It imposed an absolute 10-year moratorium on Chinese labor immigration. Chinese laborers had entered the country during the 1850s, first working in gold mines, and subsequently in agricultural enterprises and factories. They were particularly instrumental in building railroads in the American West. As they became more successful, resentment among other workers increased, which eventually prompted Congress to act. A 1917 law required immigrants over 16 years old to pass a literacy test. The Immigration Act of 1924 created a quota system favoring immigrants from Western Europe and barring migrants from Asia. The Immigration Act of 1965 did away with national-origin quotas and allowed citizens to sponsor relatives. Current immigration patterns favor Latin America and Asia.
A plaque installed on the pedestal of the Statue of Liberty bears the text of a poem by Emma Lazarus, which reads in part: "Give me your tired, your poor, your huddled masses yearning to breathe free." These words have long been sentimental favorites of many interested in immigration and immigration policy. However, they probably had limited applicability even when written in 1883. Immigration has always served an economic purpose. Whether we are talking about colonial development, indentured servitude, Chinese railroad workers, factory workers, or agricultural laborers harvesting our crops, legally or not, we, as a country, would not have had the economic success we are so eager to flaunt without this labor pool. If immigration had ceased with the signing of the Declaration of Independence, our population would now probably stand at around 125 million. Immigration fuels the economy. Immigrants increase our productive capacity and raise GDP. The "immigration surplus" has been estimated at $36 to $72 billion per year. (Pia Orrenius, "Benefits of Immigration Outweigh the Costs," George W. Bush Institute, 2016.) The current debates about eliminating the DACA program, constructing a demonstrably ineffective wall on our Southern border, and scaling back the H-1B visa program, which admits highly skilled foreign workers, could have a damaging effect on our economy. Deporting the 800,000-plus "dreamers" enrolled in the DACA program alone could cost our economy more than $400 billion. (John Schoen, CNBC, Sept. 5, 2017.)
Throughout our history, anti-immigrant sentiment, articulated and marshaled by populist politicians exploiting fear and ignorance, has run counter to rational economic policy. As always, simplicity sells, while complexity breeds vulnerability. We should compel Congress to apply basic economic principles when developing policies that affect us all.
Monday, September 18, 2017
COMFORTABLE OR NOT, IT'S CALLED "FREEDOM OF SPEECH"
Charlottesville, Va., home to the University of Virginia and Thomas Jefferson's mountaintop plantation Monticello, underwent a significant image change as a result of the violent demonstrations it endured during the weekend of Aug. 11-13. One important outcome of the overt hatred on display that weekend was that many of us began to revisit the concept of "freedom of speech." Are white nationalists, the KKK, neo-Nazis and others protected by the First Amendment to our Constitution when they openly exhibit their hatred of people solely because of their race, religion, ethnic origin, sexual orientation, disability or gender? The issue surfaced well before Charlottesville, when it arose amid the disturbances that followed UC Berkeley's abrupt cancellation of a planned February speech by conservative provocateur Milo Yiannopoulos. As a consequence of the negative press the university received, and the cumulative pressure generated nationally by recent events, Cal Chancellor Carol Christ decided to proclaim this school year a "free speech" year.
The First Amendment to our Constitution, officially adopted Dec. 15, 1791, reads: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."
Predictably, legal challenges delineating exactly what these rights do and do not include proliferated. Courts have concluded that the amendment protects the right not to speak, specifically the right not to salute the flag (West Virginia Board of Education v. Barnette, 1943); the use of certain offensive words and phrases to convey political messages (Cohen v. California, 1971); and engaging in symbolic speech, like burning the flag in protest (Texas v. Johnson, 1989). Excluded is the right to incite actions that would harm others, like shouting "fire" in a crowded theater (Schenck v. United States, 1919).
In "Matai v Tami" (2017), Justice Samuel Alito, writing in support of a unanimous Supreme Court decision affirming the judgment of the Court of Appeals, wrote: "Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability or any other similar ground is hateful, but the proudest boast of our free speech jurisprudence is that we protect the freedom to express the thought that we hate."
Citing legal precedent, the ACLU, generally left-leaning, sued the city of Charlottesville to allow the "Unite the Right" rally to take place downtown. After numerous participants arrived carrying loaded firearms, the organization appeared to backtrack, publicly declaring that "firearms and free speech don't mix." However, the legal relationship between the First and Second Amendments is complicated, and the issue was not pursued.
Had the demonstration taken place in countries like Germany, France, Denmark or The Netherlands, chances are that participants would have been fined or jailed. Many countries have laws forbidding hate speech. We don't, even though there have been times in our history when such legislation was actively pursued. An early assault on free speech came from the Alien and Sedition Acts of 1798, which permitted prosecution of individuals who voiced or printed what were deemed malicious remarks about the president or the government. The acts were passed by a Federalist Congress, signed by President John Adams, and designed to limit the power of the opposition Democratic-Republican Party. Enforcement ended after Thomas Jefferson was elected president in 1800.
Another attempt was made in 1918, when Congress passed a different Sedition Act, essentially a set of amendments to the Espionage Act of 1917, prohibiting many forms of speech, including "any disloyal, profane, scurrilous, or abusive language about the form of our government or our flag." It was intended to prevent insubordination in the military and support for U.S. enemies during wartime. Some 1,500 prosecutions were carried out, resulting in more than 1,000 convictions. (Paul Avrich, "Sacco and Vanzetti: The Anarchist Background," Princeton University Press, 1991.) The amendments were repealed in 1921; the Espionage Act was left intact.
While we correctly assert that the opinions expressed during the "Unite the Right" rally in Charlottesville ran counter to the values we as a country embrace, it would be un-American to deny sympathizers the right to hold them. Justice Alito's opinion in Matal v. Tam should resonate with all of us. Marches and hate-spewing diatribes from white nationalists and Nazis, and, for that matter, Colin Kaepernick protesting racial inequality by kneeling during our national anthem, may make many of us outright uncomfortable. Denying them the right to do so would make all of us less American.
The First Amendment to our Constitution, officially adopted Dec. 15, 1791, reads: "Congress shall make no law respecting the free exercise of religion, or prohibiting the free exercise thereof, or abridging the freedom of speech, or of the press, or the right of the people peaceably to assemble, and to petition the government for redress of grievances."
Predictably, legal challenges delineating specifically what these rights did and did not include proliferated. Multiple courts concluded that the amendment included concepts like: The right not to speak - specifically the right not to salute the flag (West Virginia Board of Education v Barnette - 1943); using certain offensive words and phrases to convey political messages (Cohen v California - 1971); and engaging in symbolic speech, like burning the flag in protest (Texas v Johnson -1989). Excluded was the right to incite actions that would harm others - like shouting "fire" in a crowded theater (Schenck v United States - 1919).
In "Matai v Tami" (2017), Justice Samuel Alito, writing in support of a unanimous Supreme Court decision affirming the judgment of the Court of Appeals, wrote: "Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability or any other similar ground is hateful, but the proudest boast of our free speech jurisprudence is that we protect the freedom to express the thought that we hate."
Citing legal precedent, the ACLU, generally left-leaning, sued the city of Charlottesville to allow the "Unite the Right" rally to happen downtown. After numerous participants arrived carrying loaded firearms the organization appeared to retrench retroactively, publicly expressing that "firearms and free speech don't mix." However, legally, the relationship between the First and Second Amendment is complicated, and the issue was not pursued.
Had the demonstration taken place in countries like Germany, France, Denmark, the Netherlands and others, chances are that participants would have been fined or jailed. Many countries have laws forbidding hate speech. We don't, even though there have been times in our history when such legislation was actively pursued. An early assault on free speech came from the Alien and Sedition Acts of 1798, which permitted prosecution of individuals who voiced or printed what were deemed to be malicious remarks about the president or the government. The acts were passed by a Federalist Congress, signed by President John Adams, and designed to limit the power of the opposition Republican Party. Enforcement ended after Thomas Jefferson was elected president in 1800.
Another attempt was made in 1918, when Congress passed a different Sedition Act, essentially a set of amendments to the Espionage Act of 1917, prohibiting many forms of speech, including "any disloyal, profane, scurrilous, or abusive language about the form of our government or our flag." It was intended to prevent insubordination in the military and to prevent support of U.S. enemies during wartime. Some 1,500 prosecutions were carried out, resulting in more than 1,000 convictions (Paul Avrich, Sacco and Vanzetti: The Anarchist Background, Princeton University Press, 1991). The amendments were repealed in 1921. The Espionage Act was left intact.
While we correctly assert that the opinions expressed during the "Unite the Right" rally in Charlottesville ran counter to the values we as a country embrace, it would be un-American to deny sympathizers the right to hold them. Justice Alito's opinion in "Matal v. Tam" should resonate with all of us. Marches and hate-spewing diatribes from white nationalists and Nazis, and, for that matter, Colin Kaepernick protesting racial inequality by kneeling during our national anthem, may make many of us feel outright uncomfortable. Denying them the right to do so would make all of us less American.
Friday, September 8, 2017
THE FACES OF WHITE SUPREMACY
The contentious and ultimately lethal demonstration in Charlottesville, Va., over the weekend of Aug. 11 highlighted a conglomeration of groups which no longer seem to feel the need to operate under a cloak of obscurity. Marchers carrying Tiki torches, swastikas, Confederate flags, and banners reading "Jews will not replace us" and "blood and soil," while yelling Nazi slogans, left little to the imagination.
In the aftermath of President Trump's botched and highly controversial pronouncements about the violence surrounding this event, every observer and columnist analyzed the repercussions of what happened from all angles and in great detail, leaving little to dissect. Some questions are still left unanswered, however. Who are these people? How many are there? What inspires them? What is their support structure? Do they have First Amendment rights to spout the venom that appears to unite them?
The Alabama-based Southern Poverty Law Center, an activist group focused specifically on the development and existence of "hate groups," defines such collectives as organizations with "beliefs or practices that attack or malign an entire class of people, typically for their immutable characteristics." Since the turn of the century the number of hate groups has seen explosive growth, driven in part by anger over Latino immigration and demographic projections showing that whites will no longer hold majority status in the country by 2040. The increase in numbers accelerated in 2009 when President Obama took office, declined somewhat after that, and picked up speed again during the last two years because of a presidential campaign that flirted heavily with extremist ideas. (Southern Poverty Law Center, Hate Map, Aug. 17, 2017.)
The groups most prominently identified during the discussion following the Charlottesville events include neo-Nazis, white supremacists or white nationalists, the Ku Klux Klan, and the Alt-Right. Although organizationally distinct, the ideology and leadership of these groups often overlap. The neo-Nazis grew out of the National Socialist Movement, founded in 1974 as the "National Socialist American Workers Freedom Movement." This group seeks to revive the far-right tenets of Nazism. It borrows elements of Nazi doctrine, including ultra-nationalism, racism, ableism (discrimination in favor of able-bodied people), xenophobia and anti-Semitism. Neo-Nazi literature frequently highlights the "14 words," referencing the white supremacist slogan: "We must secure the existence of our people and a future for white children," or alternatively: "Because the beauty of the white Aryan woman must not perish from this earth" ("Hate on Display: 14 Words," Anti-Defamation League, June 1, 2007). The current national leader of the movement is Jeff Schoep, a "true believer" since age 10, who took over in 1994 and propelled the NSM into becoming the most active neo-Nazi organization in the country.
"White supremacy" refers to the conviction that white people are in many ways superior to people of other races, and, because of that, white people should dominate other races. This belief is rooted in scientific racism, which claims to establish a connection between race and intelligence, and distinguishes between superior and inferior races. "White nationalism," by extension, is the ideology that advocates a racial definition of national identity, suggesting that national citizenship should be reserved for white people only. Leading promoters of white nationalism are Matthew Heimbach and Richard Spencer. Heimbach, in an article entitled: "I Hate Freedom," wrote: This is our home and our kith and kin. Borders matter, identity matters, blood matters, libertarians and their capitalism can move to Somalia if they want to live without rules" (Traditionalist Youth Network, July7, 2013).
The better-known Ku Klux Klan was founded in 1866 as a vehicle to oppose Reconstruction policies aimed at establishing political and economic equality for African-Americans. David Duke, a white-nationalist politician, anti-Semite, conspiracy theorist, Holocaust denier, convicted felon, and former Imperial Wizard, remains influential. He made a point of thanking the president for having the courage "to tell the truth" following the Charlottesville events.
The "Alt-Right," a loosely defined group of people with far-right ideologies, makes a point of influencing these demonstrations wherever they take place. The group was initially identified as "Alternative-Right" by Paul Gottfried, and American paleo-conservative philosopher. Richard Spencer changed the name in 2010 to disguise overt racism, white supremacy and neo-Nazism. Lindy West, a New York Times opinion writer referred to this designation as an "unacceptable euphemism legitimizing an ideology that would be unacceptable if it were simply called white nationalism."
All of these far-right groups find editorial support and encouragement on websites like the "Daily Stormer," a neo-Nazi news and commentary site, and Breitbart News, which expresses similar views and takes the lead in attacking all opinions not in line with its own. Steve Bannon, Donald Trump's Chief Strategist until just a few weeks ago, has retaken his previous position as Breitbart's Executive Chairman. These ideologies are substantially supported by individuals who occupy influential positions, among them Stephen Miller, Mr. Trump's Senior Advisor for Policy, and Sebastian Gorka, the president's deputy assistant. Both have well-established connections to the white supremacist movement and neo-Nazi extremism. And they have the president's ear.
Enough said. To quote Heather Heyer, the 32-year-old paralegal and counter-demonstrator killed on Aug. 12 by Nazi sympathizer James Fields: "If you're not outraged, you're not paying attention."
Saturday, August 12, 2017
OUR HEALTH CARE DISCUSSION IS MISSING THE POINT
Our health care discussion, for all its political content and social consequences, has essentially been about managing the expense of providing insurance, rather than about investing in the appropriate elements of a health care safety net and affecting the outcomes of a comprehensive strategy. John Maurice Clark, a prominent social economist of the mid-20th century, argued that our national health should be considered a paramount resource. Well before these issues surfaced during our current debate, he wrote: "It is clear that health is a national asset, and it is worth conserving, more or less regardless of whether it can be done on purely commercial principles." ("Economic Means - to What Ends?," American Economic Review, 1950.) Many of us know that, as a country, we spend much more on health care than any other country in the world. Few of us realize that the health of Americans, by many different measures, is actually worse than the health of citizens in other wealthy countries.
Our health care system is expensive. In 1960 we spent 5.2% of GDP on health care. By 2009 this number had shot up to 17.6% (Kaiser Foundation, "National Health Expenditures, 1960-2009," published April 6, 2012). While the increase in expenditure did help us improve our overall life expectancy, as well as our "healthy" life expectancy, our improvement has been considerably slower than what most other wealthy countries experienced. The Journal of the American Medical Association, in "The State of U.S. Health, 1990-2010," documented trends in mortality and morbidity (the rate of disease in a population) across 34 OECD countries. It concluded that the U.S. life expectancy ranking dropped from #20 to #27, while our healthy life expectancy ranking dropped from #14 to #26. Another study, conducted by the National Research Council and Institute of Medicine, produced a report by a panel of experts examining health indicators in 17 high-income countries. It discovered that American men ranked dead last, while women did only slightly better, ranking second lowest.
The World Health Organization produced a chart showing life expectancy by country across the globe. The U.S. ranked #31. A list produced by our own C.I.A. ranked us even lower, at #43. This is well below countries like Japan, Switzerland, Singapore, Australia or Spain. The contrast between our significant health care expenditure and our disappointingly poor health outcomes, especially when compared to much of the developed world, seems paradoxical and begs for an explanation. Fortunately, studies explaining why we rank as we do are readily available. Unfortunately, our representatives appear less interested in digging below the surface of complex challenges when they are not easily translated into simplistic sound bites.
Kristen Beckman, a senior editor at LifeHealthPro.com, which serves the insurance industry, compiled a list of nine factors affecting longevity. A number of these suggest answers that could help us solve our paradox. "Gender," "genetics" and "marital status" are considered important factors; however, they won't help explain the discrepancy. "Pre-natal and childhood conditions," by contrast, is significant. The U.S. has the highest rate of infant mortality among high-income countries. Children born in the U.S. have a lower chance of surviving to the age of five than children born in any other wealthy nation. "Socio-economic status" contributes to life expectancy as well. Poverty has an adverse effect on access to medical care and participation in healthier lifestyle habits. In the U.S. 17% of the population lives in poverty; the median for OECD countries is almost half that, at 9% (Steven Schroeder, past president of the Robert Wood Johnson Foundation). "Education" also contributes. According to the Centers for Disease Control and Prevention, a 25-year-old man without a high school diploma has a life expectancy 9.3 years shorter than a man with a bachelor's or higher degree. (A study published in the Washington Post in 2008 concluded that life expectancy in the U.S. was on the increase, but only among people with more than 12 years of education.) Ms. Beckman completes her list with "ethnicity," "lifestyle," and "medical technology." Not all of these explain why our outcomes are worse than those in other countries. Other researchers observed that we invest less than other wealthy countries in social programs like parental leave and early-childhood education. While we rank 1st among OECD countries in health care expenditures, we are 25th in spending on social services. And, perhaps consequentially, among 17 wealthy democracies included in another report, the U.S. has the highest rates of adolescent pregnancy and sexually transmitted disease.
The research suggests that pouring money into health care is not the only answer. Experts estimate that, during the last century, modern medical care delivered to individual patients, through physician and hospital treatment covered by health insurance, has been responsible for only 10-25% of the improvements in life expectancy. The remainder came from changes in the social determinants of health, especially in early childhood. While other countries use government to improve health, including, but not limited to, the development of universal health insurance, the U.S. has consistently invested less than other wealthy countries in social programs. Medical care can prolong survival and improve prognosis after some serious diseases. However, the social and economic conditions that make people ill and in need of medical care in the first place appear more important. Perhaps we should re-focus our health care discussion on the overall objective and rethink where our investments will do the most good. The healthy outcomes achieved in many other countries can serve as examples of how to do this.
Tuesday, August 1, 2017
CONSEQUENCES OF REPEALING OBAMACARE
Ever since last year's election results became known, the congressional majority has labored to repeal and replace the Affordable Care Act, a.k.a. Obamacare, President Obama's signature legislation, which was signed into law in March 2010. Aside from attempting to fulfill a political promise seven years in the making, the most significant reasons behind the desire to repeal the law have been "government overreach," budget impact, and the additional taxes levied on high income earners. Moreover, legislators want to use the potential savings realized from repeal to fund another policy objective, tax reform.
Obamacare's major accomplishment was that it pushed the nation's uninsured rate to a record low of 8.6 percent, or about 27.3 million people. Those numbers were 16 percent and 48.6 million in 2010. The law helped provide insurance for an additional 21.3 million people. This result was achieved by mandating that nearly all Americans have some form of health insurance or pay a tax penalty, barring insurance companies from declining coverage to people with pre-existing conditions, and authorizing an expansion of Medicaid to nearly all poor adults. (Thirty-one states expanded Medicaid programs, with the federal government picking up most of the costs of providing coverage to the newly eligible.)
The most recent analysis by the Congressional Budget Office (CBO), which focused on the current proposal considered by the Senate to "repeal without replacement," estimates that, if enacted, this legislation would increase the number of uninsured by 17 million in 2018, and by 32 million by 2026. The CBO also projected that insurance premiums in the individual market would double, and suggested that by 2026 most of the country would no longer have an insurer selling individual plans. Its projections assumed that a total repeal would include the end of mandatory insurance requirements, an end to subsidies that help low- and middle-income people purchase individual plans, and, by 2020, the end of federal funding for the expansion of Medicaid to poor adults. It would decrease federal deficits; most of the savings would be generated by repealing the Medicaid expansion. ("H.R. 1628, Obamacare Repeal Reconciliation Act of 2017," CBO, July 19, 2017.)
While many of the proposed policy nuances may appear convoluted and unnecessarily complex, the elements affecting our personal lives are simple: 1. How will policy changes affect our premiums? 2. How many more of us will end up uninsured? 3. What effect will this condition have on our lives, individually, cumulatively, and on the viability of our support network?
Simply put, the uninsured have very limited access to health care. Their care will generally come too late. They will be sicker and die sooner. One in five uninsured adults will go entirely without medical care due to cost. They won't receive preventive care and services for major health conditions and chronic diseases. ("Key Facts about the Uninsured Population," Kaiser Foundation.)
A 2003 study published by the National Academies Press, entitled "Social and Economic Costs of Uninsurance in Context," provides even greater detail. Its findings show that 18,000 people will die prematurely each year, 8 million uninsured with chronic illnesses will receive fewer services and have increased morbidity, 41 million adults and children will be less likely to receive preventive and screening services, and people living in communities with higher-than-average uninsured rates will be at risk for reduced availability of health services and overtaxed resources.
Other analyses of Obamacare repeal suggest that repealing tax credits and the Medicaid expansion could lead to a loss of 2.6 million jobs in 2019, and that cuts in federal spending for health reform would likely cause serious economic distress for states: "If replacement policies are not in place, there will be a cumulative $1.5 trillion loss in gross state products, and $2.6 trillion reduction in business output from 2019 through 2023." ("Repealing Federal Health Reform: Economic and Employment Consequences for States," a study financed by the Commonwealth Fund.)
We should also consider the plight of hospitals. The Emergency Medical Treatment and Labor Act, passed in 1986, requires that hospitals treat all individuals in need of emergency care regardless of their insurance status. When people are uninsured, hospitals effectively serve as insurers of last resort, providing care to patients who cannot afford to pay their medical bills. One estimate is that each additional uninsured person costs local hospitals $900 per year; in 2013 alone the cost of uncompensated care was $84 billion. ("Who Bears the Cost of the Uninsured? Nonprofit Hospitals," Kellogg Insight, Northwestern University, June 22, 2015.)
This all boils down to a question of priorities. How important is it to us, as a society, to protect those who can least afford it? Do we really think that politically flaunting "access" to health care equates with actually making care available and affordable for all? Shouldn't we consider the consequences simplistic political calculations impose on our personal lives and on the deteriorating health care support structures surrounding us? Do we really want our representatives to ignore these consequences because they promised they would repeal a law that has been on the books for seven-plus years, or because they need to realize a savings so they can squander it on tax reductions for some who don't need it?
The answer appears obvious. This can't just be about economics and politics; it is a moral issue as well.
Monday, July 17, 2017
"OUR COUNTRY, RIGHT OR WRONG!" - IS PATRIOTISM STILL ALIVE?
This quote, attributed to a toast offered by Stephen Decatur, a U.S. naval officer during the early 19th century, has been repeated over and over by those who want to either demonstrate their sense of patriotism or censure an extreme form of it. Years after these words were absorbed into our culture, English writer G.K. Chesterton retorted that "no patriot would ever think of using these words, except when desperate." He continued by comparing the statement to saying: "My mother, drunk or sober."
Given the extremely contentious political climate our country is currently experiencing, believers on both sides of the fence routinely accuse the opposing contingent of being unpatriotic. With aggressive attitudes dominating the discussion, we might easily be convinced that patriotism, as a cultural characteristic, has become an endangered species. However, Christian Rovsek, after traveling 12,500 miles through 44 states and interviewing thousands of people, concluded that patriotism is still alive and flourishing. ("Is Patriotism Dead in America?," The Huffington Post, December 21, 2011.)
With the exception of some observers who straddle the fringes of the political spectrum, many who write about this subject seem to agree. Their conclusions ought to be comforting for most of us. However, not everyone agrees on how to define patriotism, and, depending on our perspective, many of us may disagree on how to express our love of country. James Grossman, Executive Director of the American Historical Association, framed the question appropriately: "What constitutes patriotism in a nation founded on dissent and notable for its deep and vibrant traditions of activism and debate from every corner of the country and the political spectrum?" ("On Patriotism," May 2015.)
Throughout our history we have encountered many periods dominated by significant dispute, sometimes expressed violently, during which all sides question the patriotism of those they disagree with. Our Civil War stands out as a prime example. And those who lived through the Vietnam War era may well remember the mantra: "America, love it or leave it." When our country is at war, expressions of patriotism tend to veer towards the extreme, approaching a form of "nationalism." George Orwell distinguished between the two, defining patriotism as "devotion to a particular place and a particular way of life, which one believes to be the best in the world, but has no wish to force upon other people." He considered nationalism to be inseparable from the desire for power and the drive to secure greater prestige for the nation.
Having said this, it is important to acknowledge that patriotism is not just some concept we can discard at will. We ought to recognize that if it were no longer alive, the continued homogeneity or national cohesion of our country would be in jeopardy. During wartime, patriotism instills esprit de corps among our soldiers; it clarifies their objectives and drives them to give extra effort in battle. Domestically, patriotism in all its forms is used to arouse popular support for the war effort in other ways.
Over time, countries created symbols intended to unite people through visual, verbal or iconic representations of the national psyche, values, goals or history. Examples include flags, anthems, monuments, myths and national colors. Historical events represented by our Independence Day celebration or France's Bastille Day are symbolic representations of important occurrences in history, designed to regenerate a feeling of national pride or remembrance. These symbols help us maintain our identity as a people. One of the major challenges supranational organizations like the European Union face is the relative lack of such unifying symbols. The current reincarnation of nativist attitudes in some member states appears to be a reflection of heightened patriotism at home, feeding an identity crisis many citizens have begun to experience as a result of increased immigration and a weakening of national unity. The resulting increases in euroscepticism and anti-globalization sentiment are largely a product of this growing exhibition of domestic patriotism.
While patriotism tends to be a positive force for the viability of a state, be it in a supportive or critical capacity, we need to recognize that there will always be some who subscribe to a flexible form of patriotism, while others may adopt the concept as an article of faith, in a manner reminiscent of fundamentalist religion. Uncritical love of country can become pathological. People whose belief system is that extreme will invoke patriotism in defense of principles that they can't logically defend in any other way. The propaganda machines that operated in Nazi Germany, and those under state control in present-day Russia, feed this extreme form of what would otherwise be a good thing.
When patriotism bleeds into nationalism, chauvinism or jingoism, we need to be vigilant and resist these impulses before they take hold and become dangerous. "Our country, right or wrong" is an attitude politicians will attempt to take advantage of, to the detriment of us all.
Saturday, June 24, 2017
CLIMATE CHANGE DENIALS ARE EMBLEMATIC OF LARGER PROBLEMS
On June 1 President Donald Trump announced to the world that he would pull the U.S. out of the Paris climate accord. During an hour-long, 2,000-word speech in which the president never mentioned "climate change," he made the unsubstantiated assertion that the agreement would cost the U.S. as many as 2.7 million jobs by the year 2025. This move placed our country in opposition to 194 treaty participants and alongside Syria and Nicaragua, the only two countries that did not sign on.
International and national condemnation was fierce and predictable, dividing Mr. Trump's inner circle as well. While the expressed rationale for terminating our participation in the accord appeared designed to appease his political base, the thought process behind the decision seems grounded in Mr. Trump's distrust of the science behind climate change - something he referred to as a "hoax" during the campaign.
Many of Mr. Trump's core supporters reject the expertise of the vast majority of scientists who believe that global warming is linked to human activity. Politicians like Texas Senator Ted Cruz even deny that any warming has been recorded during the past 15 years. Former Speaker of the House, Newt Gingrich, does not believe that human activity causes climate change. Michele Bachmann believes that nature itself is to blame. And Oklahoma Senator James Inhofe is famous for stating categorically: "My point is, God is still up there. The arrogance of people to think that we, human beings, would be able to change what he is doing in the climate is to me outrageous."
While this may seem like an isolated issue, the attitude of many people, politicians most prominently, is emblematic of what appears to be an expression of a pervasive anti-intellectual attitude lodged in our culture. The literature supporting this line of argument is quite substantial. In 1963 Columbia University historian Richard Hofstadter published a study entitled "Anti-Intellectualism in American Life," for which he received the Pulitzer Prize for non-fiction the following year. For a case study Hofstadter analyzed the 1952 presidential election battle between Dwight Eisenhower and Adlai Stevenson. He argued that the contest ultimately came down to a campaign contrasting relative ignorance and superior intellect. Intellect lost. Hofstadter ultimately concluded that, perhaps as a consequence of the "democratization of knowledge," the acquisition and spread of knowledge among the "common people," anti-intellectualism had become embedded in our national fabric.
Our intellectual history has long been grounded in what political scientists refer to as the "Protestant ethic," which dictates that a person's duty is to achieve success through hard work and thrift; success is a sign that we are "saved." Combined with "utilitarianism," the ethical theory that the best action is the one that maximizes utility, these ideas became the underpinning of capitalism, a dominant building block of U.S. success. (See Max Weber, "The Protestant Ethic and the Spirit of Capitalism.") Over time, intellectual pursuit for its own sake began to be looked at as an impediment to economic development. Enter politicians of various stripes, and analysts observing what they see and worrying about what we are in for.
For decades politicians have realized that scientific expertise does not sell well. Simplistic bombast, usually confused with ego-infused "common sense," does. Elected officials frequently attacked intellectuals, identifying them in terms such as an "effete corps of impudent snobs who characterize themselves as intellectuals" (Spiro Agnew), or publishing statements like: "I would sooner live in a society governed by the first 2,000 names in the Boston telephone directory than in a society governed by the 2,000 faculty members of Harvard University" (William F. Buckley). When Donald Trump mentions that he wants to "drain the swamp," he really means getting rid of arrogant technocrats, bookish intellectuals and politically correct elites.
The transition from intellectual pursuit to a dominantly utilitarian focus also infiltrated our educational institutions. U.C. Irvine professor Catherine Liu recently remarked that "we don't educate people anymore; we train them to get jobs." Our students used to rank at the top of the world in math and science. In the most recent PISA tests of 15-year-olds across 35 participating OECD countries, as reported by the Pew Research Center, we placed only 30th in math and 19th in science. Hardly surprising, since our educational emphasis has shifted, and since many of our supposed role models pride themselves on their ignorance.
Decades ago Isaac Asimov warned us of "a cult of ignorance in the United States, nurtured by the false notion that democracy means that my ignorance is just as good as your knowledge." In April of this year EPA administrator Scott Pruitt eliminated all climate change references from the agency's website, and instructed his staff to eliminate the term from their lexicon. The administration apparently does not want to talk about this.
The question is: "Do we still accept that E = mc², or do we want to vote on this?" Ignorance may be comforting to some of our leaders, but it is a curse for the future of our country.
Thursday, June 1, 2017
ARE WE GETTING READY TO PROVIDE A UNIVERSAL BASIC INCOME FOR EVERYONE?
Facebook CEO Mark Zuckerberg's commencement address to Harvard's class of 2017 on May 25 included a suggestion that appears to be gaining increasing support among contemporary business leaders. In his address Mr. Zuckerberg proposed that "we should explore ideas like universal basic income to make sure that everyone has a cushion to try new ideas." For us on the West Coast, this came on the heels of a similar recommendation made by Tesla CEO Elon Musk and Y Combinator president Sam Altman, quoted in an article written by Marisa Kendall for the Bay Area News Group and published on May 21. Initial responses to these suggestions indicate that the concept is not well understood by the public at large, and, for many, appears to be something entirely new, coming out of left field.
A substantially similar proposal was first published in 1797, when Thomas Paine, in a pamphlet titled "Agrarian Justice," advocated for what he called "asset-based egalitarianism," a social insurance system for young and old financed by a 10% tax on inherited property. During the 1960s and 1970s other proposals emerged. Economist Milton Friedman, in his book "Capitalism and Freedom" (1962), proposed a "negative income tax," while Austrian Nobel Laureate economist Friedrich Hayek, in "Law, Legislation and Liberty" (1973), made the case for a universal basic income as well. President Nixon once even contemplated a policy that would have provided "unconditional income for all poor families." (Rutger Bregman, "Nixon's Basic Income Plan," Jacobin, May 5, 2016.)
A Universal Basic Income (UBI) refers to a form of social security in which all citizens or residents of a country regularly receive an unconditional sum of money in addition to any income received from elsewhere. With technology and automation changing the labor market at an increasing pace, and with new technologies replacing workers, the question for the future, in many minds, becomes how best to provide economic security for all. Former Secretary of Labor Robert Reich produced a column entitled "The iEverything and the Redistributional Imperative" (March 16, 2015). In it Reich postulates a little gadget called the "iEverything," which will give us anything we need, and which will be here before we know it. He suggests, however, that once it arrives we won't be able to buy it, because there won't be any paying jobs left. Researchers estimate that almost half of all U.S. jobs are at risk of being automated in the next two decades. Reich and others conclude that, because of the speed of technological change, a universal basic income will eventually be inevitable.
It is tempting to compare today's technological revolution with the industrial revolution that started in the mid-18th century. Both significantly transformed society. Even at a superficial level, however, we should recognize two important differences. The industrial revolution matured over a period of a hundred years or so. The pace of change was relatively slow, giving workers time to adjust. Although many workers were shifted from being highly skilled and valued specialists into a fairly cheap, easily replaceable unskilled labor force, they did not typically lose their ability to make a living. The revolution we are experiencing today is much more rapid, leaving many workers without enough time or resources to develop new marketable skills. Elon Musk and others foresee an impending robot revolution expected to leave a trail of unemployment in its wake. "Futurism," a newsletter designed to "cover breakthrough technologies and scientific discoveries that will shape humanity's future," reports that robot-to-worker ratios are rapidly increasing, currently running from 1.64 per 100 workers in the U.S. to 4.78 per 100 workers in South Korea. It projects that occupations like insurance underwriters, farm laborers, construction workers, fast food cooks, truck drivers and mail carriers are among the many at risk.
Proponents of the UBI approach argue that it will free welfare recipients from the paternalistic oversight of conditional welfare-state policies. They suggest that traditional welfare schemes create a disincentive to work, because they cause people to lose benefits at roughly the same rate that their income rises. They project that UBI will be affordable because it serves as a substitute for a wide range of social welfare programs. And because income distributions are skewed, with most people earning less than the average, those with above-average incomes would, in effect, financially underwrite a basic income for all through their income taxes. Proponents look at UBI as a promise of equal opportunity, a new starting line set above the poverty line. Opponents basically disagree on all counts. Even though they might support the "trickle down" concept, they don't see it working here.
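The underlying arithmetic is simple enough to sketch. Below is a minimal illustration, in Python, of how a flat-tax-financed UBI produces a break-even income: everyone earning less is a net recipient, everyone earning more a net contributor. The dollar amounts and the tax rate are hypothetical, chosen only to make the mechanics visible, and are not drawn from any actual proposal.

    # Hypothetical figures, for illustration only.
    ubi = 12_000        # assumed annual basic income per person
    tax_rate = 0.25     # assumed flat income tax that finances it

    def net_transfer(income):
        """Net gain (+) or net contribution (-) at a given income."""
        return ubi - tax_rate * income

    break_even = ubi / tax_rate      # income at which the transfer nets to zero
    print(break_even)                # 48000.0
    print(net_transfer(30_000))      # 4500.0  -> net recipient
    print(net_transfer(80_000))      # -8000.0 -> net contributor

On these assumed numbers, anyone earning less than $48,000 comes out ahead, which is why proponents argue that the scheme redistributes from the top of a skewed income distribution without means-testing anyone.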
Several countries have experimented with some form of UBI. Alaska implemented its own brand in 1982. Its system is called the Permanent Fund Dividend (PFD), which is derived from earnings on the investments of the Alaska Permanent Fund (APF), a portfolio of diversified assets. Because of market fluctuations, the amount given to Alaskan residents varies from year to year. Canada, Finland, the Netherlands, and numerous developing African countries have also started to experiment with this approach. In June of 2016 Swiss citizens participated in a referendum asking whether a form of UBI should be incorporated into their constitution. The proposal was to provide a monthly income of 2,500 Swiss francs to each citizen. It failed, with 76.9% of voters voting against.
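Alaska's approach can be captured in a toy model: average several years of fund earnings to smooth out market swings, pay out a fraction of that average, and divide it among eligible applicants. The sketch below, in Python, is an illustration of that idea only; the earnings figures, payout fraction, and applicant count are all invented, and this is not the statutory formula.

    # Toy model of a fund-financed dividend, not Alaska's statutory formula.
    fund_income_by_year = [2.1e9, 1.4e9, 3.0e9, 2.5e9, 1.8e9]  # hypothetical
    payout_fraction = 0.10         # assumed share of average income paid out
    eligible_applicants = 640_000  # roughly Alaska's order of magnitude

    # Averaging several years dampens the effect of any single bad market year.
    avg_income = sum(fund_income_by_year) / len(fund_income_by_year)
    dividend = payout_fraction * avg_income / eligible_applicants
    print(round(dividend, 2))  # 337.5

The averaging step is the design point: it is why the actual dividend fluctuates with markets but does not swing as wildly as the fund's year-to-year returns.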
Even though many are still uncomfortable with the idea of giving people money simply for being citizens, the heightened interest in these kinds of proposals suggests that intelligent people from all walks of life anticipate a need to address the negative consequences of a technological revolution that continues to pick up speed. Current conditional welfare-state policies may well be obsolete and in need of rethinking.
Friday, May 19, 2017
HAVE CHILDREN BECOME TARGETS IN TODAY'S WARS?
I APOLOGIZE UP FRONT THAT THIS ENTRY SEEMS CHRONOLOGICALLY OUT OF ORDER.
The optics triggered by Bashar al-Assad's chemical strike against civilians in Khan Sheikhoun, which killed close to 90 people, including women, children and babies, prompted a response that is still reverberating in several national capitals. President Trump's decision to send 59 missiles into the airfield from which the Syrian government's planes took off for the attack has been applauded by many, regretted by some, and condemned by a number of governments involved in the Syrian civil war, now in its sixth year.
Mr. Trump proclaimed that the sarin gas attack was an "affront to humanity," one that affected him profoundly, transformed his thinking about the Syrian president, and led him to order the missile attack. Skeptics believe that Barack Obama's failure to enforce a "red line" over Assad's use of chemical weapons provided a powerful impetus to show that there was "a new sheriff in town." But even if, for the sake of argument, we take the comments surrounding this military event at face value, some questions remain, especially as they relate to the millions of children who have become victims of warfare: Where was the outrage when barrel bombs rained down on hospitals, market places and other civilian targets? And, while in the 18th, 19th and early 20th centuries roughly half of all deaths in conflict zones were civilian, by the end of the 20th century almost 90 percent were civilian, many of them children. What changed to make civilians, and especially children, so much more vulnerable today?
Warfare has changed, and these changes include the identity of combatants and the relative vulnerability of civilian populations. Most early wars were wars of aggression, generally for conquest or subjugation. The conquests of Alexander the Great, Gaius Julius Caesar and others were well documented, and mostly military "events." The religious conflicts, from the crusades of the 11th through the 13th centuries to the wars of religion of the 16th and 17th centuries, the Napoleonic wars, and even World War I, were also essentially dominated by competing military forces, only incidentally affecting civilian populations. World War II produced a shift when the Nazis combined conquest with ethnic cleansing, which, by its nature, involved significant numbers of vulnerable children.
As wars evolved from predominantly interstate conflicts, fought between two or more states, to intra-state armed conflicts - civil conflicts between a government and a non-state group, fought largely within the territory of the state in question - the incidence of civilian casualties increased dramatically. These conflicts are as likely to be fought in villages and on suburban streets as anywhere else. The enemy camp is all around, and distinctions between combatants and noncombatants melt away in the suspicions and confusions of daily strife. Casualties are often not random. Where civil wars resulted from an ethnic conflict between two or more groups fighting for their ethnic group's position in society, children have often been targeted for "preventative" reasons.
Although the Nazi experience cannot be characterized as an ethnic conflict per se, Hitler's enforcers unquestionably practiced ethnic cleansing. They openly, for political purposes, scapegoated various ethnic groups, and advocated killing children of "unwanted" or "dangerous" groups, either as part of the "racial struggle" or as a measure of preventative security. During a five-year period they killed as many as 1.5 million children, including over 1 million Jewish children and tens of thousands of Roma children and children with physical or mental disabilities. (Holocaust Encyclopedia - U.S. Holocaust Memorial Museum.)
In previous centuries children sometimes ended up in the crossfire. In contemporary conflicts they have often become targets. Today one billion children are living in countries and territories affected by war or conflict. It is fair to conclude that large numbers suffer violent injuries and death. In Afghanistan, since 1979, at least 35,000 children have been victims of land mines alone (U.N. reports). As of 2015, the estimated civilian death toll in Afghanistan was 26,000. Iraq has counted 120,000 civilian deaths since 2003. Well over 50 percent of these casualties are children. After six years of civil war in Syria, the death count stands at 470,000 - 55,000 of them children. ("I Am Syria," February 2017.) As early as 1996, UNICEF reported that during the preceding decade 2 million children were killed; 4.5 million disabled; 12 million left homeless; more than 1 million orphaned or separated from their parents; and some 10 million psychologically traumatized. ("The State of the World's Children," UNICEF, 1996.) This was well before the Syrian civil war and myriad other conflicts broke out.
Whereas the optics of vulnerable children dying while foaming at the mouth might signify that a "red line" was crossed, justifying an international response, these "red lines" are set far too high. Barrel bombs dropped on civilian targets have produced significantly more carnage. Still, government forces are getting away with these criminal acts by hiding behind the concept of "national sovereignty" and blaming "terrorists."
This should be enough to force the international community to pull its collective head out of the sand, increase the visibility of these atrocities, and begin to hold governments and institutions perpetrating these acts accountable. National sovereignty be damned.
COMEY DISMISSAL FUELS TEMPEST IN WASHINGTON D.C.
Donald Trump's decision to fire James Comey fueled a national uproar, especially, and predictably, among members of Congress. While apologists for the administration suggested that terminating his appointment had been favored by everyone on both sides of the aisle, others remarked that this would have been acceptable had it been done immediately following Mr. Trump's inauguration, but that at this juncture the discussion turned to the motive behind the decision. In an interview shortly after Mr. Comey was terminated, Mr. Trump confirmed that the FBI investigation into collusion between his campaign staff and Russian operatives very much influenced his decision. Threats tweeted subsequently, suggesting that the president employed some kind of recording device in the Oval Office, conjured up images of President Nixon's Watergate scandal. The optics did not improve when Mr. Trump met with Russia's foreign minister Sergei Lavrov and ambassador Sergei Kislyak in the Oval Office the day after the firing.
Reactions to what turned out to be a very eventful week were predictable. Republicans stayed on the fence, albeit uncomfortably. Democrats and activists throughout the country cried foul. The charges heard throughout concentrated on "obstruction of justice," linked to suggestions that impeachment proceedings might be in order, and to the fear that the administration was sliding down the slippery slope toward autocratic governance.
Anti-Trump forces have regularly charged that the administration displays a tendency to be autocratic. Coming from the business world, the president appears to be more comfortable making independent, rash decisions, without considering the consequences for democratic norms. His use of executive orders (he signed 32 in his first 100 days) rather than going the legislative route appears to support that notion. His constant battle with the media, identifying coverage unflattering to him as "fake news," demonizing Muslims and illegal immigrants, derogatory comments about judicial decisions and "so-called judges," and arbitrary, impulsive decisions have all fueled the fear that we may be on the cusp of a dictatorial takeover.
The International Encyclopedia of the Social Sciences states that dictatorship refers to the "unrestricted domination of the state by an individual, a clique, or a small group." Its supporting article states that all forms of dictatorship share the following characteristics: "1. Exclusivity and arbitrariness in the exercise of power; 2. Abolition or loosening of the judicial bonds of political power; 3. Elimination or substantial restriction of civil liberties; 4. The predominantly aggressive, impulsive, form of decision making; and 5. Employment of despotic methods of political and social control." Stephen Walt, in the November 23, 2016 issue of "Foreign Policy," in an article titled "10 Ways to Tell if Your President Is a Dictator," adds a few other features to the mix. These include: "Systematic efforts to intimidate the media, using state power to reward corporate backers and punish opponents, fear-mongering, and demonizing the opposition." Benjamin Friedman, in "The Moral Consequences of Economic Growth," argues that growth, "more often than not, fosters greater opportunity, tolerance of diversity and dedication to democracy." When living standards stagnate or decline most societies retrogress. When we start blaming the rest of the world for loss of our prestige "we'd be ripe for a demagogue who feeds those insecurities with xenophobic sloganeering." For a totalitarian takeover to take root, ordinary people would have to let it happen.
While we consider the implications of the elements involved in this discussion, we should turn to a demand we are beginning to hear with greater frequency: "impeachment." During the summer of 1973, as a relatively recent immigrant to this country, I was engrossed in the televised hearings into the Watergate burglary, which ultimately led to President Nixon's resignation. The first article of impeachment Nixon was charged with was "obstruction of justice." Subsequent to Director Comey's dismissal this concept resurfaced. Harvard constitutional law professor Laurence Tribe, in a Washington Post op-ed, called for an impeachment investigation into Donald Trump for obstruction of justice. According to professor Tribe "the firing of FBI Director James Comey was an obvious effort to interfere with a probe involving national security." Obstruction involves any interference with a judicial or congressional proceeding, or attempt to do so. The real question is whether Trump intended to impede the FBI's investigation. Key words in a relevant charge will be "corrupt intent."
Article II of our Constitution stipulates that "the President, Vice President, and all civil officers of the United States shall be removed from office on impeachment for, and conviction of, treason, bribery, or other high crimes and misdemeanors." Cynically, but with some historical accuracy, Gerald Ford, while still House minority leader, once observed that "impeachable offenses are whatever Congress says they are." An impeachment inquiry begins in the Judiciary Committee of the House of Representatives, which is currently controlled by the Republican Party and unlikely to start the process.
The inquiry has a long way to go before any of this becomes relevant. It helps to remember that only twice in our history has the House impeached a president, Andrew Johnson in 1868 and Bill Clinton in 1998. In neither case did the Senate convict. Richard Nixon resigned before the articles of impeachment were voted on by the full House.
So, sit back and let events unfold. They most likely will.
Thursday, May 4, 2017
EUROPE STILL ON EDGE
After the results of the first round of voting in the long-anticipated French election became known, mainstream EU politicians breathed a sigh of relief, and expressed cautious optimism about the eventual outcome of another nationalist-populist attack on the viability of the European Union. Out of a field of eleven candidates, Emmanuel Macron, the centrist leader of "En Marche!" (On the Move!), came in first, 2.7 points ahead of second-place finisher Marine Le Pen, long-time euroskeptic leader of the far-right National Front.
Headlines throughout the West proclaimed: "French Vote Calms EU Fears" (Wall Street Journal); "The Right Knocked Out" (Le Figaro); "Presidentielle: One Step Away" (Liberation). The euro surged to a six-month high as the markets shook off fears of the two anti-European candidates, Marine Le Pen and the hard-left, Communist-endorsed Jean-Luc Melenchon, both making the run-off. Pro-European diplomats rushed to congratulate Mr. Macron, who has never held elected office, but who served as Economy Minister in France's Socialist government. European Union foreign policy chief Federica Mogherini praised Mr. Macron, calling him "the hope of a generation." And former UK Chancellor George Osborne congratulated Mr. Macron, expressing his belief that, at last, France "may acquire the leadership it needs." Ms. Le Pen received supporting comments from other European populists like Geert Wilders, the Dutch politician who lost his race back in March, and Nigel Farage, architect of the UK Brexit movement.
The nervously anticipated run-off, scheduled for May 7, could not feature two more contrasting candidates. Marine Le Pen, a far-right candidate with a populist economic agenda, is a known quantity in French politics. She wants to see legal immigration reduced from 200,000 to 10,000 per year, and access to public services significantly limited. She believes in political isolationism, and is adamantly opposed to "Anglo-Saxon multiculturalism" and politically correct liberalism. Anti-EU and anti-euro, she favors a return to the French franc and proposes to hold a Brexit-like referendum ("Frexit") on remaining in the EU. She wants closer ties with Russia, and has received millions in financial support from Russian banks.
Socially liberal centrist and pro-business candidate Emmanuel Macron, a former member of the Socialist Party, is strongly pro-EU and pro-euro, and believes the EU needs more integration, not less. However, he does want to initiate some changes to make it stronger, and proposes to strengthen the EU's external borders, while believing that France's security policies have unfairly targeted Muslims. He strongly identifies with the business community, and supports reducing the corporate tax rate from 33.3% to 25%. Macron also favors increasing defense spending to 2% of GDP, and encourages intervention in Syria.
In other European elections in which populists threatened mainstream positions, establishment parties survived by co-opting populist positions and moving further to the right; this French election has exposed a contrasting strategy. Macron campaigns from the center, strongly supporting the EU and what it stands for. Le Pen may be softening her positions on some aspects of her well-documented anti-EU platform, if not fully co-opting a more centrist position - at least for the duration of the campaign. The rhetoric is shifting from "France or Europe" to "France in Europe."
Polling preceding the May 7 run-off gives Mr. Macron a 59% to 41% edge over Ms. Le Pen. While these numbers ought to be comforting for EU leaders, all remember the polls leading up to the Brexit referendum in the UK and Donald Trump's surprise victory in the US. Another imponderable comes from supporters of Jean-Luc Melenchon, who finished a relatively strong fourth in the first round, winning 19.6% of the vote, only about 4 points behind Macron and 2 points behind Le Pen. While most mainstream losing candidates endorsed Macron immediately following the announcement of the results, Mr. Melenchon, whose positions substantially reflect those of the National Front, albeit from the political left, has refused to do so. In the US significant numbers of Bernie Sanders supporters voted for Donald Trump, suggesting that populists on the right and the left share many policy positions. Something similar could happen in France and lead to a surprise outcome. It is not hard to see why many in Europe are still on edge.
Saturday, April 8, 2017
COULD BREXIT DISMANTLE THE UNITED KINGDOM?
On March 29 Tim Barrow, UK Permanent Representative to the EU, handed EU Council President Donald Tusk UK Prime Minister Theresa May's letter triggering Article 50 of the Lisbon Treaty, the mechanism for nations to exit the European Union. In London, in the House of Commons, the Prime Minister announced: "This is a historic moment from which there can be no turning back." Invoking Article 50 opened a two-year window for Britain, after 44 years of membership, to negotiate an exit agreement from the EU. By all accounts the negotiations promise to be messy and acrimonious. Within the UK, the British government needs to start working on an enormous legislative to-do list. CNN published "50 things the UK needs to do after triggering Article 50." These include transposing all current EU laws into the UK statute books: "Nearly 20,000 EU legislative acts are in force, dictating everything from how much clean energy a country should use to the acceptable curvature of a grocery store banana." (Kara Fox et al., CNN, March 29, 2017.) Among the remaining 27 EU members, attitudes toward Britain have significantly hardened. The concern is that the bloc can't afford to grant Britain a better deal outside of the EU than it had in it; it can't afford to set a precedent other members might want to take advantage of in the future. Ms. May proclaimed that she recognizes that negotiations will be difficult, and that "there will be consequences for the United Kingdom of leaving the EU." Not all of these consequences are existential. Some could well pop up in her own backyard, threatening to dismantle the cohesion of the UK as it exists today.
The most immediate threat to UK cohesion is Scotland's demand for another referendum on independence. Scotland's First Minister, Nicola Sturgeon, angered Theresa May days before March 29 by calling for a new referendum to be held in early 2019. In their 2014 referendum Scots voted to stay in the UK by a 55%-45% margin. But in the Brexit referendum Scotland voted 62% to 38% to remain in the European Union, in dramatic contrast to the overall outcome. On March 31, when Ms. Sturgeon, with the support of the Scottish Parliament, sent a pointed request to the UK Prime Minister, she remarked: "The UK government has decided to remove Scotland not just from the European Union but from the single market as well, and that is clearly against the will of the majority of people who live here." Ms. May has already said that the referendum cannot happen until two years from now, when Britain leaves the European Union. (Steven Erlanger, "Brexit Move Drives the Push," NY Times, April 2, 2017.) In reality the British government would need to give permission for such a referendum. The Prime Minister has made it clear in the past that keeping the UK intact is a priority of her premiership, reiterating: "It means we believe in the Union; the precious bond between England, Scotland, Wales and Northern Ireland." Even if Ms. Sturgeon succeeds in holding, and winning, a referendum, Scotland would then need to apply for EU membership, which could well take several more years.
Another potentially significant challenge to UK unity rests in Ireland. Ireland consists of the independent Republic of Ireland and Northern Ireland, which is part of the UK. The Good Friday Agreement - a.k.a. the Belfast Agreement - which ended what was euphemistically referred to as "The Troubles," lasting from the late 1960s until the agreement was signed on April 10, 1998, stipulated that Northern Ireland would remain part of the UK until a majority, both of the people of Northern Ireland and of the Republic of Ireland, wished otherwise; should that happen, the British and Irish governments would be under a "binding obligation" to implement that choice. As long as the UK and Ireland were both members of the EU, this agreement held together quite comfortably. The UK is the Republic of Ireland's biggest trading partner. Around $1.3 billion worth of goods and services crisscross the border every week. However, after Brexit the Irish border becomes an EU frontier. While currently no border controls exist between the signatories, Brexit would prompt the EU to re-establish customs barriers, thereby increasing the cost of doing business. Brexit would also end the 3.5 billion euros in farm subsidies and structural grants Northern Ireland is receiving for the 2014-2020 period. Needless to say, the push to unify the two Irish entities is intensifying. Sinn Fein MPs are already touring England and Scotland to open the unity debate. According to Pat Doherty, one of the MPs, "It isn't a matter of if we will achieve a united Ireland, it is a matter of how and when."
And finally there is the matter of Gibraltar. This British Overseas Territory on Spain's south coast, with a population of 30,000, voted overwhelmingly - by 96 percent - to stay in the EU during the Brexit referendum, even though most of its inhabitants apparently want to remain British subjects. The territory, ceded to Britain by Spain in 1713, lies some twelve miles from the coast of Africa. It borders Spain, and it houses a British military base. After Brexit, Spain could effectively isolate Gibraltar by reinforcing border controls that now don't exist. The European Council, made up of the leaders of the EU member states, shocked Downing Street recently by announcing that Gibraltar could only be included in a trade deal between London and Brussels with Spain's consent, effectively giving Madrid a veto over whether any deal would apply to the territory.
Thus far last June's vote to exit the EU has appeared much less consequential than initially expected. However, the effects of actually initiating the divorce negotiations will soon begin to sink in. It is not clear that Brexit's potentially destabilizing effect on domestic cohesion was anticipated. The British government recently did appoint an official whose sole task is to head off any potential Brexit-related challenges to national unity. Nevertheless, David Martin, Scotland's longest-serving Member of the European Parliament, made headlines just a few days ago predicting that the United Kingdom would no longer exist unless a "flexible and imaginative" Brexit solution could be found for Scotland.
It is clear that Britain's Prime Minister will have her hands full during the next few years.
Thursday, March 23, 2017
IN E.U. ELECTIONS THE DUTCH HAVE CLAIMED CENTER-STAGE
Not since their 17th-century "Golden Age," when the Netherlands was recognized as the world's foremost maritime and economic power, have the Dutch received as much attention as they did this past month. They received all this attention because the Netherlands was the first E.U. country this year to hold a national election in which a far-right populist politician had a realistic chance of winning. Populist activists throughout Europe and the U.S. wondered out loud whether the trend established by the U.K. Brexit decision, Rodrigo Duterte's election in the Philippines, and Donald Trump's ascendance in the U.S. would continue in advance of the French, the German, and a possible Italian election later this year. Geert Wilders, leader of the Party for Freedom and a vocal champion of this movement, achieved recognition as its poster child, and gained support from many hard-right political operatives. Analysts contributing to the New York Times, the Wall Street Journal and other major media outlets recognized the impending threat to the status quo in the E.U., and cautioned about the consequences of its success. In a column titled "How the Dutch Stopped Being Decent and Dull" (NYT, March 12, 2017), Ian Buruma, professor at Bard College, wrote: "What happens in the Netherlands could be a harbinger for other elections in Europe, and this also means that the future of the European Union is at stake." A month out, 77% of the electorate had yet to decide whom to vote for, and three days out Wilders' party was still favored to win. When the March 15 election concluded, many in the European political establishment sighed with relief. Disaster was avoided - for now. Wilders lost and, by political standards, he lost big, winning only 20 out of 150 seats in the Dutch Parliament, well below the 33 won by current Prime Minister Mark Rutte's VVD party.
What put him in the cross-hairs of supporters and opponents alike was his rabid stance on hard-right populist issues. He wanted to stop immigration from Muslim countries, close all mosques, ban the Quran and the burqa, and, for preventative reasons, imprison radical Muslims, even those who had not committed any crimes, while promising a "Brexit" type of referendum on the Netherlands' continued membership in the E.U. His domestic supporters mirrored the segment of the electorate that helped elect Donald Trump in the U.S. and pushed the U.K. to exit the Union. Within Europe, Wilders was openly supported by prominent populists like Marine Le Pen in France, Frauke Petry in Germany, and Nigel Farage in the U.K. Americans got into the act as well. David Horowitz, a home-grown right-wing activist who called Wilders the "Paul Revere of Europe," donated $150,000 to Wilders' effort - a sizable sum by Dutch standards. And Iowa Representative Steve King made news when he expressed his support for Mr. Wilders because of his stance on immigrants, saying that civilizations can't be restored with "somebody else's babies." Aside from overtly expressed support, Dutch officials continued to be fearful of surreptitious meddling by Russian operatives.
Even if Wilders' party had ended up winning the most seats, it was extremely unlikely that he would have become the next Prime Minister. All of the establishment parties had pledged not to work with him. Since the Dutch system is based on proportional representation, no party has ever won a majority outright. Coalition government is inevitable. Any party winning 0.67% of the vote - 1/150 of all votes cast - will secure one seat. Thirty parties participated in the recent election, some very much on the fringe. Labels like "Party for the Animals," "Jesus Lives," "Think!" and the "Non-Voters" are not unusual in the idiosyncratic Dutch system. Of these thirty parties, 13 gained seats. Ultimately Prime Minister Mark Rutte's platform, touting the country's economic stability under his People's Party for Freedom and Democracy, and the European Commission's forecast that the Dutch economy would grow steadily at 2% this year, outperforming the E.U. as a whole, won out with the most seats, albeit fewer than expected.
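For readers curious about the mechanics behind those seat counts, the sketch below illustrates, in Python and with entirely made-up vote totals, the D'Hondt highest-averages method that underpins Dutch seat allocation. In practice the Dutch rules add details this simplified version ignores, such as requiring a party to win at least one full quota (1/150 of the vote) before it can receive a seat.

    # Simplified D'Hondt allocation with hypothetical vote totals.
    votes = {"A": 2_150_000, "B": 1_300_000, "C": 890_000, "D": 160_000}
    total_seats = 150

    seats = {party: 0 for party in votes}
    for _ in range(total_seats):
        # Each round, the next seat goes to the party with the highest
        # average: votes divided by (seats already won + 1).
        winner = max(votes, key=lambda p: votes[p] / (seats[p] + 1))
        seats[winner] += 1

    print(seats)  # {'A': 72, 'B': 43, 'C': 30, 'D': 5}

The method yields near-proportional results with a slight tilt toward larger parties, which is one reason fringe parties can win a seat or two but rarely much more.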
While some may zoom in on the fact that the hard-right populists were unsuccessful this time around, few analysts are willing to predict similar outcomes elsewhere in Europe later this year. Some media outlets suggested that the Dutch vote set the tone for Europe. However, Wilders' loss is only part of the story. The populists still control the narrative. To defeat the insurgency Mr. Rutte was forced to relinquish his "centrist" position on the political spectrum, and move further to the right. His strategy to beat back populism included co-opting it. He liked to say that Wilders' type of populism was the "wrong kind of populism." But it is clear that, because of the populist threat, much of the European electorate has moved further to the right. It has become more acceptable to be against Islam, Muslims and immigration.
One of the biggest shifts at the polls was the "collapse of the once powerful Labor Party, which won less than 6% of the vote, compared with 25% in the 2012 parliamentary elections." (Marcus Walker, Wall Street Journal.) Another big story is that the top three political parties in the Netherlands, which together won 85% of the total in 1985, and 74% in 2003, collected only 45% of the seats this year. The traditional left-right axis, with the communists and the socialists on the left, conservatives and capitalists on the right, bridged by moderate centrists, is going away. In the Netherlands "Labor" is in shambles, in the U.K. it is in turmoil, and in France the Socialists have become irrelevant. Smaller parties are flourishing and unstable coalitions are becoming the norm all over Europe. The main issues have shifted from the economy to immigration, E.U. membership, crime, security, national identity and globalization.
During the run-up to the election, Geert Wilders predicted that his exclusion from power would start a revolution. Establishment parties may have won the battle this time. However, the war is far from over. Some will claim that the "revolution" is still picking up speed.
Monday, March 13, 2017
WHILE COVER-UP UNRAVELS, WATERGATE RESURFACES
The White House is in turmoil. Politicians from both parties and media of all stripes seem more focused on rumored Trump campaign contacts with Russia than on policy. The question heard all over Washington is the same one former Senator Howard Baker asked in 1973: "What did the President know, and when did he know it?"
Much of what is going on today resembles what happened during "Watergate." The Trump White House consistently denied that there was any contact between its operatives and members of the Russian government. On March 3, USA Today listed 21 denials members of the Trump team used when asked whether any contacts took place. Nevertheless, on February 13 National Security Advisor Michael Flynn "resigned" for lying about his discussions with Russian Ambassador Sergei Kislyak.
On March 2, Attorney General Jeff Sessions, accused of lying under oath to the Senate Judiciary Committee about his meetings with the same ambassador, recused himself from investigations involving contacts between the Trump team and Russia. A few days later several other team members, including Trump's son-in-law Jared Kushner, were identified as having been involved as well.
In the meantime the administration appears to have developed a palpable paranoia, showing greater concern about leaks of information than about the substance of the investigation. Finally, last week the president accused his predecessor of committing a felony by tapping his phone, a claim the director of the FBI asked the Justice Department to publicly reject.
As this political melodrama continues to unfold, it may be useful to review the chronology of events that surrounded the Watergate scandal. Much of the following was first reported by the Washington Post:
* June 13, 1971: The New York Times begins printing the "Pentagon Papers," leaked by former defense analyst Daniel Ellsberg.
* Sept. 9, 1971: The White House "Plumbers," a covert special investigations unit established July 24, 1971, during Richard Nixon's presidency, and tasked with stopping leaks of classified information, burglarizes the office of Ellsberg's psychiatrist.
* June 17, 1972: Five men are arrested trying to bug the offices of the Democratic National Committee at the Watergate Hotel and office complex.
* June 19, 1972: Reports surface that a GOP security aide is among the Watergate burglars. Former Attorney General John Mitchell, head of the Nixon reelection campaign, denies any complicity in the operation.
* Aug. 1, 1972: A $25,000 cashier's check, earmarked for the Nixon campaign, turns up in the bank account of a Watergate burglar.
* Sept. 29, 1972: The Post reports that John Mitchell, while serving as Attorney General, controlled a secret Republican fund used to finance a widespread intelligence-gathering operation against the Democrats.
* Oct. 10, 1972: A Post article asserts that the Watergate break-in stemmed from a massive political spying and sabotage campaign conducted on behalf of the Nixon re-election effort.
* Nov. 7, 1972: Nixon is re-elected in a landslide, crushing Democratic nominee George McGovern of South Dakota.
* Jan. 30, 1973: Former Nixon aides G. Gordon Liddy and James W. McCord Jr. are convicted of conspiracy and wiretapping in the Watergate incident. Five others plead guilty.
* April 30, 1973: Top White House staffers H.R. Haldeman, John Ehrlichman and Attorney General Richard Kleindienst resign over the scandal. White House counsel John Dean is fired.
* May 18, 1973: Senate Watergate Committee begins its nationally televised hearings. Attorney General Elliot Richardson taps former Solicitor General Archibald Cox to be Special Prosecutor.
* June 3, 1973: John Dean tells Watergate investigators that he discussed the cover-up with Nixon at least 35 times.
* June 13, 1973: Watergate prosecutors find a memo to John Ehrlichman describing in great detail the break-in at Daniel Ellsberg's psychiatrist's office.
* July 13, 1973: Former White House aide Alexander Butterfield tells Senate Watergate Committee that Nixon, since 1971, has recorded all conversations and telephone calls in his office.
* July 18, 1973: Nixon orders White House taping system disconnected.
* July 23, 1973: Nixon refuses to turn over presidential tape recordings to the Watergate Committee.
* Oct. 20, 1973: The "Saturday Night Massacre" - Nixon fires Archibald Cox and abolishes the office of the special prosecutor. Attorney General Richardson and his deputy, William Ruckelshaus, resign.
* Dec. 7, 1973: The White House cannot explain an 18½-minute gap in one of the subpoenaed tapes.
* April 30, 1974: The White House releases more than 1,200 pages of edited transcripts of the Nixon tapes. The House Judiciary Committee insists that the actual tapes must be turned over.
* July 24, 1974: Supreme Court unanimously rejects Nixon's claim of executive privilege, and directs that the White House must turn over tape recordings of presidential conversations.
* July 27, 1974: The House Judiciary Committee passes the first of three articles of impeachment, charging obstruction of justice.
* Aug. 8, 1974: Richard Nixon announces his resignation, becoming the first U.S. president to resign the office; he leaves the White House the next day.
None of this is intended to suggest that the current investigation will lead to a similar outcome. The party in power controls the scope and intensity of the inquiry. Impeachment, after all, tends to be more political than strictly constitutional.